A Julia package to benchmark optimization solvers on logistic regression problems.
- MIT license
- Install using
```julia
julia> ] add LogisticOptTools
```
Suppose you have imported LogisticOptTools in your REPL:
```julia
julia> using LogisticOptTools

julia> const LOT = LogisticOptTools
```
Suppose also that you have a matrix of features `X` and a vector of observations `y`, and that you want to fit a logistic model to this data.
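If you just want to try the package, you could generate a synthetic dataset first. This is only an illustrative sketch: the dimensions and the ground-truth model below are arbitrary, and labels are assumed to be encoded as ±1.
```julia
julia> using Random

julia> Random.seed!(42)

julia> n, p = 1_000, 10          # arbitrary number of samples and features

julia> X = randn(n, p)           # synthetic feature matrix

julia> w_true = randn(p)         # arbitrary ground-truth weights

julia> y = sign.(X * w_true .+ 0.1 .* randn(n))   # labels in {-1, +1}
```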
You could instantiate a new logistic model simply by typing
```julia
julia> logreg = LOT.LogisticRegressor(X, y, fit_intercept=true, penalty=LOT.L2Penalty(1.0))
```
and then fit the logistic regression with
```julia
julia> using Optim

julia> p = LOT.nfeatures(logreg)

julia> x0 = zeros(p); algo = LBFGS()

julia> res = Optim.optimize(logreg, x0, algo)

# Fetch optimal parameters
julia> p_opt = res.minimizer
```
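The returned object follows the standard Optim.jl API, so you could also query it for diagnostics:
```julia
julia> Optim.converged(res)    # whether the solver reached its convergence criterion

julia> Optim.minimum(res)      # objective value at the solution

julia> Optim.iterations(res)   # number of iterations performed
```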
LogisticOptTools works with the different algorithms implemented in Optim.jl.
The following figure compares three of these algorithms when fitting a logistic model on the
covtype dataset (581,012 samples, 54 features).
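As a rough sketch, such a comparison could be run by looping over solvers from Optim.jl (the algorithms and the timing below are illustrative, not the exact benchmark script):
```julia
using Optim

# Compare a gradient method and two quasi-Newton methods on the same model.
for algo in (GradientDescent(), BFGS(), LBFGS())
    x0 = zeros(LOT.nfeatures(logreg))
    t = @elapsed res = Optim.optimize(logreg, x0, algo)
    println(summary(algo), ": time = ", round(t, digits=2), "s, loss = ", Optim.minimum(res))
end
```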
For an example of how to use other solvers, examples/tron.jl solves a logistic regression problem with tron, a solver implemented in JSOSolvers.jl.
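A hedged sketch of what such a script could look like: here the penalized logistic loss is written by hand (labels in {-1, +1}) and wrapped in an ADNLPModel, so the actual code in examples/tron.jl may differ.
```julia
using ADNLPModels, JSOSolvers, LinearAlgebra

# L2-penalized logistic loss, written explicitly for clarity.
# A production implementation would use a numerically stable formulation.
λ = 1.0
loss(w) = sum(log1p.(exp.(-y .* (X * w)))) + 0.5 * λ * dot(w, w)

# Wrap the loss in an NLPModel and solve it with tron.
nlp = ADNLPModel(loss, zeros(size(X, 2)))
stats = tron(nlp)
w_opt = stats.solution
```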
Import LIBSVM datasets
LogisticOptTools supports the LIBSVM format. Once a dataset is downloaded
from the LIBSVM website,
you could load it in the Julia REPL with
```
shell> ls .
covtype.binary.bz2

# Parse as Float64
julia> dataset = LOT.parse_libsvm("covtype.binary.bz2", Float64)

# Load as dense matrix
julia> X = LOT.to_dense(dataset)

julia> y = dataset.labels
```
You could load the dataset as a sparse matrix just by replacing `to_dense` with `to_sparse`.
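For reference, each line of a LIBSVM file stores a label followed by sparse `index:value` pairs. A minimal hand-rolled parser for uncompressed files could look like the sketch below (illustrative only; `LOT.parse_libsvm` also handles bz2-compressed files for you):
```julia
using SparseArrays

# Naive LIBSVM parser: one "label i1:v1 i2:v2 ..." record per line.
function parse_libsvm_naive(path::AbstractString)
    I, J, V, labels = Int[], Int[], Float64[], Float64[]
    for (i, line) in enumerate(eachline(path))
        tokens = split(line)
        push!(labels, parse(Float64, tokens[1]))
        for tok in tokens[2:end]
            idx, val = split(tok, ':')
            push!(I, i)
            push!(J, parse(Int, idx))
            push!(V, parse(Float64, val))
        end
    end
    return sparse(I, J, V), labels
end
```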
You could find in
examples/ a few examples of:
- optimizing the L2 penalty parameter (see the sketch after this list)
- fitting a sparse regression (l0-l2 logistic regression) with JuMP and a MILP solver
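As an illustration of the first item, a simple grid search over the L2 penalty could look like the following sketch (the train/validation split, the grid, and the absence of an intercept are all arbitrary choices here; the actual example script may proceed differently):
```julia
using Optim

# Hypothetical train/validation split.
ntrain = div(length(y), 2)
Xtr, ytr = X[1:ntrain, :], y[1:ntrain]
Xva, yva = X[ntrain+1:end, :], y[ntrain+1:end]

# Validation logistic loss (labels in {-1, +1}; no intercept, so w has one
# entry per feature).
val_loss(w) = sum(log1p.(exp.(-yva .* (Xva * w))))

# Fit one model per value of λ and keep the best one on the validation set.
λ_grid = 10.0 .^ (-4:1)
losses = map(λ_grid) do λ
    logreg = LOT.LogisticRegressor(Xtr, ytr, penalty=LOT.L2Penalty(λ))
    res = Optim.optimize(logreg, zeros(LOT.nfeatures(logreg)), LBFGS())
    val_loss(res.minimizer)
end
println("best λ: ", λ_grid[argmin(losses)])
```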