lightning.regression.FistaRegressor

class lightning.regression.FistaRegressor(C=1.0, alpha=1.0, penalty='l1', max_iter=100, max_steps=30, eta=2.0, sigma=1e-05, callback=None, verbose=0, prox_args=())

Estimator for learning linear regressors by FISTA.

The objective functions considered take the form

minimize F(W) = C * L(W) + alpha * R(W),

where L(W) is a loss term and R(W) is a penalty term.
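
For illustration, the sketch below (synthetic data and illustrative parameter values only) shows how increasing alpha relative to C strengthens the penalty term; with the 'l1' penalty this typically drives more coefficients to exactly zero.

    import numpy as np
    from lightning.regression import FistaRegressor

    rng = np.random.RandomState(0)
    X = rng.randn(50, 20)
    w_true = np.zeros(20)
    w_true[:3] = [1.0, -2.0, 3.0]
    y = X.dot(w_true) + 0.01 * rng.randn(50)

    # Same loss weight C, different penalty weights alpha.
    weak = FistaRegressor(C=1.0, alpha=0.01, penalty='l1')
    weak.fit(X, y)
    strong = FistaRegressor(C=1.0, alpha=1.0, penalty='l1')
    strong.fit(X, y)

    # A stronger penalty usually yields fewer nonzero coefficients.
    print(weak.n_nonzero(), strong.n_nonzero())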

Parameters:

penalty : str or Penalty object, {'l2', 'l1', 'l1/l2', 'tv1d', 'simplex'}

The penalty or constraint to be used.

  • l2: ridge
  • l1: lasso
  • l1/l2: group lasso
  • tv1d: 1-dimensional total variation (also known as fused lasso)
  • simplex: simplex constraint

The parameter can also take an arbitrary Penalty object, i.e., an instance that implements the projection and regularization methods (see the file penalty.py).
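
A short sketch of selecting the built-in penalties by name (the strings listed above); the custom Penalty object route is not shown here.

    from lightning.regression import FistaRegressor

    lasso = FistaRegressor(penalty='l1', alpha=0.1)       # lasso
    ridge = FistaRegressor(penalty='l2', alpha=0.1)       # ridge
    group = FistaRegressor(penalty='l1/l2', alpha=0.1)    # group lasso
    fused = FistaRegressor(penalty='tv1d', alpha=0.1)     # 1-d total variation (fused lasso)
    simplex = FistaRegressor(penalty='simplex')           # simplex constraint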

C : float

Weight of the loss term.

alpha : float

Weight of the penalty term.

max_iter : int

Maximum number of iterations to perform.

max_steps : int

Maximum number of steps to use during the line search.

sigma : float

Constant used in the line search sufficient decrease condition.

eta : float

Decrease factor for the line-search procedure. For example, eta=2.0 will decrease the step size by a factor of 2 at each iteration of the line-search routine (a schematic sketch of this backtracking procedure follows the parameter list).

callback : callable

Callback function.

verbose : int

Verbosity level.
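
For orientation, the following is a schematic sketch, not lightning's actual implementation, of how max_steps, eta and sigma interact in a backtracking line search: the candidate step is shrunk by a factor eta until a sigma-weighted sufficient-decrease condition holds, for at most max_steps trials. The exact sufficient-decrease test used by the library may differ.

    def backtracking_step(f, grad_f, prox, w, step,
                          eta=2.0, sigma=1e-5, max_steps=30):
        # Schematic backtracking line search for a proximal gradient step.
        g = grad_f(w)
        for _ in range(max_steps):
            w_new = prox(w - step * g, step)       # proximal / projection step
            decrease = f(w) - f(w_new)
            # Hypothetical sufficient-decrease test weighted by sigma.
            if decrease >= sigma * ((w - w_new) ** 2).sum() / step:
                break
            step /= eta                            # shrink the step size
        return w_new, step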

Methods

fit(X, y)
get_params([deep]): Get parameters for this estimator.
n_nonzero([percentage])
predict(X)
score(X, y[, sample_weight]): Returns the coefficient of determination R^2 of the prediction.
set_params(**params): Set the parameters of this estimator.
__init__(C=1.0, alpha=1.0, penalty='l1', max_iter=100, max_steps=30, eta=2.0, sigma=1e-05, callback=None, verbose=0, prox_args=())
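
A minimal end-to-end sketch of the methods summarized above (fit, predict, score) on synthetic data:

    import numpy as np
    from lightning.regression import FistaRegressor

    rng = np.random.RandomState(42)
    X = rng.randn(100, 10)
    y = X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.randn(100)

    reg = FistaRegressor(C=1.0, alpha=0.1, penalty='l1', max_iter=200)
    reg.fit(X, y)
    y_pred = reg.predict(X)
    print(reg.score(X, y))   # coefficient of determination R^2
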
get_params(deep=True)

Get parameters for this estimator.

Parameters:

deep : boolean, optional

If True, will return the parameters for this estimator and contained subobjects that are estimators.

Returns:

params : mapping of string to any

Parameter names mapped to their values.
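
For example, the returned mapping contains the constructor parameters and their current values:

    from lightning.regression import FistaRegressor

    reg = FistaRegressor(alpha=0.5, penalty='l1')
    params = reg.get_params()
    print(params['alpha'], params['penalty'])   # 0.5 l1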

score(X, y, sample_weight=None)

Returns the coefficient of determination R^2 of the prediction.

The coefficient R^2 is defined as (1 - u/v), where u is the residual sum of squares ((y_true - y_pred) ** 2).sum() and v is the total sum of squares ((y_true - y_true.mean()) ** 2).sum(). The best possible score is 1.0 and it can be negative (because the model can be arbitrarily worse). A constant model that always predicts the expected value of y, disregarding the input features, would get an R^2 score of 0.0.

Parameters:

X : array-like, shape = (n_samples, n_features)

Test samples.

y : array-like, shape = (n_samples) or (n_samples, n_outputs)

True values for X.

sample_weight : array-like, shape = [n_samples], optional

Sample weights.

Returns:

score : float

R^2 of self.predict(X) with respect to y.
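
A small consistency check of the definition above: score(X, y) should match 1 - u/v computed by hand, with u the residual sum of squares and v the total sum of squares.

    import numpy as np
    from lightning.regression import FistaRegressor

    rng = np.random.RandomState(0)
    X = rng.randn(80, 5)
    y = X.dot(np.array([1.0, 0.5, 0.0, 0.0, -1.0])) + 0.1 * rng.randn(80)

    reg = FistaRegressor(alpha=0.01)
    reg.fit(X, y)
    y_pred = reg.predict(X)

    u = ((y - y_pred) ** 2).sum()        # residual sum of squares
    v = ((y - y.mean()) ** 2).sum()      # total sum of squares
    print(reg.score(X, y), 1.0 - u / v)  # the two values should agree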

set_params(**params)

Set the parameters of this estimator.

The method works on simple estimators as well as on nested objects (such as pipelines). The latter have parameters of the form <component>__<parameter> so that it’s possible to update each component of a nested object.

Returns:

self
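
For example, on the estimator itself and on a nested object (here a scikit-learn Pipeline wrapping the regressor, using the <component>__<parameter> form):

    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler
    from lightning.regression import FistaRegressor

    reg = FistaRegressor()
    reg.set_params(alpha=0.5, penalty='l1')                   # simple estimator

    pipe = Pipeline([('scale', StandardScaler()),
                     ('fista', FistaRegressor())])
    pipe.set_params(fista__alpha=0.5, fista__penalty='l1')    # nested parameter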