torchml.linear_model¶
Classes¶
torchml.linear_model.LinearRegression
¶
Description¶
Ordinary least-squares model with bias term.
Solves the following optimization problem in closed form:

$$\min_w \| Xw - y \|_2^2$$
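The closed-form solution is given by the normal equations, $w = (X^\top X)^{-1} X^\top y$. A minimal sketch in plain PyTorch, illustrative only (not torchml's internal implementation):

```python
import torch

# Synthetic data: y = X @ w_true, with no noise and no bias term
torch.manual_seed(0)
X = torch.randn(100, 3)
w_true = torch.tensor([[1.0], [-2.0], [0.5]])
y = X @ w_true

# Normal equations: solve (X^T X) w = X^T y instead of forming the inverse
w = torch.linalg.solve(X.T @ X, X.T @ y)
```

On noiseless data the recovered `w` matches `w_true` up to numerical precision.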
References¶
Arguments¶
- fit_intercept (bool) - Whether to fit a bias term.
- normalized (str) - Normalization scheme to use.
- copy_X (bool) - Whether to copy the data X (otherwise it may be modified in place).
- n_jobs (int) - Dummy parameter to match the scikit-learn API.
- positive (bool) - When True, forces the coefficients to be positive (not implemented).
Example¶
linreg = LinearRegression(fit_intercept=False)
fit(self, X, y)
¶
torchml.linear_model.Ridge
¶
Description¶
Linear regression with L2 penalty term.

$$w = (X^\top X + \lambda I)^{-1} X^\top y$$

where:

- w - weights of the linear regression with L2 penalty
- X - design matrix of input samples
- λ - constant that multiplies the L2 term
- y - target values

The above equation is the closed-form solution for ridge's objective function.
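A minimal sketch of the ridge closed-form solution $w = (X^\top X + \lambda I)^{-1} X^\top y$ in plain PyTorch, illustrative only (not torchml's internal implementation):

```python
import torch

torch.manual_seed(0)
m, n = 100, 3
X = torch.randn(m, n)
w_true = torch.tensor([[1.0], [-2.0], [0.5]])
y = X @ w_true + 0.01 * torch.randn(m, 1)

lam = 1.0  # L2 penalty strength (alpha in the API)

# Closed form: solve (X^T X + lambda * I) w = X^T y
w_ridge = torch.linalg.solve(X.T @ X + lam * torch.eye(n), X.T @ y)

# Plain OLS solution for comparison: the L2 penalty shrinks the weights
w_ols = torch.linalg.solve(X.T @ X, X.T @ y)
```

With `lam > 0`, the ridge solution always has a smaller L2 norm than the unpenalized OLS solution.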
References¶
- Arthur E. Hoerl and Robert W. Kennard's introductory paper on ridge regression
- Datacamp's Lasso and Ridge Regression tutorial
- The scikit-learn documentation page
Arguments¶
- alpha (float, default=1.0) - Constant that multiplies the L2 term; must be a non-negative float.
- fit_intercept (bool, default=False) - Whether or not to fit an intercept in the model.
- normalize (bool, default=False) - If True, the regressors X will be normalized. normalize will be deprecated in the future.
- copy_X (bool, default=True) - If True, X will be copied.
- solver (string, default='auto') - Which solver or algorithm to use.
Example¶
ridge = Ridge()
fit(self, X: Tensor, y: Tensor)
¶
torchml.linear_model.Lasso
¶
Description¶
Linear regression with L1 penalty term.

$$\min_{w, b} \; \frac{1}{2m} \| Xw + b - y \|_2^2 + \lambda \| w \|_1$$

where:

- m - number of input samples
- X - design matrix of input samples
- w - weights of the linear regression with L1 penalty
- b - intercept
- y - target values
- λ - constant that multiplies the L1 term

Since the lasso objective has no closed-form solution, we use Cvxpylayers to construct a PyTorch layer and directly compute the solution for the objective function above.
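torchml delegates this problem to Cvxpylayers; for intuition, the same objective (without an intercept) can also be minimized with a simple proximal-gradient (ISTA) loop in plain PyTorch. The sketch below is illustrative only and is not torchml's implementation:

```python
import torch

torch.manual_seed(0)
m, n = 100, 5
X = torch.randn(m, n)
w_true = torch.tensor([[2.0], [0.0], [-3.0], [0.0], [0.0]])  # sparse ground truth
y = X @ w_true

lam = 0.1  # L1 penalty strength (alpha in the API)
# Step size = 1 / Lipschitz constant of the smooth term's gradient
step = m / torch.linalg.matrix_norm(X, 2) ** 2

w = torch.zeros(n, 1)
for _ in range(500):
    # Gradient step on the smooth part: (1/2m) ||Xw - y||^2
    grad = X.T @ (X @ w - y) / m
    z = w - step * grad
    # Soft-thresholding: the proximal operator of step * lam * ||w||_1
    w = torch.sign(z) * torch.clamp(z.abs() - step * lam, min=0.0)

# Objective value at the result vs. at the all-zeros starting point
final_obj = ((X @ w - y) ** 2).sum() / (2 * m) + lam * w.abs().sum()
zero_obj = (y ** 2).sum() / (2 * m)
```

The soft-thresholding step is what drives some weights to exactly zero, producing the sparse solutions lasso is known for.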
References¶
- Robert Tibshirani's introductory paper on lasso regression
- Datacamp's Lasso and Ridge Regression tutorial
- The scikit-learn documentation page
Arguments¶
- alpha (float, default=1.0) - Constant that multiplies the L1 term; must be a non-negative float.
- fit_intercept (bool, default=False) - Whether or not to fit an intercept in the model.
- positive (bool, default=False) - When True, forces the weights to be positive.
- require_grad (bool, default=False) - When True, the result tensors' requires_grad is set to True (useful if gradients need to be computed).
Example¶
lasso = Lasso()