Linear Regression Models

Standard linear regression models including OLS, Ridge, Elastic Net, and specialized variants.

ols

Ordinary Least Squares regression.

ps.ols(
    y: Union[pl.Expr, str],
    *x: Union[pl.Expr, str],
    with_intercept: bool = True,
) -> pl.Expr

Returns: See Linear Model Output

Example:

df.group_by("group").agg(ps.ols("y", "x1", "x2").alias("model"))
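
As a sketch of the underlying computation (plain NumPy, not this library's implementation): with `with_intercept=True`, OLS amounts to prepending a column of ones and solving the normal equations `beta = (X^T X)^{-1} X^T y`.

```python
import numpy as np

# Illustrative only -- the math behind OLS, not ps.ols itself.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = 1.5 + 2.0 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.1, size=100)
Xi = np.column_stack([np.ones(len(X)), X])   # intercept column (with_intercept=True)
beta = np.linalg.solve(Xi.T @ Xi, Xi.T @ y)  # [intercept, b1, b2]
```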


ridge

Ridge regression (L2 regularization).

ps.ridge(
    y: Union[pl.Expr, str],
    *x: Union[pl.Expr, str],
    lambda_: float = 1.0,
    with_intercept: bool = True,
) -> pl.Expr

Returns: See Linear Model Output
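
A minimal sketch of the closed-form ridge solution (plain NumPy, not this library's code). The intercept is omitted here because whether an implementation penalizes it is convention-dependent; implementations usually leave it unpenalized.

```python
import numpy as np

# Illustrative only: ridge solves beta = (X^T X + lambda I)^{-1} X^T y.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, 0.0, -1.0]) + rng.normal(scale=0.1, size=200)
lam = 1.0
beta = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)
```

Larger `lambda_` shrinks the coefficients further toward zero.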


elastic_net

Elastic Net regression (L1 + L2 regularization).

ps.elastic_net(
    y: Union[pl.Expr, str],
    *x: Union[pl.Expr, str],
    lambda_: float = 1.0,
    alpha: float = 0.5,  # L1 ratio (0 = Ridge, 1 = Lasso)
    with_intercept: bool = True,
) -> pl.Expr

Returns: See Linear Model Output
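
A hedged sketch of the objective Elastic Net minimizes. The exact scaling convention (e.g. the `1/(2n)` factor on the residuals) is an assumption here and may differ in this library.

```python
import numpy as np

# Illustrative only: the elastic-net objective,
#   ||y - X b||^2 / (2n) + lambda * (alpha * ||b||_1 + (1 - alpha) * ||b||_2^2 / 2)
# alpha interpolates between ridge (alpha=0) and lasso (alpha=1).
def elastic_net_objective(X, y, beta, lambda_=1.0, alpha=0.5):
    n = len(y)
    resid = y - X @ beta
    l1 = np.abs(beta).sum()
    l2 = (beta ** 2).sum()
    return resid @ resid / (2 * n) + lambda_ * (alpha * l1 + (1 - alpha) * l2 / 2)
```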


wls

Weighted Least Squares regression.

ps.wls(
    y: Union[pl.Expr, str],
    weights: Union[pl.Expr, str],
    *x: Union[pl.Expr, str],
    with_intercept: bool = True,
) -> pl.Expr

Returns: See Linear Model Output
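
A sketch of the weighted normal equations (plain NumPy, not this library's implementation): WLS solves `beta = (X^T W X)^{-1} X^T W y`, where `W` is the diagonal matrix of per-row weights.

```python
import numpy as np

# Illustrative only: weighted least squares via the weighted normal equations.
rng = np.random.default_rng(2)
X = rng.normal(size=(100, 2))
y = X @ np.array([2.0, -1.0]) + rng.normal(scale=0.1, size=100)
w = rng.uniform(0.5, 2.0, size=100)  # per-observation weights
XtW = X.T * w                        # equivalent to X.T @ np.diag(w)
beta = np.linalg.solve(XtW @ X, XtW @ y)
```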


rls

Recursive Least Squares regression (online learning).

ps.rls(
    y: Union[pl.Expr, str],
    *x: Union[pl.Expr, str],
    forgetting_factor: float = 0.99,
    with_intercept: bool = True,
) -> pl.Expr

Returns: See Linear Model Output
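
A hedged sketch of the standard RLS recursion (not this library's internals): each new observation updates the coefficients by the prediction error, and the forgetting factor discounts older data so the fit can track drift.

```python
import numpy as np

# Illustrative only: textbook RLS update with forgetting factor ff.
# P tracks the (discounted) inverse covariance of the regressors.
def rls_update(beta, P, x, y, ff=0.99):
    Px = P @ x
    k = Px / (ff + x @ Px)            # gain vector
    beta = beta + k * (y - x @ beta)  # correct by the prediction error
    P = (P - np.outer(k, Px)) / ff
    return beta, P

# Streaming a noise-free y = 2x relationship converges toward beta = 2.
beta, P = np.zeros(1), np.eye(1) * 1e3
for t in range(200):
    x = np.array([float(t % 10 + 1)])
    beta, P = rls_update(beta, P, x, 2.0 * x[0])
```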


bls

Bounded Least Squares regression with box constraints on the coefficients.

ps.bls(
    y: Union[pl.Expr, str],
    *x: Union[pl.Expr, str],
    lower_bound: float | None = None,
    upper_bound: float | None = None,
    with_intercept: bool = True,
) -> pl.Expr

Returns: See Linear Model Output
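
An illustrative sketch using SciPy's bounded solver (an assumption for illustration, not this library's implementation): the squared residuals are minimized subject to every coefficient lying in `[lower_bound, upper_bound]`.

```python
import numpy as np
from scipy.optimize import lsq_linear

# Illustrative only: box-constrained least squares via scipy.
rng = np.random.default_rng(3)
X = rng.normal(size=(100, 2))
y = X @ np.array([3.0, -2.0]) + rng.normal(scale=0.1, size=100)
# True coefficients lie outside [-1, 1], so both bounds are active.
res = lsq_linear(X, y, bounds=(-1.0, 1.0))
```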


nnls

Non-negative Least Squares (shorthand for bls with lower_bound=0).

ps.nnls(
    y: Union[pl.Expr, str],
    *x: Union[pl.Expr, str],
    with_intercept: bool = True,
) -> pl.Expr

Returns: See Linear Model Output
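
An illustrative sketch using SciPy's classic NNLS solver (an assumption for illustration, not this library's implementation): coefficients whose unconstrained estimate would be negative are driven to zero.

```python
import numpy as np
from scipy.optimize import nnls

# Illustrative only: non-negative least squares via scipy.
rng = np.random.default_rng(4)
X = rng.normal(size=(100, 2))
y = X @ np.array([2.0, -1.0]) + rng.normal(scale=0.1, size=100)
coef, resid_norm = nnls(X, y)  # the -1.0 coefficient is clamped to 0
```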


quantile

Quantile regression for estimating conditional quantiles (e.g., median).

ps.quantile(
    y: Union[pl.Expr, str],
    *x: Union[pl.Expr, str],
    tau: float = 0.5,            # Quantile to estimate (0.5 = median)
    with_intercept: bool = True,
) -> pl.Expr

Returns: See Quantile Regression Output
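
A sketch of the pinball (check) loss that quantile regression minimizes (the loss itself, not this library's solver): `tau` asymmetrically weights under- versus over-prediction, so `tau=0.5` targets the conditional median.

```python
import numpy as np

# Illustrative only: pinball loss, mean of max(tau * r, (tau - 1) * r)
# over residuals r = y - y_hat.
def pinball_loss(y, y_hat, tau=0.5):
    r = y - y_hat
    return np.mean(np.maximum(tau * r, (tau - 1.0) * r))
```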


isotonic

Isotonic (monotonic) regression for calibration curves and monotone relationships.

ps.isotonic(
    y: Union[pl.Expr, str],
    x: Union[pl.Expr, str],
    increasing: bool = True,     # True = increasing, False = decreasing
) -> pl.Expr

Returns: See Isotonic Regression Output
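
A hedged sketch of the pool-adjacent-violators algorithm (PAVA) that isotonic regression is classically built on (increasing case, observations already ordered by x); this library's internals may differ.

```python
# Illustrative only: pool-adjacent-violators for increasing isotonic fit.
def pava(y):
    vals, wts = [], []
    for v in map(float, y):
        vals.append(v)
        wts.append(1.0)
        # Merge adjacent blocks while they violate monotonicity,
        # replacing them with their weighted mean.
        while len(vals) > 1 and vals[-2] > vals[-1]:
            w = wts[-2] + wts[-1]
            vals[-2] = (wts[-2] * vals[-2] + wts[-1] * vals[-1]) / w
            wts[-2] = w
            vals.pop()
            wts.pop()
    out = []
    for v, w in zip(vals, wts):
        out.extend([v] * int(w))
    return out
```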


See Also