regressor

Classes:

Name | Description |
---|---|
Regressor | Incremental wrapper for PyTorch regression models. |

Regressor
Regressor(
module: Module,
loss_fn: Union[str, Callable],
optimizer_fn: Union[str, Type[Optimizer]],
lr: float = 0.001,
is_feature_incremental: bool = False,
device: str = "cpu",
seed: int = 42,
**kwargs
)
Bases: DeepEstimator, MiniBatchRegressor

Incremental wrapper for PyTorch regression models.

Provides optional feature-incremental learning by expanding the first
trainable layer on the fly when unseen feature names are encountered.
Suitable for streaming / online regression tasks using the `river` API.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
module | Module | PyTorch module that outputs a numeric prediction (shape (N, 1) or (N,)). | required |
loss_fn | str \| Callable | Loss identifier or callable (e.g. 'mse'). | required |
optimizer_fn | str \| Type[Optimizer] | Optimizer spec (string identifier such as 'sgd', or an Optimizer class). | required |
lr | float | Learning rate. | 1e-3 |
is_feature_incremental | bool | If True, expands the input layer for new feature names. | False |
device | str | Torch device. | 'cpu' |
seed | int | Random seed for reproducibility. | 42 |
**kwargs | | Extra args stored for cloning/persistence. | {} |
Examples:
Real-world streaming regression on the Bikes dataset from `river`.
We retain only numeric features (discarding timestamps/strings) to build
dense tensors. We maintain an online MAE; the exact value may vary depending
on library version and hardware.
>>> import random, numpy as np
>>> import torch
>>> from torch import nn, manual_seed
>>> from river import datasets, metrics
>>> from deep_river.regression import Regressor
>>> _ = manual_seed(42); random.seed(42); np.random.seed(42)
>>> first_x, _ = next(iter(datasets.Bikes()))
>>> numeric_keys = sorted([k for k, v in first_x.items() if isinstance(v, (int, float))])
>>> class SmallNet(nn.Module):
... def __init__(self, n_features):
... super().__init__()
... self.net = nn.Sequential(
... nn.Linear(n_features, 8),
... nn.ReLU(),
... nn.Linear(8, 1)
... )
... def forward(self, x):
... return self.net(x)
>>> model = Regressor(module=SmallNet(len(numeric_keys)), loss_fn='mse',
... optimizer_fn='sgd', lr=1e-2)
>>> mae = metrics.MAE()
>>> for i, (x, y) in enumerate(datasets.Bikes().take(200)):
... x_num = {k: x[k] for k in numeric_keys}
... y_pred = model.predict_one(x_num)
... model.learn_one(x_num, y)
... mae.update(y, y_pred)
>>> print(f"MAE: {mae.get():.4f}")
MAE: ...
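
A minimal feature-incremental sketch, reusing the SmallNet class defined above
(the feature names and values below are illustrative, not from the Bikes
stream): with `is_feature_incremental=True`, a dictionary containing a
previously unseen feature name widens the first trainable layer instead of
raising an error.

>>> inc_model = Regressor(module=SmallNet(2), loss_fn='mse',
...                       optimizer_fn='sgd', lr=1e-2,
...                       is_feature_incremental=True)
>>> _ = inc_model.learn_one({'a': 1.0, 'b': 2.0}, 3.0)
>>> # 'c' has not been seen before; the input layer is expanded on the fly.
>>> _ = inc_model.learn_one({'a': 1.0, 'b': 2.0, 'c': 0.5}, 3.5)
>>> y_hat = inc_model.predict_one({'a': 1.0, 'b': 2.0, 'c': 0.5})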
Methods:

Name | Description |
---|---|
clone | Return a fresh estimator instance with (optionally) copied state. |
draw | Render a (partial) computational graph of the wrapped model. |
load | Load a previously saved estimator. |
predict_many | Predict target values for multiple instances (returns single-column DataFrame). |
predict_one | Predict target value for a single instance. |
save | Persist the estimator (architecture, weights, optimiser & runtime state). |
Source code in deep_river/regression/regressor.py
clone

Return a fresh estimator instance with (optionally) copied state.

Parameters:

Name | Type | Description | Default |
---|---|---|---|
new_params | dict \| None | Parameter overrides for the cloned instance. | None |
include_attributes | bool | If True, runtime state (observed features, buffers) is also copied. | False |
copy_weights | bool | If True, model weights are copied (otherwise the module is re-initialised). | False |
Source code in deep_river/base.py
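
A brief usage sketch, continuing with the `model` trained in the Examples
above; the keyword arguments are the ones documented in the table:

>>> warm_clone = model.clone(copy_weights=True)         # same hyperparameters, copied weights
>>> fresh_clone = model.clone(new_params={'lr': 1e-3})  # override a parameter, weights re-initialised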
draw

Render a (partial) computational graph of the wrapped model.

Imports graphviz and torchviz lazily and raises an informative
ImportError if these optional dependencies are not installed.
Source code in deep_river/base.py
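
A hedged sketch, assuming `draw` can be called without arguments once the
model has seen at least one example and the optional graphviz/torchviz
dependencies are installed:

>>> dot = model.draw()  # assumed to return a renderable graph object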
load
classmethod

Load a previously saved estimator.

The method reconstructs the estimator class, its wrapped module, optimiser
state and runtime information (feature names, buffers, etc.).
Source code in deep_river/base.py
predict_many

Predict target values for multiple instances (returns a single-column DataFrame).
Source code in deep_river/regression/regressor.py
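
A hedged sketch, assuming `predict_many` accepts a pandas DataFrame whose
columns are the feature names seen during training (reusing `numeric_keys`
and `model` from the Examples above):

>>> import pandas as pd
>>> rows = [{k: x[k] for k in numeric_keys} for x, _ in datasets.Bikes().take(5)]
>>> X_batch = pd.DataFrame(rows)
>>> preds = model.predict_many(X_batch)  # one prediction per row, single-column DataFrame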
predict_one

Predict target value for a single instance.
Source code in deep_river/regression/regressor.py
save

Persist the estimator (architecture, weights, optimiser & runtime state).

Parameters:

Name | Type | Description | Default |
---|---|---|---|
filepath | str \| Path | Destination file. Parent directories are created automatically. | required |
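
A hedged save/load round-trip sketch (the file name is illustrative):

>>> model.save("regressor_checkpoint.pt")                 # persist architecture, weights, optimiser & runtime state
>>> restored = Regressor.load("regressor_checkpoint.pt")  # classmethod documented above
>>> y_hat = restored.predict_one({k: first_x[k] for k in numeric_keys})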