zoo
LSTMClassifierInitialized(n_features=10, loss_fn='binary_cross_entropy_with_logits', optimizer_fn='sgd', lr=0.001, output_is_logit=True, is_feature_incremental=False, device='cpu', seed=42, **kwargs)
Bases: RollingClassifierInitialized
A specialized LSTM-based classifier designed for rolling or incremental data classification tasks.

This class leverages LSTM (Long Short-Term Memory) modules to process and classify sequential data. It is built on top of the base RollingClassifierInitialized class, inheriting its functionality for handling incremental learning tasks. Customization options include the loss function, optimizer, learning rate, and other hyperparameters to suit various use cases.
ATTRIBUTE | DESCRIPTION |
---|---|
n_features | Number of features in the input data. Defines the input dimension of the LSTM module. |
loss_fn | Loss function used for model training. Either a predefined string or a callable function. |
optimizer_fn | Optimizer used in training. Either a string naming the optimizer or the optimizer class itself. |
lr | Learning rate of the chosen optimizer. |
output_is_logit | Whether the model output is a raw logit (pre-sigmoid/softmax output). |
is_feature_incremental | Whether the model supports adding new features incrementally. |
device | Device used for computation, e.g. 'cpu' or 'cuda'. |
seed | Random seed for reproducibility of results. |
kwargs | Additional arguments passed during initialization. |
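A minimal usage sketch follows. The import path (deep_river.classification.zoo) and the choice of dataset are assumptions; the learn_one/predict_one calls are the standard river streaming interface inherited from the base classifier.

```python
# Sketch: progressive (test-then-train) use of LSTMClassifierInitialized on a
# binary classification stream. Import path is an assumption; constructor
# arguments mirror the signature documented above.
from river import datasets, metrics, preprocessing
from deep_river.classification.zoo import LSTMClassifierInitialized  # assumed path

dataset = datasets.Phishing()  # river's built-in binary stream with 9 numeric features

model = preprocessing.StandardScaler() | LSTMClassifierInitialized(
    n_features=9,
    loss_fn="binary_cross_entropy_with_logits",
    optimizer_fn="sgd",
    lr=0.001,
    device="cpu",
    seed=42,
)

metric = metrics.Accuracy()
for x, y in dataset:
    y_pred = model.predict_one(x)  # predict before learning (test-then-train)
    metric.update(y, y_pred)
    model.learn_one(x, y)          # update the wrapped LSTM on this example

print(metric)
```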
LogisticRegressionInitialized(n_features=10, loss_fn='binary_cross_entropy_with_logits', optimizer_fn='sgd', lr=0.001, output_is_logit=True, is_feature_incremental=False, device='cpu', seed=42, **kwargs)
Bases: ClassifierInitialized
Logistic Regression model for classification.
PARAMETER | DESCRIPTION |
---|---|
loss_fn | Loss function to be used for training the wrapped model. |
optimizer_fn | Optimizer to be used for training the wrapped model. |
lr | Learning rate of the optimizer. |
output_is_logit | Whether the module produces logits as output. If true, either softmax or sigmoid is applied to the outputs when predicting. |
is_class_incremental | Whether the classifier should adapt to the appearance of previously unobserved classes by adding a unit to the output layer of the network. |
is_feature_incremental | Whether the model should adapt to the appearance of previously unobserved features by adding units to the input layer of the network. |
device | Device to run the wrapped model on. Can be "cpu" or "cuda". |
seed | Random seed to be used for training the wrapped model. |
**kwargs | Additional parameters passed during initialization. |
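The sketch below shows a single online update and prediction. The import path is an assumption based on this page's module name; learn_one and predict_proba_one are the standard river classifier calls, and the feature-expansion behaviour in the last step follows the is_feature_incremental description above.

```python
# Sketch: one train/predict step with LogisticRegressionInitialized.
# Import path assumed from this page's module name.
from deep_river.classification.zoo import LogisticRegressionInitialized  # assumed path

model = LogisticRegressionInitialized(
    n_features=3,
    loss_fn="binary_cross_entropy_with_logits",
    optimizer_fn="sgd",
    lr=0.01,
    is_feature_incremental=True,  # grow the input layer if new features appear
)

x = {"f1": 0.2, "f2": -1.3, "f3": 0.7}
model.learn_one(x, True)            # single online update
print(model.predict_proba_one(x))   # class probabilities (sigmoid of the logit)

# Because is_feature_incremental=True, an example carrying an unseen feature
# should extend the input layer rather than raise an error.
x_new = {"f1": 0.1, "f2": 0.4, "f3": -0.2, "f4": 1.0}
model.learn_one(x_new, False)
```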
MultiLayerPerceptronInitialized(n_features=10, n_width=5, n_layers=5, loss_fn='binary_cross_entropy_with_logits', optimizer_fn='sgd', lr=0.001, output_is_logit=True, is_feature_incremental=False, device='cpu', seed=42, **kwargs)
Bases: ClassifierInitialized
Multi-layer perceptron model for classification.
PARAMETER | DESCRIPTION |
---|---|
loss_fn | Loss function to be used for training the wrapped model. |
optimizer_fn | Optimizer to be used for training the wrapped model. |
lr | Learning rate of the optimizer. |
output_is_logit | Whether the module produces logits as output. If true, either softmax or sigmoid is applied to the outputs when predicting. |
is_class_incremental | Whether the classifier should adapt to the appearance of previously unobserved classes by adding a unit to the output layer of the network. |
is_feature_incremental | Whether the model should adapt to the appearance of previously unobserved features by adding units to the input layer of the network. |
device | Device to run the wrapped model on. Can be "cpu" or "cuda". |
seed | Random seed to be used for training the wrapped model. |
**kwargs | Additional parameters passed during initialization. |
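The sketch below runs a progressive validation over a stream. The import path and the interpretation of n_width and n_layers (hidden-layer width and number of hidden layers, taken from the signature above) are assumptions; evaluate.progressive_val_score is part of river's public API.

```python
# Sketch: progressive validation of MultiLayerPerceptronInitialized.
# Import path assumed; n_width / n_layers interpreted as hidden-layer
# width and depth.
from river import datasets, evaluate, metrics, preprocessing
from deep_river.classification.zoo import MultiLayerPerceptronInitialized  # assumed path

model = preprocessing.StandardScaler() | MultiLayerPerceptronInitialized(
    n_features=9,   # matches the Phishing dataset used below
    n_width=16,     # assumed: units per hidden layer
    n_layers=2,     # assumed: number of hidden layers
    loss_fn="binary_cross_entropy_with_logits",
    optimizer_fn="sgd",
    lr=0.001,
    seed=42,
)

# Test-then-train evaluation over the stream.
report = evaluate.progressive_val_score(
    dataset=datasets.Phishing(),
    model=model,
    metric=metrics.ROCAUC(),
)
print(report)
```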