
scaler

Classes:

Name Description
AnomalyMeanScaler

Wrapper around an anomaly detector that scales the model's output by the incremental mean of previous scores.

AnomalyMinMaxScaler

Wrapper around an anomaly detector that scales the model's output to [0, 1] using rolling min and max metrics.

AnomalyScaler

Wrapper around an anomaly detector that scales the output of the model to account for drift in the wrapped model's anomaly scores.

AnomalyStandardScaler

Wrapper around an anomaly detector that standardizes the model's output using incremental mean and variance metrics.

AnomalyMeanScaler

AnomalyMeanScaler(
    anomaly_detector: AnomalyDetector,
    rolling: bool = True,
    window_size=250,
)

Bases: AnomalyScaler

Wrapper around an anomaly detector that scales the model's output by the incremental mean of previous scores.

Parameters:

Name Type Description Default
anomaly_detector AnomalyDetector

The anomaly detector to wrap.

required
rolling bool

Choose whether the metrics are rolling metrics or not.

True
window_size

The window size used for mean computation if rolling==True.

250

Methods:

Name Description
learn_one

Update the scaler and the underlying anomaly scaler.

score_many

Return scaled anomaly scores based on the raw scores provided by the wrapped anomaly detector.

score_one

Return a scaled anomaly score based on the raw score provided by the wrapped anomaly detector.

Source code in deep_river/anomaly/scaler.py
def __init__(
    self,
    anomaly_detector: AnomalyDetector,
    rolling: bool = True,
    window_size=250,
):
    super().__init__(anomaly_detector=anomaly_detector)
    self.rolling = rolling
    self.window_size = window_size
    self.mean = utils.Rolling(Mean(), self.window_size) if self.rolling else Mean()

learn_one

learn_one(*args) -> None

Update the scaler and the underlying anomaly scaler.

Parameters:

Name Type Description Default
*args

Depends on whether the underlying anomaly detector is supervised or not.

()

Returns:

Type Description
None

Source code in deep_river/anomaly/scaler.py
def learn_one(self, *args) -> None:
    """
    Update the scaler and the underlying anomaly scaler.

    Parameters
    ----------
    *args
        Depends on whether the underlying anomaly detector
        is supervised or not.

    Returns
    -------
    None
    """

    self.anomaly_detector.learn_one(*args)

score_many abstractmethod

score_many(*args) -> ndarray

Return scaled anomaly scores based on the raw scores provided by the wrapped anomaly detector.

A high score is indicative of an anomaly. A low score corresponds to a normal observation.

Parameters:

Name Type Description Default
*args

Depends on whether the underlying anomaly detector is supervised or not.

()

Returns:

Type Description
Scaled anomaly scores. Larger values indicate more anomalous examples.
Source code in deep_river/anomaly/scaler.py
@abc.abstractmethod
def score_many(self, *args) -> np.ndarray:
    """Return scaled anomaly scores based on raw score provided by
    the wrapped anomaly detector.

    A high score is indicative of an anomaly. A low score corresponds
    to a normal observation.

    Parameters
    ----------
    *args
        Depends on whether the underlying anomaly detector is
        supervised or not.

    Returns
    -------
    Scaled anomaly scores. Larger values indicate more anomalous examples.
    """

score_one

score_one(*args)

Return a scaled anomaly score based on the raw score provided by the wrapped anomaly detector. Larger values indicate more anomalous examples.

Parameters:

Name Type Description Default
*args

Depends on whether the underlying anomaly detector is supervised or not.

()

Returns:

Type Description
A scaled anomaly score. Larger values indicate more anomalous examples.
Source code in deep_river/anomaly/scaler.py
def score_one(self, *args):
    """
    Return a scaled anomaly score based on raw score provided by the
    wrapped anomaly detector. Larger values indicate more
    anomalous examples.

    Parameters
    ----------
    *args
        Depends on whether the underlying anomaly detector is
        supervised or not.

    Returns
    -------
    A scaled anomaly score. Larger values indicate more
    anomalous examples.
    """
    raw_score = self.anomaly_detector.score_one(*args)
    mean = self.mean.update(raw_score).get()
    score = raw_score / mean

    return score
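In effect, score_one divides each raw score by the (rolling) mean of the scores seen so far. A minimal stand-alone sketch of that arithmetic, using a plain deque in place of river's Mean/Rolling utilities (the function name and structure here are illustrative, not part of deep_river):

```python
from collections import deque


def mean_scaled_scores(raw_scores, window_size=250):
    """Divide each raw anomaly score by the rolling mean of recent scores,
    mirroring AnomalyMeanScaler's score = raw_score / mean."""
    window = deque(maxlen=window_size)
    scaled = []
    for raw in raw_scores:
        window.append(raw)
        mean = sum(window) / len(window)
        scaled.append(raw / mean)  # like the wrapped code, assumes mean != 0
    return scaled
```

A scaled score near 1 then means "about as anomalous as recent history", while values well above 1 stand out.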

AnomalyMinMaxScaler

AnomalyMinMaxScaler(
    anomaly_detector: AnomalyDetector,
    rolling: bool = True,
    window_size: int = 250,
)

Bases: AnomalyScaler

Wrapper around an anomaly detector that scales the model's output to \([0, 1]\) using rolling min and max metrics.

Parameters:

Name Type Description Default
anomaly_detector AnomalyDetector

The anomaly detector to wrap.

required
rolling bool

Choose whether the metrics are rolling metrics or not.

True
window_size int

The window size used for the metrics if rolling==True.

250

Methods:

Name Description
learn_one

Update the scaler and the underlying anomaly scaler.

score_many

Return scaled anomaly scores based on the raw scores provided by the wrapped anomaly detector.

score_one

Return a scaled anomaly score based on the raw score provided by the wrapped anomaly detector.

Source code in deep_river/anomaly/scaler.py
def __init__(
    self,
    anomaly_detector: AnomalyDetector,
    rolling: bool = True,
    window_size: int = 250,
):
    super().__init__(anomaly_detector)
    self.rolling = rolling
    self.window_size = window_size
    self.min = RollingMin(self.window_size) if self.rolling else Min()
    self.max = RollingMax(self.window_size) if self.rolling else Max()

learn_one

learn_one(*args) -> None

Update the scaler and the underlying anomaly scaler.

Parameters:

Name Type Description Default
*args

Depends on whether the underlying anomaly detector is supervised or not.

()

Returns:

Type Description
None

Source code in deep_river/anomaly/scaler.py
def learn_one(self, *args) -> None:
    """
    Update the scaler and the underlying anomaly scaler.

    Parameters
    ----------
    *args
        Depends on whether the underlying anomaly detector
        is supervised or not.

    Returns
    -------
    None
    """

    self.anomaly_detector.learn_one(*args)

score_many abstractmethod

score_many(*args) -> ndarray

Return scaled anomaly scores based on the raw scores provided by the wrapped anomaly detector.

A high score is indicative of an anomaly. A low score corresponds to a normal observation.

Parameters:

Name Type Description Default
*args

Depends on whether the underlying anomaly detector is supervised or not.

()

Returns:

Type Description
Scaled anomaly scores. Larger values indicate more anomalous examples.
Source code in deep_river/anomaly/scaler.py
@abc.abstractmethod
def score_many(self, *args) -> np.ndarray:
    """Return scaled anomaly scores based on raw score provided by
    the wrapped anomaly detector.

    A high score is indicative of an anomaly. A low score corresponds
    to a normal observation.

    Parameters
    ----------
    *args
        Depends on whether the underlying anomaly detector is
        supervised or not.

    Returns
    -------
    Scaled anomaly scores. Larger values indicate more anomalous examples.
    """

score_one

score_one(*args)

Return a scaled anomaly score based on the raw score provided by the wrapped anomaly detector. Larger values indicate more anomalous examples.

Parameters:

Name Type Description Default
*args

Depends on whether the underlying anomaly detector is supervised or not.

()

Returns:

Type Description
A scaled anomaly score. Larger values indicate more anomalous examples.
Source code in deep_river/anomaly/scaler.py
def score_one(self, *args):
    """
    Return a scaled anomaly score based on raw score provided by the
    wrapped anomaly detector. Larger values indicate more
    anomalous examples.

    Parameters
    ----------
    *args
        Depends on whether the underlying anomaly detector is
        supervised or not.

    Returns
    -------
    A scaled anomaly score. Larger values indicate more
    anomalous examples.
    """
    raw_score = self.anomaly_detector.score_one(*args)
    # Trailing underscores avoid shadowing the built-in min/max.
    min_ = self.min.update(raw_score).get()
    max_ = self.max.update(raw_score).get()
    score = (raw_score - min_) / (max_ - min_)

    return score
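The same idea as a stand-alone sketch: rescale each score to [0, 1] against a rolling window's min and max. The code below is illustrative, not deep_river's API, and unlike the method above it guards the first observation (where min equals max) instead of dividing by zero:

```python
from collections import deque


def min_max_scaled_scores(raw_scores, window_size=250):
    """Map each raw anomaly score to [0, 1] via rolling min/max,
    mirroring AnomalyMinMaxScaler's (raw - min) / (max - min)."""
    window = deque(maxlen=window_size)
    scaled = []
    for raw in raw_scores:
        window.append(raw)
        lo, hi = min(window), max(window)
        # On the first observation lo == hi; emit 0.0 rather than divide by zero.
        scaled.append((raw - lo) / (hi - lo) if hi > lo else 0.0)
    return scaled
```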

AnomalyScaler

AnomalyScaler(anomaly_detector: AnomalyDetector)

Bases: Wrapper, AnomalyDetector

Wrapper around an anomaly detector that scales the output of the model to account for drift in the wrapped model's anomaly scores.

Parameters:

Name Type Description Default
anomaly_detector AnomalyDetector

Anomaly detector to be wrapped.

required

Methods:

Name Description
learn_one

Update the scaler and the underlying anomaly scaler.

score_many

Return scaled anomaly scores based on the raw scores provided by the wrapped anomaly detector.

score_one

Return a scaled anomaly score based on the raw score provided by the wrapped anomaly detector.

Source code in deep_river/anomaly/scaler.py
def __init__(self, anomaly_detector: AnomalyDetector):
    self.anomaly_detector = anomaly_detector

learn_one

learn_one(*args) -> None

Update the scaler and the underlying anomaly scaler.

Parameters:

Name Type Description Default
*args

Depends on whether the underlying anomaly detector is supervised or not.

()

Returns:

Type Description
None

Source code in deep_river/anomaly/scaler.py
def learn_one(self, *args) -> None:
    """
    Update the scaler and the underlying anomaly scaler.

    Parameters
    ----------
    *args
        Depends on whether the underlying anomaly detector
        is supervised or not.

    Returns
    -------
    None
    """

    self.anomaly_detector.learn_one(*args)

score_many abstractmethod

score_many(*args) -> ndarray

Return scaled anomaly scores based on the raw scores provided by the wrapped anomaly detector.

A high score is indicative of an anomaly. A low score corresponds to a normal observation.

Parameters:

Name Type Description Default
*args

Depends on whether the underlying anomaly detector is supervised or not.

()

Returns:

Type Description
Scaled anomaly scores. Larger values indicate more anomalous examples.
Source code in deep_river/anomaly/scaler.py
@abc.abstractmethod
def score_many(self, *args) -> np.ndarray:
    """Return scaled anomaly scores based on raw score provided by
    the wrapped anomaly detector.

    A high score is indicative of an anomaly. A low score corresponds
    to a normal observation.

    Parameters
    ----------
    *args
        Depends on whether the underlying anomaly detector is
        supervised or not.

    Returns
    -------
    Scaled anomaly scores. Larger values indicate more anomalous examples.
    """

score_one abstractmethod

score_one(*args) -> float

Return a scaled anomaly score based on the raw score provided by the wrapped anomaly detector.

A high score is indicative of an anomaly. A low score corresponds to a normal observation.

Parameters:

Name Type Description Default
*args

Depends on whether the underlying anomaly detector is supervised or not.

()

Returns:

Type Description
A scaled anomaly score. Larger values indicate more anomalous examples.
Source code in deep_river/anomaly/scaler.py
@abc.abstractmethod
def score_one(self, *args) -> float:
    """Return a scaled anomaly score based on raw score provided by
    the wrapped anomaly detector.

    A high score is indicative of an anomaly. A low score corresponds
    to a normal observation.

    Parameters
    ----------
    *args
        Depends on whether the underlying anomaly detector
        is supervised or not.

    Returns
    -------
    A scaled anomaly score. Larger values indicate
    more anomalous examples.
    """

AnomalyStandardScaler

AnomalyStandardScaler(
    anomaly_detector: AnomalyDetector,
    with_std: bool = True,
    rolling: bool = True,
    window_size: int = 250,
)

Bases: AnomalyScaler

Wrapper around an anomaly detector that standardizes the model's output using incremental mean and variance metrics.

Parameters:

Name Type Description Default
anomaly_detector AnomalyDetector

The anomaly detector to wrap.

required
with_std bool

Whether to use standard deviation for scaling.

True
rolling bool

Choose whether the metrics are rolling metrics or not.

True
window_size int

The window size used for the metrics if rolling==True.

250

Methods:

Name Description
learn_one

Update the scaler and the underlying anomaly scaler.

score_many

Return scaled anomaly scores based on the raw scores provided by the wrapped anomaly detector.

score_one

Return a scaled anomaly score based on the raw score provided by the wrapped anomaly detector.

Source code in deep_river/anomaly/scaler.py
def __init__(
    self,
    anomaly_detector: AnomalyDetector,
    with_std: bool = True,
    rolling: bool = True,
    window_size: int = 250,
):
    super().__init__(anomaly_detector)
    self.rolling = rolling
    self.window_size = window_size
    self.mean = utils.Rolling(Mean(), self.window_size) if self.rolling else Mean()
    self.sq_mean = (
        utils.Rolling(Mean(), self.window_size) if self.rolling else Mean()
    )
    self.with_std = with_std

learn_one

learn_one(*args) -> None

Update the scaler and the underlying anomaly scaler.

Parameters:

Name Type Description Default
*args

Depends on whether the underlying anomaly detector is supervised or not.

()

Returns:

Type Description
None

Source code in deep_river/anomaly/scaler.py
def learn_one(self, *args) -> None:
    """
    Update the scaler and the underlying anomaly scaler.

    Parameters
    ----------
    *args
        Depends on whether the underlying anomaly detector
        is supervised or not.

    Returns
    -------
    None
    """

    self.anomaly_detector.learn_one(*args)

score_many abstractmethod

score_many(*args) -> ndarray

Return scaled anomaly scores based on the raw scores provided by the wrapped anomaly detector.

A high score is indicative of an anomaly. A low score corresponds to a normal observation.

Parameters:

Name Type Description Default
*args

Depends on whether the underlying anomaly detector is supervised or not.

()

Returns:

Type Description
Scaled anomaly scores. Larger values indicate more anomalous examples.
Source code in deep_river/anomaly/scaler.py
@abc.abstractmethod
def score_many(self, *args) -> np.ndarray:
    """Return scaled anomaly scores based on raw score provided by
    the wrapped anomaly detector.

    A high score is indicative of an anomaly. A low score corresponds
    to a normal observation.

    Parameters
    ----------
    *args
        Depends on whether the underlying anomaly detector is
        supervised or not.

    Returns
    -------
    Scaled anomaly scores. Larger values indicate more anomalous examples.
    """

score_one

score_one(*args)

Return a scaled anomaly score based on the raw score provided by the wrapped anomaly detector. Larger values indicate more anomalous examples.

Parameters:

Name Type Description Default
*args

Depends on whether the underlying anomaly detector is supervised or not.

()

Returns:

Type Description
A scaled anomaly score. Larger values indicate more anomalous examples.
Source code in deep_river/anomaly/scaler.py
def score_one(self, *args):
    """
    Return a scaled anomaly score based on raw score provided by the
    wrapped anomaly detector. Larger values indicate more
    anomalous examples.

    Parameters
    ----------
    *args
        Depends on whether the underlying anomaly detector
        is supervised or not.

    Returns
    -------
    A scaled anomaly score. Larger values indicate more
    anomalous examples.
    """
    raw_score = self.anomaly_detector.score_one(*args)
    mean = self.mean.update(raw_score).get()
    if self.with_std:
        # Variance via the identity Var[X] = E[X^2] - (E[X])^2.
        var = self.sq_mean.update(raw_score**2).get() - mean**2
        score = (raw_score - mean) / var**0.5
    else:
        score = raw_score - mean

    return score
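Concretely, the standardization subtracts the running mean and, when with_std is set, divides by a standard deviation derived from the variance identity E[X^2] - (E[X])^2. A stand-alone sketch over a rolling window (illustrative names, not deep_river's API; it also guards the zero-variance first observation, which the method above does not):

```python
from collections import deque


def standard_scaled_scores(raw_scores, window_size=250, with_std=True):
    """Standardize raw anomaly scores with a rolling mean and a variance
    computed as E[X^2] - (E[X])^2, as in AnomalyStandardScaler.score_one."""
    window = deque(maxlen=window_size)
    scaled = []
    for raw in raw_scores:
        window.append(raw)
        n = len(window)
        mean = sum(window) / n
        if with_std:
            sq_mean = sum(x * x for x in window) / n
            var = sq_mean - mean**2
            # var == 0 on the first observation; emit 0.0 instead of 0/0.
            scaled.append((raw - mean) / var**0.5 if var > 0 else 0.0)
        else:
            scaled.append(raw - mean)
    return scaled
```

With with_std=True the output is a rolling z-score, so values above roughly 2 or 3 mark scores far outside recent behavior.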