https://github.com/aeternalis-ingenium/anomalytics
The ultimate anomaly detection and its analytics.
Science Score: 39.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ○ CITATION.cff file
- ✓ codemeta.json file: found
- ✓ .zenodo.json file: found
- ✓ DOI references: found 6 DOI reference(s) in README
- ○ Academic publication links
- ○ Committers with academic emails
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (12.6%) to scientific vocabulary
Keywords
Keywords from Contributors
Repository
The ultimate anomaly detection and its analytics.
Basic Info
Statistics
- Stars: 5
- Watchers: 1
- Forks: 1
- Open Issues: 4
- Releases: 11
Topics
Metadata Files
README.md
Anomalytics
Your Ultimate Anomaly Detection & Analytics Tool
Introduction
anomalytics is a Python library that aims to implement statistical methods for detecting anomalies of any kind, e.g. extreme events, high or low anomalies, etc. This library utilises the following external dependencies:
- Pandas 2.1.1
- NumPy 1.26.0
- SciPy 1.11.3
- Matplotlib 3.8.2
- Pytest-Cov 4.1.0
- Black 23.10.0
- Isort 5.12.0
- MyPy 1.6.1
- Bandit 1.7.5
anomalytics supports the following Python versions: 3.10.x, 3.11.x, and 3.12.0.
Installation
To use the library, install it as follows:

```shell
# Install without openpyxl
$ pip3 install anomalytics

# Install with openpyxl
$ pip3 install "anomalytics[extra]"
```
As a contributor/collaborator, you may want to install all external dependencies for development purposes:

```shell
# Install bandit, black, isort, mypy, openpyxl, pre-commit, and pytest-cov
$ pip3 install "anomalytics[codequality,docs,security,testcov,extra]"
```
Use Case
anomalytics can be used to analyze anomalies in your dataset (either a pandas.DataFrame or a pandas.Series). To start, let's follow along with this minimal example where we want to detect extremely high anomalies in our dataset.
Read the walkthrough below, or see the concrete examples here:

- Extreme Anomaly Analysis - DataFrame
- Battery Water Level Analysis - Time Series
Anomaly Detection via the Detector Instance
Import `anomalytics` and initialise our time series of 100_002 rows:

```python
import anomalytics as atics

df = atics.read_ts("./ad_impressions.csv", "csv")
df.head()
```

```shell
              datetime      xandr        gam      adobe
0  2023-10-18 09:01:00  52.483571  71.021131  35.681915
1  2023-10-18 09:02:00  49.308678  73.651996  60.347246
2  2023-10-18 09:03:00  53.238443  65.690813  48.120805
3  2023-10-18 09:04:00  57.615149  80.944393  59.550775
4  2023-10-18 09:05:00  48.829233  76.445099  26.710413
```
Initialize the needed detector object. Each detector utilises a different statistical method for detecting anomalies. In this example, we'll use the POT method and a "high" anomaly type. Pay attention to the time period that is created automatically: `t2` is 1 by default because "real-time" always targets the "now" period, hence 1 (sec, min, hour, day, week, month, etc.):

```python
pot_detector = atics.get_detector(method="POT", dataset=df, anomaly_type="high")

print(f"T0: {pot_detector.t0}")
print(f"T1: {pot_detector.t1}")
print(f"T2: {pot_detector.t2}")

pot_detector.plot(ptype="line-dataset-df", title="Page Impressions Dataset", xlabel="Minute", ylabel="Impressions", alpha=1.0)
```

```shell
T0: 42705
T1: 16425
T2: 6570
```
The purpose of using the detector object instead of the standalone functions is to have a simple, fixed detection flow. In case you want to customize the time window, you can call `reset_time_window()` to reset the `t2` value, even though that defeats the purpose of using a detector object. Pay attention to the period parameters, because the method expects a percentage representation of the period distribution (ranging from 0.0 to 1.0):

```python
pot_detector.reset_time_window(
    "historical",
    t0_pct=0.65,
    t1_pct=0.25,
    t2_pct=0.1,
)

print(f"T0: {pot_detector.t0}")
print(f"T1: {pot_detector.t1}")
print(f"T2: {pot_detector.t2}")

pot_detector.plot(ptype="hist-dataset-df", title="Dataset Distributions", xlabel="Distributions", ylabel="Page Impressions", alpha=1.0, bins=100)
```

```shell
T0: 65001
T1: 25001
T2: 10000
```
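The percentage-based split above can be sketched as follows. This is a minimal illustration, not the library's implementation: the helper name and the exact rounding are assumptions, so the resulting window sizes may differ from anomalytics' by a row or two.

```python
# Hypothetical helper: split a dataset of `total_rows` rows into three
# consecutive windows according to the given fractions. The exact
# rounding used by anomalytics may differ slightly.
def split_time_window(total_rows: int, t0_pct: float, t1_pct: float, t2_pct: float):
    t2 = int(total_rows * t2_pct)
    t1 = int(total_rows * t1_pct)
    t0 = total_rows - t1 - t2  # remainder forms the fitting window
    return t0, t1, t2

print(split_time_window(100_002, t0_pct=0.65, t1_pct=0.25, t2_pct=0.1))
```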
Now we can extract exceedances by giving the expected `quantile`:

```python
pot_detector.get_extremes(0.95)
pot_detector.exceedance_thresholds.head()
```

```shell
       xandr        gam      adobe            datetime
0  58.224653  85.177029  60.362306 2023-10-18 09:01:00
1  58.224653  85.177029  60.362306 2023-10-18 09:02:00
2  58.224653  85.177029  60.362306 2023-10-18 09:03:00
3  58.224653  85.177029  60.362306 2023-10-18 09:04:00
4  58.224653  85.177029  60.362306 2023-10-18 09:05:00
```

Let's visualize the exceedances and their thresholds to get a clearer understanding of our dataset:

```python
pot_detector.plot(ptype="line-exceedance-df", title="Peaks Over Threshold", xlabel="Minute", ylabel="Page Impressions", alpha=1.0)
```
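The quantile-based thresholding idea can be sketched in plain pandas. This is a minimal sketch of the concept, not anomalytics' code: the threshold is the q-th quantile of the series, and exceedances are the amounts by which observations surpass it.

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for one column of the page-impressions dataset.
rng = np.random.default_rng(0)
series = pd.Series(rng.normal(50, 5, size=1_000))

threshold = series.quantile(0.95)            # the "expected quantile"
exceedances = (series - threshold).clip(lower=0)  # zero where below threshold

print(f"threshold={threshold:.2f}, n_exceedances={int((exceedances > 0).sum())}")
```

With a 0.95 quantile, roughly 5% of observations end up as non-zero exceedances, which matches the intuition behind the zero-heavy output above.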
Now that we have the exceedances, we can fit our data into the chosen distribution, in this example the "Generalized Pareto Distribution". The first couple of rows will be zeroes, which is normal because we only fit data greater than zero into the chosen distribution:

```python
pot_detector.fit()
pot_detector.fit_result.head()
```

```shell
   xandr_anomaly_score  gam_anomaly_score  adobe_anomaly_score  total_anomaly_score            datetime
0             1.087147           0.000000             0.000000             1.087147 2023-11-17 00:46:00
1             0.000000           0.000000             0.000000             0.000000 2023-11-17 00:47:00
2             0.000000           0.000000             0.000000             0.000000 2023-11-17 00:48:00
3             0.000000           1.815875             0.000000             1.815875 2023-11-17 00:49:00
4             0.000000           0.000000             0.000000             0.000000 2023-11-17 00:50:00
...
```

Let's inspect the GPD distributions to get an intuition of our Pareto distribution:

```python
pot_detector.plot(ptype="hist-gpd-df", title="GPD - PDF", xlabel="Page Impressions", ylabel="Density", alpha=1.0, bins=100)
```
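The fitting step itself can be sketched directly with SciPy. This is an illustration of the underlying idea, not anomalytics' internals; the shape and scale values are assumptions. Note that in the fitted `params` below, `anomaly_score` equals `1 / p_value` (1/0.9198 ≈ 1.0871), consistent with a survival-probability-based score.

```python
import numpy as np
from scipy.stats import genpareto

# Synthetic positive exceedances, roughly echoing the fitted parameters below.
rng = np.random.default_rng(42)
exceedances = genpareto.rvs(c=-0.1, scale=2.3, size=3_000, random_state=rng)

# Fit a GPD with loc fixed at 0, as in the `params` dict below.
c_hat, loc_hat, scale_hat = genpareto.fit(exceedances, floc=0)

# Score a new exceedance by the inverse of its survival probability.
p_value = genpareto.sf(2.0, c_hat, loc=loc_hat, scale=scale_hat)
anomaly_score = 1.0 / p_value
print(f"c={c_hat:.3f}, scale={scale_hat:.3f}, score={anomaly_score:.3f}")
```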
The fitted parameters are stored inside the detector class:

```python
pot_detector.params
```

```shell
{0: {'xandr': {'c': -0.11675297447288158,
   'loc': 0,
   'scale': 2.3129766056305603,
   'p_value': 0.9198385927065513,
   'anomaly_score': 1.0871472537998},
  'gam': {'c': 0.0, 'loc': 0.0, 'scale': 0.0, 'p_value': 0.0, 'anomaly_score': 0.0},
  'adobe': {'c': 0.0, 'loc': 0.0, 'scale': 0.0, 'p_value': 0.0, 'anomaly_score': 0.0},
  'total_anomaly_score': 1.0871472537998},
 1: {'xandr': {'c': 0.0, 'loc': 0.0, 'scale': 0.0, 'p_value': 0.0, 'anomaly_score': 0.0},
  'gam': {'c': 0.0, 'loc': 0.0, 'scale': 0.0, 'p_value': 0.0,
  ...
  'scale': 0.0, 'p_value': 0.0, 'anomaly_score': 0.0},
  'total_anomaly_score': 0.0},
 ...}
```

Last but not least, we can now detect the extremely large (high) anomalies:
```python
pot_detector.detect(0.95)
pot_detector.detection_result
```

```shell
16425    False
16426    False
16427    False
16428    False
16429    False
         ...
22990    False
22991    False
22992    False
22993    False
22994    False
Name: detected data, Length: 6570, dtype: bool
```

Now we can visualize the anomaly scores from the fitting, together with the anomaly threshold, to get a sense of the extremely large values:

```python
pot_detector.plot(ptype="line-anomaly-score-df", title="Anomaly Score", xlabel="Minute", ylabel="Page Impressions", alpha=1.0)
```
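The detection mechanics can be sketched as follows. This is an assumption about how such a step typically works, not the library's code: take a high quantile of the anomaly scores in the T1 window as the threshold, then flag T2 scores that exceed it.

```python
import numpy as np
import pandas as pd

# Synthetic anomaly scores; the first 400 act as "T1", the rest as "T2".
rng = np.random.default_rng(1)
scores = pd.Series(rng.exponential(scale=2.0, size=500))

anomaly_threshold = scores.iloc[:400].quantile(0.95)  # threshold from "T1" scores
detected = scores.iloc[400:] > anomaly_threshold      # boolean flags over "T2"
print(f"threshold={anomaly_threshold:.2f}, flagged={int(detected.sum())}/{detected.size}")
```

The result is a boolean Series over the detection window, mirroring the `detection_result` output above.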
Now what? While the detection process seems quite straightforward, in most cases getting the details of each anomalous data point is quite tedious! That's why `anomalytics` provides a convenient method to get a summary of the detection, so we can see when, in which row, and what the actual anomalous data look like:

```python
pot_detector.detection_summary.head(5)
```

```shell
                       row      xandr        gam      adobe  xandr_anomaly_score  gam_anomaly_score  adobe_anomaly_score  total_anomaly_score  anomaly_threshold
2023-11-28 12:06:00  59225  64.117135  76.425925  47.772929            21.445759           0.000000             0.000000            21.445759          19.689885
2023-11-28 12:25:00  59244  40.513415  94.526021  65.921644             0.000000          19.557962             2.685337            22.243299          19.689885
2023-11-28 12:45:00  59264  52.362039  54.191719  79.972860             0.000000           0.000000            72.313273            72.313273          19.689885
2023-11-28 16:48:00  59507  64.753203  70.344142  42.540168            32.543021           0.000000             0.000000            32.543021          19.689885
2023-11-28 16:53:00  59512  35.912221  52.572939  75.621003             0.000000           0.000000            22.199505            22.199505          19.689885
```

In every good analysis there is a test! We can evaluate our analysis result with the one-sample "Kolmogorov-Smirnov" test to see how far the statistical distance is between the observed sample distribution and the theoretical distribution given by the fitted parameters (the smaller the `stats_distance`, the better!):

```python
pot_detector.evaluate(method="ks")
pot_detector.evaluation_result
```

```shell
  column  total_nonzero_exceedances  stats_distance   p_value         c  loc     scale
0  xandr                       3311        0.012901  0.635246 -0.128561    0  2.329005
1    gam                       3279        0.011006  0.817674 -0.140479    0  3.852574
2  adobe                       3298        0.019479  0.161510 -0.133019    0  6.007833
```

If one test is not enough for evaluation, we can also visually test our analysis result with the "Quantile-Quantile Plot" method to observe the sample quantiles vs. the theoretical quantiles:
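The KS evaluation idea can be sketched with SciPy on synthetic data. The shape and scale values below are illustrative assumptions that loosely echo the evaluation table above; this is not anomalytics' implementation.

```python
import numpy as np
from scipy.stats import genpareto, kstest

# Synthetic non-zero exceedances drawn from a GPD.
rng = np.random.default_rng(7)
sample = genpareto.rvs(c=-0.13, scale=2.33, size=3_300, random_state=rng)

# Fit with loc fixed at 0, then run a one-sample KS test against the fit.
c_hat, loc_hat, scale_hat = genpareto.fit(sample, floc=0)
result = kstest(sample, genpareto(c_hat, loc=loc_hat, scale=scale_hat).cdf)
print(f"stats_distance={result.statistic:.4f}, p_value={result.pvalue:.4f}")
```

A small statistic and a large p-value indicate the fitted GPD describes the exceedances well, as in the table above.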
```python
# Use the last non-zero parameters
pot_detector.evaluate(method="qq")

# Use random non-zero parameters
pot_detector.evaluate(method="qq", is_random=True)
```

Anomaly Detection via Standalone Functions
You have a project that only needs fitting? Or only detection? Don't worry! anomalytics also provides standalone functions in case users want to start the anomaly analysis from a different starting point. This is more flexible, but much of the processing needs to be done by you. Let's take an example with a different dataset, this time a water level time series!
Import `anomalytics` and initialise your time series:

```python
import anomalytics as atics

ts = atics.read_ts("water_level.csv", "csv")
ts.head()
```

```shell
2008-11-03 06:00:00    0.219
2008-11-03 07:00:00   -0.041
2008-11-03 08:00:00   -0.282
2008-11-03 09:00:00   -0.368
2008-11-03 10:00:00   -0.400
Name: Water Level, dtype: float64
```

Set the time windows of t0, t1, and t2 to compute the dynamic expanding period for calculating the threshold via quantile:
```python
t0, t1, t2 = atics.set_time_window(
    total_rows=ts.shape[0],
    method="POT",
    analysis_type="historical",
    t0_pct=0.65,
    t1_pct=0.25,
    t2_pct=0.1,
)

print(f"T0: {t0}")
print(f"T1: {t1}")
print(f"T2: {t2}")
```

```shell
T0: 65001
T1: 25001
T2: 10000
```

Extract the exceedances, indicating the `"high"` anomaly type and the desired `quantile`:

```python
pot_thresholds = atics.get_threshold_peaks_over_threshold(dataset=ts, t0=t0, anomaly_type="high", q=0.90)
pot_exceedances = atics.get_exceedance_peaks_over_threshold(
    dataset=ts, threshold_dataset=pot_thresholds, anomaly_type="high"
)

pot_exceedances.head()
```

```shell
2008-11-03 06:00:00    0.859
2008-11-03 07:00:00    0.859
2008-11-03 08:00:00    0.859
2008-11-03 09:00:00    0.859
2008-11-03 10:00:00    0.859
Name: Water Level, dtype: float64
```

Compute the anomaly score for each exceedance and initialize a `params` dict for further analysis and evaluation:
```python
params = {}
anomaly_scores = atics.get_anomaly_score(
    exceedance_dataset=pot_exceedances, t0=t0, gpd_params=params
)

anomaly_scores.head()
```

```shell
2016-04-03 15:00:00    0.0
2016-04-03 16:00:00    0.0
2016-04-03 17:00:00    0.0
2016-04-03 18:00:00    0.0
2016-04-03 19:00:00    0.0
Name: anomaly scores, dtype: float64
...
```

Inspect the parameters:

```python
params
```

```shell
{0: {'index': Timestamp('2016-04-03 15:00:00'),
  'c': 0.0,
  'loc': 0.0,
  'scale': 0.0,
  'p_value': 0.0,
  'anomaly_score': 0.0},
 1: {'index': Timestamp('2016-04-03 16:00:00'),
  ...
  'c': 0.0, 'loc': 0.0, 'scale': 0.0, 'p_value': 0.0, 'anomaly_score': 0.0},
 ...}
```

Detect anomalies:
```python
anomaly_threshold = atics.get_anomaly_threshold(
    anomaly_score_dataset=anomaly_scores, t1=t1, q=0.90
)
detection_result = atics.get_anomaly(
    anomaly_score_dataset=anomaly_scores, threshold=anomaly_threshold, t1=t1
)

detection_result.head()
```

```shell
2020-03-31 19:00:00    False
2020-03-31 20:00:00    False
2020-03-31 21:00:00    False
2020-03-31 22:00:00    False
2020-03-31 23:00:00    False
Name: anomalies, dtype: bool
```

For testing, the Kolmogorov-Smirnov test and the QQ plot are also accessible via standalone functions, but the `params` need to be processed so they only contain non-zero parameters, since there is no reason to evaluate a zero 😂
```python
nonzero_params = []

for row in range(0, t1 + t2):
    if (
        params[row]["c"] != 0
        or params[row]["loc"] != 0
        or params[row]["scale"] != 0
    ):
        nonzero_params.append(params[row])

ks_result = atics.evals.ks_1sample(
    dataset=pot_exceedances, stats_method="POT", fit_params=nonzero_params
)

ks_result
```

```shell
{'total_nonzero_exceedances': [5028],
 'stats_distance': [0.0284],
 'p_value': [0.8987],
 'c': [0.003566],
 'loc': [0],
 'scale': [0.140657]}
```

Visualize via QQ plot:
```python
nonzero_exceedances = pot_exceedances[pot_exceedances.values > 0]

visualize_qq_plot(
    dataset=nonzero_exceedances,
    stats_method="POT",
    fit_params=nonzero_params,
)
```
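What a Q-Q comparison computes can be sketched numerically (plotting omitted). The GPD parameters below are illustrative assumptions echoing the KS output above, not values from the library:

```python
import numpy as np
from scipy.stats import genpareto

# Synthetic non-zero exceedances drawn from a GPD.
rng = np.random.default_rng(3)
exceedances = genpareto.rvs(c=0.0036, scale=0.1407, size=5_000, random_state=rng)

# Sorted sample values vs. theoretical quantiles at matching probabilities.
probs = (np.arange(1, exceedances.size + 1) - 0.5) / exceedances.size
sample_q = np.sort(exceedances)
theoretical_q = genpareto.ppf(probs, c=0.0036, scale=0.1407)

# Data truly from the distribution hugs the line y = x, i.e. the two
# quantile sequences are almost perfectly correlated.
corr = np.corrcoef(sample_q, theoretical_q)[0, 1]
print(f"quantile correlation: {corr:.4f}")
```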
Sending Anomaly Notification
An anomaly, you said? Don't worry, anomalytics can send an alert via e-mail or Slack. Just ensure that you have your email password or Slack webhook ready. This example shows both applications (please read the comments 😎):
Initialize the wanted platform:
```python
# Gmail
gmail = atics.get_notification(
    platform="email",
    sender_address="my-cool-email@gmail.com",
    password="AIUEA13",
    recipient_addresses=["my-recipient-1@gmail.com", "my-recipient-2@web.de"],
    smtp_host="smtp.gmail.com",
    smtp_port=876,
)

# Slack
slack = atics.get_notification(
    platform="slack",
    webhook_url="https://slack.com/my-slack/YOUR/SLACK/WEBHOOK",
)

print(gmail)
print(slack)
```
```shell
'Email Notification'
'Slack Notification'
```

Prepare the data for the notification! If you use the standalone functions, you need to process the `detection_result` into a DataFrame with `row`, etc.:

```python
import pandas as pd

# Standalone
detected_anomalies = detection_result[detection_result.values == True]
anomalous_data = ts[detected_anomalies.index]
standalone_detection_summary = pd.DataFrame(
    index=anomalous_data.index.flatten(),
    data=dict(
        row=[ts.index.get_loc(index) + 1 for index in anomalous_data.index],
        anomalous_data=[data for data in anomalous_data.values],
        anomaly_score=[score for score in anomaly_scores[anomalous_data.index].values],
        anomaly_threshold=[anomaly_threshold] * anomalous_data.shape[0],
    )
)

# Detector Instance
detector_detection_summary = pot_detector.detection_summary
```
Prepare the notification payload and a custom message if needed:

```python
# Email
gmail.setup(
    detection_summary=detection_summary,
    message="Extremely large anomaly detected! From Ad Impressions Dataset!"
)

# Slack
slack.setup(
    detection_summary=detection_summary,
    message="Extremely large anomaly detected! From Ad Impressions Dataset!"
)
```
Send your notification! Beware that scheduling is not implemented, since it always depends on the logic of the use case:

```python
# Email
gmail.send

# Slack
slack.send
```

```shell
'Notification sent successfully.'
```

Check your email or Slack; this example produces the following notification via Slack:

References
Nakamura, C. (2021, July 13). On Choice of Hyper-parameter in Extreme Value Theory Based on Machine Learning Techniques. arXiv:2107.06074 [cs.LG]. https://doi.org/10.48550/arXiv.2107.06074
Davis, N., Raina, G., & Jagannathan, K. (2019). LSTM-Based Anomaly Detection: Detection Rules from Extreme Value Theory. In Proceedings of the EPIA Conference on Artificial Intelligence 2019. https://doi.org/10.48550/arXiv.1909.06041
Arian, H., Poorvasei, H., Sharifi, A., & Zamani, S. (2020, November 13). The Uncertain Shape of Grey Swans: Extreme Value Theory with Uncertain Threshold. arXiv:2011.06693v1 [econ.GN]. https://doi.org/10.48550/arXiv.2011.06693
Kalliantzis, Y. (n.d.). Detect Outliers: Expert Outlier Detection and Insights. Retrieved December 4, 2023, from https://detectoutliers.com/
Wall of Fame
I am deeply grateful to have met and been guided by wonderful people who inspired me to finish my capstone project for my study at CODE University of Applied Sciences in Berlin (2023). Thank you so much for being you!
- Sabrina Lindenberg
- Adam Roe
- Alessandro Dolci
- Christian Leschinski
- Johanna Kokocinski
- Peter Krauß
Owner
- Name: N. L.
- Login: Aeternalis-Ingenium
- Kind: user
- Location: Berlin
- Company: PLENO
- Repositories: 10
- Profile: https://github.com/Aeternalis-Ingenium
GitHub Events
Total
- Push event: 17
Last Year
- Push event: 17
Committers
Last synced: over 1 year ago
Top Committers
| Name | Email | Commits |
|---|---|---|
| N. L | 1****m | 27 |
| dependabot[bot] | 4****] | 8 |
| pre-commit-ci[bot] | 6****] | 5 |
| N. L. | n****o@p****h | 1 |
| Adam Roe | 1****e | 1 |
Committer Domains (Top 20 + Academic)
Issues and Pull Requests
Last synced: 6 months ago
All Time
- Total issues: 11
- Total pull requests: 76
- Average time to close issues: 2 days
- Average time to close pull requests: 5 days
- Total issue authors: 2
- Total pull request authors: 4
- Average comments per issue: 0.09
- Average comments per pull request: 0.76
- Merged pull requests: 41
- Bot issues: 1
- Bot pull requests: 49
Past Year
- Issues: 0
- Pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Issue authors: 0
- Pull request authors: 0
- Average comments per issue: 0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
- Aeternalis-Ingenium (10)
- dependabot[bot] (1)
Pull Request Authors
- dependabot[bot] (41)
- Aeternalis-Ingenium (26)
- pre-commit-ci[bot] (8)
- DrAdamRoe (1)
Top Labels
Issue Labels
Pull Request Labels
Packages
- Total packages: 1
- Total downloads: 131 last-month (pypi)
- Total dependent packages: 0
- Total dependent repositories: 0
- Total versions: 12
- Total maintainers: 1
pypi.org: anomalytics
The ultimate anomaly detection library.
- Documentation: https://anomalytics.readthedocs.io/
- License: MIT License Copyright (c) 2023 Nino Lindenberg Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
- Latest release: 0.2.2 (published about 2 years ago)
Rankings
Maintainers (1)
Dependencies
- matplotlib >=3.7.2
- numpy >=1.25.2
- pandas >=2.0.3
- scipy >=1.10.4
- actions/checkout v4.1.0 composite
- actions/setup-python v4.7.0 composite
- actions/checkout v4.1.0 composite
- actions/download-artifact v3 composite
- actions/setup-python v4.7.0 composite
- actions/upload-artifact v3 composite
- pypa/gh-action-pypi-publish release/v1 composite
- sigstore/gh-action-sigstore-python v1.2.3 composite
- actions/checkout v4.1.0 composite
- actions/download-artifact v3 composite
- actions/setup-python v4.7.0 composite
- actions/upload-artifact v3 composite
- pypa/gh-action-pypi-publish release/v1 composite
- actions/checkout v4.1.0 composite
- actions/setup-python v4.7.0 composite
- actions/checkout v4.1.0 composite
- actions/setup-python v4.7.0 composite
- codecov/codecov-action v3.1.4 composite