- Refactor LLM Alerts section to focus on anomaly-agent approach
- Update description to highlight new agent-based anomaly detection
- Remove references to OpenAI and Anthropic direct integrations
- Add note about future Ollama support
- Adjust formatting and terminology
`README.md`: 7 additions & 10 deletions
@@ -36,7 +36,7 @@ Painless open source anomaly detection for your metrics! 📈📉🚀
 - [Visualization](#visualization)
 - [Concepts](#concepts)
 - [Alerts](#alerts)
-- [LLM Alerts](#llm-alerts)
+- [LLM Agent Alerts](#llm-agent-alerts)
 - [Contributing](#contributing)

 Supported sources and databases for your metrics to live in and be queried from:
@@ -172,7 +172,7 @@ Here is a list of features of Anomstack (emoji alert warning!)
 5. 🛠️ - Ability to define your own custom python preprocess function instead of the default at [`/metrics/defaults/python/preprocess.py`](./metrics/defaults/python/preprocess.py).
 6. 📧 - Email [alerting](#alerts) with fancy(ish) ascii art plots of your metrics and anomaly scores.
 7. 💬 - Slack alerts too (want to make these nicer).
-8. 🤖 - LLM based alerts ([OpenAI](./anomstack/llm/openai.py) & [Anthropic](./anomstack/llm/anthropic.py)) - see [LLM Alerts](#llm-alerts). p.s. they don't work great yet - experimental :)
+8. 🤖 - aGeNtIc LLM based alerts - use an [anomaly-agent](https://github.com/andrewm4894/anomaly-agent) to do anomaly detection and alerting - see [LLM Agent Alerts](#llm-agent-alerts).
 9. 🕒 - Ability to ingest at whatever frequency you want and then agg to a different level for training/scoring, see [`freq`](/metrics/examples/freq/README.md) example.
 10. 📊 - Plot jobs so you can just eyeball your metrics in Dagster job logs, see [#dagster-ui-plots](#dagster-ui-plots).
 11. 🏗️ - Minimal infrastructure requirements, Anomstack just reads from and writes to whatever database you use.
@@ -557,21 +557,18 @@ Below is an example of an alert via email. Attached is a png plot with more deta
 
 
 
-## LLM Alerts
+## LLM Agent Alerts
 
 [back to top](#anomstack)
 
-Yes! I have managed to find a way to ram a large language model (LLM) into this project. But you know what, it might just work...
+Yes! I have managed to find a way to ram a large language ~~model~~ (LLM) 🚀AGENT🚀 into this project.
 
-~~**Update**: It works horribly, but it works! 🤣. Still need to do a lot more prompt engineering to get this to work well, but it's a start.~~
-
-**Update Update**: I know how to make this work much better and more reliable + latest models are better - going to refactor this soon (done [here](https://github.com/andrewm4894/anomstack/pull/127)).
-
-Idea here is to just send the metric data and prompt to a LLM (ChatGPT) and ask it if it thinks the metric looks anomalous (and provide back an explanation). If it does, we alert.
+Idea here is to just send the metric data and prompt to an LLM Agent (built with the [anomaly-agent](https://github.com/andrewm4894/anomaly-agent)) and ask it if it thinks the metric looks anomalous (running a verification chain to double-check and provide back an explanation for each anomaly). If it does, we alert.
 
 Notes:
 - If you don't want to send your metric data to OpenAI then just set `disable_llmalert` to `True` in your metric batch config.
-- Support for Anthropic models added [here](https://github.com/andrewm4894/anomstack/pull/128)
+- Going to add and validate [Ollama](https://ollama.com/) support soon for a local LLM Agent.
+- Will be developing this more in its own library (for use in both Anomstack and any other projects) over at [anomaly-agent](https://github.com/andrewm4894/anomaly-agent) - dig in there for more details.
 
 <details>
 <summary>Click to see some LLM Alert screenshots</summary>
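
To make the agent-based flow added in the hunk above a bit more concrete, here is a minimal, self-contained sketch of the pattern it describes: hand the metric values to an anomaly-detection step, then alert only if anomalies come back and the batch has not opted out via `disable_llmalert`. All names here are illustrative rather than Anomstack's or anomaly-agent's actual API, and the placeholder detection function simply stands in for the LLM Agent call (see the [anomaly-agent](https://github.com/andrewm4894/anomaly-agent) repo for the real interface).

```python
# Illustrative sketch only - function and column names are hypothetical, and the
# simple threshold below merely stands in for the LLM agent call.
import pandas as pd


def detect_anomalies_with_agent(df: pd.DataFrame) -> pd.DataFrame:
    """Placeholder for the agent step: propose anomalies, verify them, explain them."""
    # Stand-in logic so the sketch runs without an LLM: flag points well above the mean.
    threshold = df["metric_value"].mean() + 2 * df["metric_value"].std()
    flagged = df[df["metric_value"] > threshold].copy()
    flagged["explanation"] = "value is well above the recent typical range"
    return flagged


def maybe_send_llm_alert(metric_batch_config: dict, df: pd.DataFrame) -> None:
    # Respect the opt-out from the notes above: if the metric batch config sets
    # disable_llmalert, no metric data is sent anywhere.
    if metric_batch_config.get("disable_llmalert", False):
        return
    anomalies = detect_anomalies_with_agent(df)
    if not anomalies.empty:
        # In Anomstack this would go out via the usual email/Slack alert path;
        # here we just print the flagged points and their explanations.
        print(f"LLM agent alert: {len(anomalies)} anomaly/anomalies found")
        print(anomalies.to_string(index=False))


if __name__ == "__main__":
    df = pd.DataFrame(
        {
            "metric_timestamp": pd.date_range("2024-01-01", periods=8, freq="h"),
            "metric_value": [10.0, 11.0, 9.0, 10.0, 12.0, 11.0, 10.0, 42.0],  # last point spikes
        }
    )
    maybe_send_llm_alert({"disable_llmalert": False}, df)
```

In the real integration the heavy lifting (prompting, the verification chain, and per-anomaly explanations) happens inside the anomaly-agent package rather than a hand-rolled threshold.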