https://github.com/safellmhub/hguard-go

Guardrails for LLMs: detect and block hallucinated tool calls to improve safety and reliability.