Updated 6 months ago
https://github.com/safellmhub/hguard-go
Guardrails for LLMs: detect and block hallucinated tool calls to improve safety and reliability.
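The core idea — checking LLM-proposed tool calls against the set of tools the application actually registered — can be sketched as below. This is an illustrative sketch only; the `Guard`, `ToolCall`, `NewGuard`, and `Check` names are assumptions for this example, not hguard-go's actual API.

```go
package main

import "fmt"

// ToolCall represents a tool invocation proposed by an LLM.
// (Illustrative shape, not hguard-go's actual type.)
type ToolCall struct {
	Name string
	Args map[string]string
}

// Guard holds the set of tools the application actually exposes.
type Guard struct {
	allowed map[string]struct{}
}

// NewGuard registers the legitimate tool names.
func NewGuard(tools ...string) *Guard {
	g := &Guard{allowed: make(map[string]struct{})}
	for _, t := range tools {
		g.allowed[t] = struct{}{}
	}
	return g
}

// Check rejects calls to tools that were never registered — the
// "hallucinated tool call" failure mode the guardrail targets.
func (g *Guard) Check(c ToolCall) error {
	if _, ok := g.allowed[c.Name]; !ok {
		return fmt.Errorf("blocked hallucinated tool call: %q is not a registered tool", c.Name)
	}
	return nil
}

func main() {
	g := NewGuard("search", "get_weather")

	// A registered tool passes the check.
	fmt.Println(g.Check(ToolCall{Name: "search"}) == nil) // true

	// A tool the model invented is blocked.
	fmt.Println(g.Check(ToolCall{Name: "delete_database"}) == nil) // false
}
```

A real guardrail would also validate argument names and types against each tool's schema before dispatching, but the allow-list check above is the minimal first line of defense.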