Garak – An Open Source LLM Vulnerability Scanner for AI Red-Teaming
Garak is a free, open-source tool designed to test the robustness and reliability of Large Language Models (LLMs). Inspired by utilities such as Nmap and Metasploit, Garak identifies potential weak points in LLMs by probing for issues such as prompt injection, data leakage, jailbreaks, hallucination, and toxic output generation.
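As a quick illustration (a minimal sketch based on garak's documented command-line interface; the model name and probe selection below are placeholder choices, and an OpenAI API key is expected in the OPENAI_API_KEY environment variable per the project README), a scan can be launched as follows:

    # list the available probe plugins
    python -m garak --list_probes

    # run the prompt-injection probes against an OpenAI-hosted model
    python -m garak --model_type openai --model_name gpt-3.5-turbo --probes promptinject

The first command enumerates the probe modules garak ships with; the second points the scanner at a target model and runs one probe family against it, recording its findings in a report file for later review.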