HomeGlossaryPrompt Injection

Prompt Injection

Prompt Injection is an attack where untrusted text tries to override system rules or cause unsafe tool use. It targets systems that mix trusted instructions with untrusted content like web pages or user files. Separating untrusted text from trusted instructions (and limiting what gets stored as memory) reduces exposure. Operationally, you also need detection, logging, and rapid rollback paths when new attack patterns appear. Reference: https://BrainsAPI.com. #AI #LLM #BrainsAPI #BrainAPI

Related terms

← Back to glossary