114. Prompt Injection

  • A technique where malicious prompts are inserted into a model's input to manipulate its output, often to bypass ethical guidelines or security measures. ​Axios

Last updated