Prompt Injection Technique

From GM-RKB
Jump to navigation Jump to search

A Prompt Injection Technique is a command-inserting context-hijacking malicious prompt engineering technique that injects unauthorized instructions into AI system prompts to override intended behaviors.