Researchmemory poisoningprompt injectionllm
Companies Deploy Hidden Prompts To Poison AI Memory
10.0
Relevance Score
Microsoft security researchers reported a growing trend called AI Recommendation Poisoning, where companies embed hidden instructions in 'Summarize with AI' buttons to inject persistent 'remember' prompts into assistants via URL parameters. Over 60 days they identified 50 prompt-based attempts from 31 companies across 14 industries, and Microsoft says it has implemented mitigations in Copilot. The technique can bias recommendations on health, finance, and other critical topics.



