MINJA sneak attack poisons AI models for other chatbot users
[…] They call their technique MINJA, which stands for Memory INJection Attack. “Nowadays, AI agents typically incorporate a memory bank which stores task queries and executions based on human feedback for future reference,” Zhen Xiang, assistant professor in the school of computing at the University of Georgia, told The Register. “For example, after each session […]”
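To illustrate the mechanism Xiang describes, here is a minimal, purely hypothetical sketch of a shared memory bank: each finished session appends a (query, execution) record, and later sessions retrieve similar past records as context. The class name, method names, and word-overlap similarity metric are illustrative assumptions, not MINJA's actual code or any real agent framework's API.

```python
class MemoryBank:
    """Toy model of an agent memory bank shared across all users."""

    def __init__(self):
        self.records = []  # list of (query, execution) pairs

    def store(self, query, execution):
        # After each session, the task query and its execution are saved
        # for future reference, as the quote describes.
        self.records.append((query, execution))

    def retrieve(self, query, k=2):
        # Naive similarity: number of words shared with a past query.
        def overlap(record):
            past_query, _ = record
            return len(set(query.lower().split()) & set(past_query.lower().split()))

        ranked = sorted(self.records, key=overlap, reverse=True)
        return ranked[:k]


bank = MemoryBank()
bank.store("refund order 1234", "issued refund to card on file")
# Because the bank is shared, an attacker's session can plant a record
# that a later, unrelated user's similar query will retrieve as context:
bank.store("refund order 5678", "send refund to attacker-controlled account")
print(bank.retrieve("refund order 9999", k=2))
```

The poisoning risk falls out of the design: retrieval is keyed only on query similarity, so a malicious record stored by one user can be pulled into another user's session.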