AI Package Hallucination – Abusing ChatGPT, Gemini to Spread Malware
The research investigates the persistence and scale of AI package hallucination, a technique in which LLMs recommend packages that do not exist in any registry; an attacker can then register those hallucinated names and seed them with malware, so a developer who trusts the model's recommendation installs the attacker's package. The Langchain framework has allowed for…
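The defensive idea implied by the attack is simple: never install an LLM-recommended package name without first confirming it actually exists in a trusted registry. The sketch below illustrates that check, assuming a local allowlist as a stand-in for a real registry lookup (in practice you would query the registry itself, e.g. PyPI's JSON API at `https://pypi.org/pypi/<name>/json`); the package names and the `filter_hallucinated` helper are hypothetical.

```python
# Sketch: flag LLM-recommended package names that are absent from a
# trusted registry index, before any `pip install` is attempted.
# KNOWN_PACKAGES is a hypothetical stand-in for a real registry query.
KNOWN_PACKAGES = {"requests", "numpy", "langchain"}

def filter_hallucinated(recommended: list[str]) -> list[str]:
    """Return the recommended names NOT found in the registry index.

    Any name returned here is a candidate hallucination and should be
    treated as a potential squatting target, not installed blindly.
    """
    return [name for name in recommended if name.lower() not in KNOWN_PACKAGES]

suspect = filter_hallucinated(["requests", "totally-fake-helper-pkg"])
print(suspect)  # only the unknown name survives the filter
```

A real deployment would replace the static set with a live registry lookup and could additionally score surviving names for typosquatting similarity to popular packages.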