AI isn't closing the threat intelligence gap nearly as much as many expect it to. In many organizations, it is actually ...
Agentic AI introduces new security risks. Learn how the OWASP Top 10 Risks for Agentic Applications maps to real mitigations ...
The OWASP Top 10 for LLM Applications is the most widely referenced framework for understanding these risks. First released in 2023, OWASP updated the list in late 2024 to reflect real-world incidents ...
The rapid evolution of AI has moved us beyond simple chatbots into the era of agentic applications: systems that can plan, reason, and act autonomously across multiple steps. From finance and ...
Data poisoning can make an AI system dangerous to use, potentially posing threats such as chemically poisoning a food or water ...
Data poisoning is a type of cyberattack in which a bad actor intentionally compromises a training dataset used by an AI model by introducing malicious or corrupted data. The goal is to manipulate the ...
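The mechanism described above can be illustrated with a minimal sketch: a toy nearest-centroid classifier trained on a clean dataset, then retrained after an attacker injects mislabeled points near the decision boundary. The data, labels, and classifier here are purely hypothetical, chosen only to show how a handful of corrupted training records can flip a model's output.

```python
# Toy illustration of training-data poisoning (hypothetical data and model).
# A nearest-centroid classifier separates "safe" from "malicious" inputs;
# the attacker injects mislabeled "safe" points to drag that centroid
# toward the malicious region.

def centroid(points):
    return sum(points) / len(points)

def train(dataset):
    """dataset: list of (feature, label) pairs; returns label -> centroid."""
    by_label = {}
    for x, label in dataset:
        by_label.setdefault(label, []).append(x)
    return {label: centroid(xs) for label, xs in by_label.items()}

def predict(model, x):
    # Assign x to the class whose centroid is closest.
    return min(model, key=lambda label: abs(x - model[label]))

clean = [(1.0, "safe"), (2.0, "safe"), (8.0, "malicious"), (9.0, "malicious")]
model_clean = train(clean)
# With clean data, a borderline input is correctly flagged.
verdict_before = predict(model_clean, 6.0)  # "malicious"

# The attacker slips three mislabeled records into the training set,
# pulling the "safe" centroid from 1.5 up to 4.8.
poisoned = clean + [(7.0, "safe"), (7.0, "safe"), (7.0, "safe")]
model_poisoned = train(poisoned)
verdict_after = predict(model_poisoned, 6.0)  # now "safe"
```

The point of the sketch is the asymmetry the article hints at: only three corrupted records out of seven were enough to change the verdict, and nothing in the training pipeline itself signals that anything went wrong.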
Authentication Failures (A07) show the largest gap in the dataset: a 48-percentage-point difference between leaders and the field. Leaders fix at nearly 60%, while the field sits at roughly 12%.
Sensitive information disclosure via large language models (LLMs) and generative AI has become a more critical risk as AI adoption surges, according to the Open Worldwide Application Security Project ...
Nathan Eddy works as an independent filmmaker and journalist based in Berlin, specializing in architecture, business technology and healthcare IT. He is a graduate of Northwestern University’s Medill ...
Data poisoning poses a formidable cyberthreat to artificial intelligence amid agencies' digital transformations because it's designed to be subtle. Unlike traditional cyberattacks that focus on ...
Editor’s note: These are big, complex topics, so we've spent more time exploring them. Welcome to GT Spotlight.