ITSecurityGuru

Black Duck Launches Signal to Tackle the Security Risks of AI-Generated Code


Black Duck has announced the general availability of Black Duck Signal, an agentic AI application security solution designed from the ground up to address the security challenges created by AI-native software development. The launch comes as AI coding assistants move from novelty to norm across enterprise software teams. Industry analysts predict that 90% of enterprise developers will be using AI coding tools by 2028, a shift that is fundamentally changing the volume, velocity, and nature of the code hitting production systems. The problem, according to Black Duck, is that the security tools designed to protect that code have not kept pace.

“AI is no longer just accelerating development, it’s actively authoring software,” said Jason Schmitt, CEO of Black Duck. “Signal unlocks AI-driven development by removing risk and bringing intelligence, determinism and governance to that reality.”

A Different Architecture for a Different Problem

Unlike traditional application security testing (AST) tools that rely on language-specific, rule-based scanning engines, Signal is built on an agentic AI architecture. Rather than a single model, it deploys a coordinated system of specialised AI security agents that work together to analyse code, assess the exploitability of vulnerabilities, prioritise risk, and recommend or automatically apply fixes, reasoning through issues with what Black Duck describes as human-like logic.

Central to that intelligence is ContextAI, Black Duck’s purpose-built application security model, trained on petabytes of human-validated security data accumulated over more than two decades. The company argues that this grounding in real-world security expertise is what separates Signal from general-purpose AI security tools: the agents aren’t just pattern-matching against known signatures, they’re drawing on deep contextual knowledge to make informed judgements about risk and remediation.

That distinction matters particularly for the types of vulnerabilities that are hardest to catch: complex, cross-file dataflow issues, business logic errors, and novel defects that don’t match any existing rule or signature. Signal’s multi-model approach means that different agents are applied at different stages of analysis, with Black Duck claiming each is optimised for the task at hand.

Proof in the Wild

Black Duck has pointed to a concrete real-world example to demonstrate Signal’s capabilities. The company’s Cybersecurity Research Center used Signal to identify a previously undisclosed authentication bypass vulnerability in Gitea, the popular open source Git platform, before it was publicly known. The finding, Black Duck says, illustrates Signal’s ability to surface high-impact logic flaws that conventional tools would miss entirely.

Built for Where Code Actually Gets Written

Signal integrates directly into the tools developers already use, including AI coding assistants, IDEs, and automated pipelines, via the Model Context Protocol (MCP) and APIs. It analyses code continuously as it is written or generated, surfacing issues before they reach a commit rather than flagging them after the fact. Where traditional AST tools are known for high false positive rates that erode developer trust, Signal’s built-in exploitability analysis is designed to filter out non-issues and surface only what genuinely matters.
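The idea of filtering on exploitability can be sketched in a few lines. The field name and threshold below are hypothetical, not Signal's actual scoring; the sketch only shows the principle of surfacing a finding when there is evidence it is exploitable, rather than on every pattern match.

```python
def surface(findings, threshold=0.7):
    """Return only findings whose exploitability score clears the bar.

    Each finding is a dict; findings without an exploitability score
    default to 0.0 and are suppressed as likely noise.
    """
    return [f for f in findings if f.get("exploitability", 0.0) >= threshold]
```

The effect is that developers see a short, high-signal list instead of every raw hit, which is the false-positive problem the article describes.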

Because its intelligence is model-driven rather than rule-driven, Signal is also language- and framework-agnostic from day one. It requires no rule updates, no language packs, and no tuning, meaning organisations are not left waiting for vendor support to catch up with the latest language features or frameworks used in their AI-generated code.

Governance at AI Scale

Beyond detection, Black Duck frames Signal as an enterprise governance tool. As AI coding assistants increasingly design and deliver production software autonomously, organisations face mounting challenges around security, compliance, and trust. Signal is positioned to give security and engineering leaders the visibility and control they need to govern AI-generated software at scale, without sacrificing the development velocity that AI tools are intended to deliver.

Black Duck Signal is now generally available.

