Open-source components power nearly all modern software, but they’re often buried deep in massive codebases—hiding severe vulnerabilities. For years, software bills of materials (SBOMs) have been the security community’s key tool to shine a light on these hidden risks. Yet, despite government advancements in the US and Europe, SBOM adoption in the private sector remains sluggish. Now, some experts warn that the rapid rise of AI-assisted coding could soon eclipse the push to make software supply chains more transparent.
“I’m a strong, strong supporter of SBOM, and yet we have this emerging thing that’s happening that fundamentally undermines everything that we’ve been working towards,” Sounil Yu, chief AI officer of Knostic, told CyberScoop. “It is not a far-away future where we should expect to see a near infinite number of varieties of [vulnerability-free software packages] that AI coding systems are going to generate.”
Yu’s optimistic vision, while shared by some, is roundly rejected by many veteran SBOM and software security experts, who say there will likely never be a day when AI can produce vulnerability-free software.
“People are imagining a future where there are no open-source dependencies or there are no reused dependencies, and therefore there’s nothing to put in an SBOM because every piece of the code is bespoke,” Brian Fox, the co-founder and CTO of Sonatype, told CyberScoop. “I think that’s kind of insane.”
Where SBOM policy stands
Developed under an executive order issued by President Joe Biden, the National Telecommunications and Information Administration (NTIA) released the US government’s first official SBOM guidance, The Minimum Elements For a Software Bill of Materials (SBOM), in July 2021. Responsibility for that foundational effort was subsequently transferred to the Cybersecurity and Infrastructure Security Agency (CISA).
According to Allan Friedman, who is widely considered the “father” of SBOM and spearheaded that document’s creation, Biden’s order was also clearly meant to mandate SBOMs for federal government suppliers under the FAR [Federal Acquisition Regulation], which could have created a transparency floor for all software providers looking to sell into the federal government.
However, neither the National Institute of Standards and Technology (NIST) nor the Office of Management and Budget (OMB) fully spelled out what that requirement would look like, and the hoped-for FAR requirement ended up merely as part of a required software attestation form, according to Friedman, who is now a senior technical adviser at the Institute for Security and Technology (IST).
Two recent developments at CISA have fostered hopes for more widespread and robust SBOMs. On Aug. 22, the agency opened a public comment period for an SBOM guide that aims to update the NTIA document to reflect evolving SBOM practices.
On Sept. 3, CISA, in collaboration with NSA and 19 international partners, released joint guidance outlining the “growing international consensus” for what an SBOM should look like. Participants called the guidance “a significant step forward in strengthening software supply chain transparency and security worldwide.”
As promising as some may find these developments, some experts believe they represent the last vestiges of the Biden administration’s work. Former CISA employee Josh Corman, now an executive in residence for public safety and resilience at IST, told CyberScoop that the minimum elements update and the international framework were actions akin to “the body continuing to move without its head just because of prior commitments to the [Biden] White House.”
SBOM work has stalled under the Trump administration, yet other experts believe there is more to come from CISA. “[CISA official] Nick Andersen and [CISA director nominee] Sean Plankey are both supporters of these initiatives,” NetRise co-founder and CEO Tom Pace told CyberScoop. He added, “I know that directly. I also know that we have multiple contracts with the federal civilian agencies, including CISA, that are moving forward for SBOM.”
CISA insists that it has not slowed its work on SBOM; on the contrary, its efforts have increased.
“We are actively involved in several SBOM-related initiatives, including the G7 Cybersecurity Working Group’s Software Bill of Materials for Artificial Intelligence and the review of nearly 100 public comments on our draft SBOM Minimum Elements,” CISA Director of Public Affairs Marci McCarthy told CyberScoop in a statement. “The recently released Shared Vision of SBOM highlights and reinforces our operational collaboration in action with both international and domestic partners to advance the use of SBOMs.”
Aside from CISA’s actions, other developments at the federal level promise to further advance SBOM. The Consolidated Appropriations Act of 2023 amended the Food, Drug, and Cosmetic Act to mandate SBOMs as part of premarket submissions for medical devices to the FDA. In 2023, the Pentagon issued guidance that contains recommendations for SBOM management as part of the military’s supply chain risk management strategy.
On the international level, the European Parliament adopted the Cyber Resilience Act (CRA) in March 2024, which will require all manufacturers and distributors of digital products to share a top-level SBOM with market surveillance authorities as part of the technical documentation they provide. The legislation calls for these requirements to take effect in December 2027.
Private sector barriers to SBOM adoption
Even with these advancements, most software providers still don’t provide SBOMs, and most organizations don’t demand them from their suppliers. Black Duck’s latest annual analysis found that 86% of commercial codebases contain open-source vulnerabilities, with 81% carrying high- or critical-risk flaws. Meanwhile, 95% of websites continue running outdated software with known issues.
“Surveys are showing that only 30% of people are doing anything about this,” Sonatype’s Fox said. “And that’s largely because it’s optional.”
Corman thinks most organizations find transparency “existentially terrifying.”
“They have license risks where they’re violating terms and conditions of open-source licenses that can be exposed in lawsuits, and they’re not prone to out themselves voluntarily,” he said.
Along the same lines, Steve Springett, chair of the CycloneDX Core Working Group and board vice chair of the OWASP Foundation, told CyberScoop that many organizations fear the legal ramifications of disclosing flaws in their software. “The legal departments in a lot of organizations really don’t want them to unnecessarily disclose more information than what is required for normal business activities.”
Nilesh Jain, co-founder and CEO of cybersecurity startup CleanStart, told CyberScoop, “Most companies that we interact with are still trying to figure out the best way to start generating SBOMs. Some of the largest enterprises and banks and financing institutions still don’t use it.”
Cyber vulnerability expert Art Manion points to the so-called “naming problem”: the same software exists in countless versions spanning multiple years, tracked under inconsistent naming syntaxes, which makes accounting for all of that multiplicity in an SBOM framework overwhelming.
“Fundamentally, we really are still blocked by not uniformly calling software the same things,” Manion told CyberScoop. “No single source can spend enough time or money or be fast enough to collect and name all the software and keep track of it.”
Friedman, however, thinks this naming problem can be solved “with a little bit of intelligence on the pattern-matching side of things. Instead of trying to build a tool that matches exact string to exact string, we can do some fuzzy matching with a little bit of data science,” he said.
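To illustrate the kind of fuzzy matching Friedman describes, the short Python sketch below scores an observed component name against a small catalog using the standard library’s difflib. The catalog entries, observed names, and threshold are hypothetical examples, not any vendor’s actual tooling; production SBOM matching typically works with richer identifiers such as purl or CPE.

```python
# Minimal sketch of fuzzy name matching for SBOM components.
# Catalog entries, observed names, and the 0.6 threshold are hypothetical;
# real tooling usually normalizes identifiers such as purl or CPE first.
from difflib import SequenceMatcher

CATALOG = [
    "openssl",
    "apache-log4j-core",
    "spring-framework",
    "zlib",
]

def normalize(name: str) -> str:
    """Lowercase and unify separators so 'Log4J_Core' resembles 'log4j-core'."""
    return name.lower().replace("_", "-").replace(" ", "-")

def best_match(observed: str, catalog=CATALOG, threshold=0.6):
    """Return the catalog name most similar to the observed name, or None."""
    observed_n = normalize(observed)
    scored = [
        (SequenceMatcher(None, observed_n, normalize(entry)).ratio(), entry)
        for entry in catalog
    ]
    score, candidate = max(scored)
    return candidate if score >= threshold else None

if __name__ == "__main__":
    for raw in ["OpenSSL 3.0", "log4j_core", "SpringFramework", "unknown-lib"]:
        print(f"{raw!r:20} -> {best_match(raw)}")
```

Run as-is, the sketch maps “OpenSSL 3.0” and “log4j_core” to their catalog entries while rejecting “unknown-lib”, which is the basic pattern-matching behavior Friedman argues could blunt the naming problem.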
Will generative AI eliminate the need for SBOMs?
While progress on SBOM is slow, adoption of AI-based coding assistants is surging, along with the hype around them. Some experts believe these tools will reduce or even eliminate software vulnerabilities.
“I’ve created code myself where I’ve instructed my AI coding assistant to go build me some software and not use any software dependencies whatsoever,” Knostic’s Yu told CyberScoop, suggesting that avoiding dependencies on existing code libraries can avoid most, if not all, vulnerabilities. “You can reference the entirety of open source as a template for what to build, but do not actually use any open-source libraries.”
CycloneDX’s Springett agrees with Yu. “It can be done,” he told CyberScoop. “It’s just not being done today, but it can be done. I’ve seen it being done. In the short term, AI is going to propel the number of first-party vulnerabilities that we create. But in the longer term, AI will be a good peer code reviewer and code author, and will always be on the lookout for insecure code and suggest safer alternatives to developers.”
Opinions on whether AI can create vulnerability-free systems are sharply divided. “It’s absolutely not possible,” Manion said. “I have seen no evidence that AI is going to write secure software.”
“That’s basically saying everything we’ve learned in software engineering over the last 60-plus years is just tossed out the window, and none of those things matter,” Sonatype’s Fox said. “If you want to recreate the wheel and make all the same mistakes, good luck, man.”
“I don’t think it’s possible,” Biswajit De, co-founder and CTO of CleanStart, told CyberScoop. “It is physically impossible to give everything in your prompts to create vulnerability-free code.”
Friedman is skeptical as well.
“I have a hard time imagining any tool that is trained in the JavaScript or the node package management system, which is heavily reliant on thousands of dependencies, just then turning around and saying, ‘Well, we can write code without dependencies,’ or if they are writing code, it will use those dependencies in practice,” he told CyberScoop.
He added, “AI-generated code will get better. Anyone who looks at what is being produced today will say, ‘Oh, that’s impressive.’ But large code bases tend to get unwieldy very quickly. You can use AI to try to find and detect vulnerabilities as you write them, but people do that today. There’s nothing magic about AI compared to today’s tools or the future tools.”
