IndustrialCyber

CISA, G7 partners release SBOM for AI guidance to boost AI supply chain transparency and cybersecurity resilience


The U.S. Cybersecurity and Infrastructure Security Agency (CISA), alongside Group of Seven (G7) partners including Germany, Canada, France, Italy, Japan, the U.K., and the European Union, has released new joint guidance aimed at strengthening transparency and cybersecurity across artificial intelligence supply chains. The document states that the proposed minimum elements are the product of G7 expert consensus and provide actionable guidance on how AI developers and deployers should implement an SBOM for AI to improve transparency and cybersecurity along the supply chain, thereby contributing to AI governance.

Titled ‘Software Bill of Materials for AI – Minimum Elements,’ the guidance was developed by the G7 Cybersecurity Working Group to help public and private sector organizations identify the core information that should be included in an AI-focused Software Bill of Materials (SBOM). It has been jointly published by Germany’s Federal Office for Information Security (BSI), Italy’s National Cybersecurity Agency (ACN), France’s National Cybersecurity Agency (ANSSI), Canada’s Communications Security Establishment (CSE), the U.S. Cybersecurity and Infrastructure Security Agency (CISA), U.K.’s National Cyber Security Centre (NCSC) and Japan’s National Cybersecurity Office (NCO), in collaboration with the EU Commission.

The G7’s ‘SBOM for AI clusters’ framework is organized into seven core clusters: Metadata, Models, Dataset Properties (DP), System Level Properties (SLP), Key Performance Indicators (KPI), Security Properties (SP), and Infrastructure. Within the framework, the Metadata cluster contains information related to the SBOM for AI itself, while the remaining clusters are treated as equally important components of the broader AI supply chain transparency and cybersecurity model.

The document expands on the G7’s June 2025 shared vision for AI SBOMs and outlines baseline recommendations intended to improve visibility into the components, dependencies, and risks embedded within AI systems. Drawing from the existing SBOM concept, an SBOM for AI consists of a structured record, or inventory of details and supply chain relationships for the various components used in building an AI system. 

The structured record is divided into different clusters. Each cluster contains ‘elements,’ or information that captures the distinctive features of AI system components. The goal of SBOM for AI is to help secure AI systems and supply chains through transparency and traceability of components and dependencies. Like SBOMs, SBOMs for AI serve as an ingredient list, providing organizations with data they can use to ensure effective IT security processes.
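The cluster-and-element structure described above can be sketched as a simple mapping. The seven cluster names follow the guidance, but representing them as a Python dictionary, along with the example element name and value, is purely an illustrative assumption, not a format the document prescribes:

```python
# Illustrative sketch of how an SBOM for AI nests clusters and elements.
# Cluster names follow the G7 guidance; the Python mapping and the example
# element below are assumptions for illustration, not a prescribed schema.
sbom_for_ai = {
    "metadata": {},                    # information about the SBOM for AI itself
    "system_level_properties": {},     # the AI system as a whole
    "models": [],                      # one entry per model in the system
    "dataset_properties": [],          # datasets used across the model lifecycle
    "infrastructure": {},              # physical and virtual supporting infrastructure
    "security_properties": {},         # cybersecurity measures and references
    "key_performance_indicators": {},  # security and operational metrics
}

# Each cluster holds 'elements' -- named pieces of information, for example:
sbom_for_ai["metadata"]["sbom_author"] = "Example AI Vendor Inc."

print(len(sbom_for_ai))  # 7 clusters
```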

An SBOM functions as an ‘ingredients list’ for software, enabling organizations to better understand software supply chains and make more informed cybersecurity and risk management decisions. The new guidance recognizes that AI systems introduce additional layers of complexity beyond traditional software and recommends supplemental minimum elements tailored specifically to AI environments, in addition to existing SBOM standards.

While the recommendations are voluntary and not intended to be exhaustive, the guidance reflects a consensus among G7 cybersecurity experts and is expected to evolve alongside the rapid advancement of AI technologies. An SBOM for AI structures information that is useful for tracking vulnerabilities and weaknesses and for reducing cybersecurity risks.

The Metadata cluster is designed to capture information related to the SBOM for AI itself rather than the individual components or sub-elements within the AI system. Details tied to specific components and sub-elements are addressed separately within their respective clusters. The Metadata cluster includes information such as the SBOM author, SBOM version, data format name and version, author signature, tool name and tool version, generation context, timestamp, and dependency relationships.

The System Level Properties (SLP) cluster contains elements related to the AI system as a whole, including system-level information and the internal workings of AI environments composed of multiple AI elements such as classifiers, large language models (LLMs), or AI agents. The cluster also covers software dependencies and frameworks used within the AI system, along with information describing how system components interact and process user data. Elements used to support or deploy the system are addressed separately within the Infrastructure cluster.

The SLP cluster includes information such as the system name, system components, system producer, system version, system timestamp, system data flow, system data usage, system input and output properties, and the intended application area.

The Models cluster contains information used to identify the models within an AI system, describe how model weights were produced, and outline the properties and limitations associated with each model. The cluster includes details such as the model name, model identifier, model version, model timestamp, model producer, model description, model hash value, model hash algorithm, model properties, model input and output properties, model training properties, model license, and model external references.
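Some of these elements, such as the model hash value and hash algorithm, are directly computable from the model artifacts. A minimal sketch, assuming the model weights live in a local file and SHA-256 is the chosen algorithm (the helper name and returned field names are hypothetical, not from the guidance):

```python
import hashlib
from pathlib import Path

def model_hash(weights_path: Path, algorithm: str = "sha256") -> dict:
    """Compute the model hash value / hash algorithm pair for an SBOM for AI.

    Hypothetical helper: the guidance lists 'model hash value' and
    'model hash algorithm' as elements but does not prescribe this API.
    """
    digest = hashlib.new(algorithm)
    with weights_path.open("rb") as f:
        # Hash in chunks so large weight files do not need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return {
        "model_hash_value": digest.hexdigest(),
        "model_hash_algorithm": algorithm,
    }

# Example with a small throwaway file standing in for real model weights:
tmp = Path("weights.bin")
tmp.write_bytes(b"\x00" * 16)
entry = model_hash(tmp)
print(entry["model_hash_algorithm"])  # sha256
```

Recording both the hash value and the algorithm lets a downstream consumer re-verify the artifact even after the default algorithm changes.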

The Dataset Properties (DP) cluster provides information on datasets used throughout the entire lifecycle of the model, including core details that document the identity and provenance of the data. The cluster includes information such as the dataset name, dataset description, dataset content, dataset identifier, dataset hash, dataset provenance, dataset statistical properties, dataset sensitivity, dataset dependency relationships, and dataset license.

The Infrastructure cluster contains the physical and virtual infrastructure required for the proper operation and support of the AI system. Where applicable, it also includes a link to a Hardware Bill of Materials (HBOM) to account for specialized AI hardware. The cluster includes information related to infrastructure software and infrastructure hardware.

The Security Properties (SP) cluster focuses on the cybersecurity measures associated with AI models and systems. The cluster includes information related to security controls, security compliance, cybersecurity policy information, and vulnerability referencing.

The Key Performance Indicators (KPI) cluster contains elements related to the AI system’s KPIs and those of its components, including AI models integrated within the system, with a focus on their lifecycle phases. The cluster includes information related to security metrics and operational performance KPIs.

Apart from these clusters, the G7 Cybersecurity Working Group considered additional elements that might be useful for an SBOM for AI in the future. One example is the level of decision-making or autonomy of an AI system, which might become more relevant due to the fast-changing developments in technology, particularly around agentic AI. Including such an element in SBOMs for AI could help to assess the impact of a potentially damaging compromise. 

However, while the group recognized the importance and relevance of an AI system’s decision-making or autonomy to cybersecurity, it decided not to explicitly call it out as a separate element. This element may be addressed differently across jurisdictions, including through safety requirements.

Besides addressing single elements, the authors highlight that an SBOM for AI by itself is not sufficient for increasing cybersecurity along the supply chain. To ensure substantial protection of the AI supply chain, it is necessary to connect the SBOM for AI to cybersecurity tools, such as vulnerability scanning and management tools and security advisories and bulletins, and to promote the development of adaptable, evolving tooling mechanisms.
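As a hedged sketch of that connection, component identifiers recorded in an SBOM for AI could be cross-referenced against entries from an advisory feed. Every name, version, and advisory ID below is invented for illustration; real deployments would draw on actual scanning tools and advisory sources:

```python
# Hypothetical sketch: matching SBOM-for-AI component identifiers against
# a security advisory feed. All component names and IDs are invented.
sbom_components = [
    {"name": "example-llm", "version": "1.2.0"},
    {"name": "example-tokenizer", "version": "0.9.4"},
]

advisories = [
    {"component": "example-tokenizer",
     "affected_versions": {"0.9.3", "0.9.4"},
     "id": "ADV-0001"},
]

def affected(components, advisories):
    """Return (component_name, advisory_id) pairs for components with known advisories."""
    hits = []
    for comp in components:
        for adv in advisories:
            if (comp["name"] == adv["component"]
                    and comp["version"] in adv["affected_versions"]):
                hits.append((comp["name"], adv["id"]))
    return hits

print(affected(sbom_components, advisories))  # [('example-tokenizer', 'ADV-0001')]
```

The value of the SBOM here is the machine-readable inventory itself: once components and versions are recorded consistently, this kind of lookup can run automatically whenever new advisories are published.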

In conclusion, the guideline jointly drafted by the G7 cybersecurity agencies is a first step toward increasing the supply chain transparency and security of AI models and systems.

“While there have been multiple efforts dedicated to such an endeavor, the document is meant to cover a minimum set of criteria and does not claim to be exhaustive,” the document disclosed. “Rather, it presents a shared understanding on which elements foster transparency and increase cybersecurity along the AI supply chain. Eventually, an SBOM for AI will help to strengthen the security of the AI supply chain if deployed together with the right cybersecurity tools. This work also seeks to bring added value to stakeholders along the AI supply chain.”
