Government Cyber Agencies Release Guidance on Software Bills of Materials for AI
In a significant step toward strengthening cybersecurity across the artificial intelligence (AI) supply chain, multiple government cyber agencies have released a resource outlining the essential components of Software Bills of Materials (SBOMs) tailored for AI. The initiative aims to increase transparency and promote best practices among stakeholders in both the public and private sectors, ultimately fortifying the integrity of AI systems and their supply chains.
The document, titled "Software Bill of Materials (SBOM) for Artificial Intelligence – Minimum Elements," was published on May 12 and marks a collaborative effort by the G7 Cybersecurity Working Group. Building on discussions begun in June 2025 around a shared vision for SBOMs in AI, the publication represents a concerted effort to develop a structured framework for understanding and improving the cybersecurity posture of AI systems.
Core Framework: The Seven SBOM for AI Clusters
Central to the paper’s guidance is the introduction of seven distinct "clusters" of elements that serve as potential standards for both producers and users of AI systems. These clusters are intended to encapsulate vital components necessary for comprehensive documentation and assessment of AI supply chains. The seven clusters are:
- Metadata: This cluster houses information about the SBOM itself, rather than the individual components or sub-elements of an AI system, allowing users to understand the framework within which the AI operates.
- System Level Properties (SLP): This cluster contains critical data about the overall AI system, including software dependencies and frameworks, as well as details on how various components interact and process user data.
- Models: This cluster provides essential identification information for the models used within the AI system, describes how their weights are generated, and outlines their properties and limitations.
- Dataset Properties (DP): The DP cluster informs users about the datasets employed throughout the model's lifecycle, emphasizing the identity and provenance of the data used.
- Key Performance Indicators (KPI): This cluster covers performance indicators for the AI system and its components, with particular emphasis on the lifecycle phases of integrated AI models.
- Infrastructure: This cluster details the physical and virtual infrastructure needed to operate and support the AI system. It may include references to a Hardware Bill of Materials (HBOM) covering specialized AI hardware.
- Security Properties (SP): The final cluster centers on cybersecurity measures pertinent to AI models and systems, ensuring that security is a foundational aspect of the AI supply chain.
The document emphasizes that, although the Metadata cluster contains specific details about the SBOM for AI itself, the other clusters carry equal weight and significance.
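To make the framework concrete, the seven clusters can be pictured as one structured record per AI system. The sketch below is purely illustrative: the field names, values, and JSON layout are hypothetical assumptions for this example, not a format defined in the G7 document.

```python
import json

# Hypothetical SBOM-for-AI record organized around the seven clusters
# named in the guidance. All field names and values are invented for
# illustration; the document does not prescribe a serialization format.
sbom_for_ai = {
    "metadata": {                       # about the SBOM itself, not the AI system
        "sbom_version": "0.1",
        "created": "2025-05-12",
        "author": "example-producer",
    },
    "system_level_properties": {        # overall system: dependencies, data handling
        "frameworks": ["example-ml-framework"],
        "processes_user_data": True,
    },
    "models": [                         # identification, weight provenance, limitations
        {
            "name": "example-model",
            "weights_origin": "fine-tuned from a base model",
            "limitations": ["English-language inputs only"],
        }
    ],
    "dataset_properties": [             # identity and provenance of datasets
        {"name": "example-corpus", "provenance": "public web crawl"}
    ],
    "key_performance_indicators": {     # performance across lifecycle phases
        "accuracy": 0.91,
        "lifecycle_phase": "deployment",
    },
    "infrastructure": {                 # physical/virtual infrastructure; may reference an HBOM
        "hbom_ref": None,
        "accelerators": ["example-accelerator"],
    },
    "security_properties": {            # cybersecurity measures for models and system
        "weights_signed": True,
        "advisories_published": True,
    },
}

# Serialize the record, as a consumer-facing SBOM document might be exchanged.
print(json.dumps(sbom_for_ai, indent=2))
```

One record with one top-level key per cluster keeps producer and consumer views aligned: a consumer can check coverage simply by verifying that all seven clusters are present before inspecting their contents.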
Navigating Challenges: SBOMs and Their Limitations
While the SBOM clusters serve as an important resource, the document also acknowledges that they are not mandatory and are subject to further refinement. Allan Friedman, a leading figure in the Cybersecurity and Infrastructure Security Agency (CISA)’s SBOM initiatives from August 2021 to July 2025, expressed positive sentiments about the clusters. However, he also cautioned that many elements contained within them are challenging to quantify and may be difficult to define uniformly across different organizations.
The paper further clarifies that an SBOM for AI alone does not provide sufficient protection against cybersecurity threats throughout the supply chain. Instead, it advocates an integrated approach in which these SBOMs are coupled with cybersecurity tools such as vulnerability scanning and management tools, together with the dissemination of security advisories and bulletins. This holistic strategy promotes the development of adaptive, evolving security mechanisms aimed at emerging threats in the AI landscape.
Collaborative Development and Global Response
The guidance has garnered contributions from several nations, including Germany’s Federal Office for Information Security (BSI), Italy’s National Cybersecurity Agency (ACN), and France’s National Cybersecurity Agency (ANSSI), alongside agencies from Canada, the United States, the United Kingdom, and Japan, in partnership with the European Commission. This collective effort signifies a global recognition of the urgent need for standardized practices in AI cybersecurity as advancements in technology continue to accelerate.
In conclusion, the establishment of standardized SBOMs for AI represents a pivotal development in safeguarding the integrity of AI supply chains. By fostering transparency and ensuring that stakeholders have access to detailed information on AI systems, these frameworks are poised to enhance security and bolster the defenses against potential cyber threats, promoting a safer technological landscape for future AI implementations.
