FY2026 NDAA, Section 1513 ("Physical and cybersecurity procurement requirements for artificial intelligence systems")

Proposed 2025-03-14 | Enacted 2025-12-18 | Official source

Summary

Instructs the Secretary of Defense to develop a cybersecurity and physical security framework for Department of Defense AI and machine learning technologies. Requires that security requirements be tailored based on a consideration of costs versus benefits, and encourages collaboration with the private sector and academia.

  • This summary is awaiting validation (peer review by a second AGORA editor).

Key facts

🏛️ This document has been enacted by the United States Congress. For authoritative text and metadata, visit the official source.

🎯 This document primarily applies to the government, rather than the private sector.

📜 This document's name is National Defense Authorization Act for Fiscal Year 2026, Section 1513 ("Physical and cybersecurity procurement requirements for artificial intelligence systems"). AGORA also tracks this document under the name FY2026 NDAA, Section 1513 ("Physical and cybersecurity procurement requirements for artificial intelligence systems"). It is part of FY2026 NDAA.

↳ This document is part of a longer one: FY2026 NDAA. Some AGORA documents are "split off" from longer documents that mix AI and non-AI content, such as omnibus authorization or appropriations laws in the United States Congress.

Themes

AI risks, applications, governance strategies, and other themes addressed in AGORA documents.
  • Thematic tags for this document are awaiting validation (peer review by a second AGORA editor).

Full text

  • This is an unofficial copy. The document has been archived and reformatted in plaintext for AGORA. Footnotes, tables, and similar material may be omitted. For the official text, visit the original source.
SEC. 1513. PHYSICAL AND CYBERSECURITY PROCUREMENT REQUIREMENTS FOR ARTIFICIAL INTELLIGENCE SYSTEMS.

(a) Security Framework.--
  (1) In general.--The Secretary of Defense shall develop a framework for the implementation of cybersecurity and physical security standards and best practices relating to covered artificial intelligence and machine learning technologies to mitigate risks to the Department of Defense from the use of such technologies.
  (2) Coverage of relevant aspects of security.--The framework developed under paragraph (1) shall cover all relevant aspects of the security of artificial intelligence and machine learning systems of the Department of Defense, including the following:
    (A) Risk posed to and by the workforce of the Department of Defense, including insider threat risks.
    (B) Training and workforce development requirements, including with respect to the following:
      (i) Artificial intelligence security awareness.
      (ii) Artificial intelligence-specific threats and vulnerabilities.
      (iii) Development of a continuum of professional development and education of artificial intelligence security expertise.
    (C) Risks to the supply chains of such systems, including counterfeit parts or data poisoning risks.
    (D) Risks relating to adversarial tampering with artificial intelligence systems.
    (E) Risks relating to the unintended exposure or theft of artificial intelligence systems or data.
    (F) Security posture management practices, including governance of security measures, continuous monitoring, and incident reporting procedures.
    (G) An evaluation of commercially available platforms for continuous monitoring and assessment of such systems.
  (3) Risk-based framework.--The framework developed under paragraph (1) shall be risk-based, including security that is proportional to the national security or foreign policy risks posed by the covered artificial intelligence and machine learning technology being stolen or tampered with.
  (4) Use of existing frameworks.--To the maximum extent feasible, the framework developed under paragraph (1) shall--
    (A) draw on existing cybersecurity reference documents, including the NIST Special Publication 800 series; and
    (B) be implemented as an extension or augmentation of existing cybersecurity frameworks developed by the Department of Defense, including the Cybersecurity Maturity Model Certification framework.
  (5) Addressing extreme security risks.--
    (A) Highly capable cyber threat actors.--The framework developed under paragraph (1) shall prioritize the most highly capable artificial intelligence systems that may be of highest interest to cyber threat actors, based on risk assessments and threat reporting.
    (B) Security levels.--The Secretary shall ensure that the framework developed under paragraph (1) imposes requirements for security on contractors that are designed to mitigate the cybersecurity risks posed by the cyber threat actors described in subparagraph (A), with the most stringent security requirements under such frameworks providing protection that is similar to the protection offered by national security systems (as defined in section 3552(b)(6) of title 44, United States Code).
    (C) General design with specific components.--To the extent feasible, any additional security requirements developed pursuant to subparagraph (B) shall be designed generally for all software systems of the Department of Defense, but may contain components designed specifically for highly capable artificial intelligence systems.
(b) Security Requirements.--
  (1) In general.--The Secretary of Defense shall amend the Defense Federal Acquisition Regulation Supplement, or take other similar action, to require covered entities to implement the best practices described in subsection (a) under the framework developed under such subsection.
  (2) Risk-based rules.--Any requirements implemented pursuant to paragraph (1) shall, to the extent practicable, be narrowly tailored to the specific covered artificial intelligence and machine learning technologies developed, deployed, stored, or hosted by a covered entity, and shall be calibrated according to the different tasks involved in the development, deployment, storage, or hosting of components of such covered artificial intelligence and machine learning technologies.
  (3) Cost-benefit consideration.--
    (A) In general.--In carrying out paragraph (1), the Secretary of Defense shall--
      (i) consider the costs and benefits, to the Department of Defense and to the national security and technological leadership of the United States, of imposing security requirements on covered entities; and
      (ii) to the extent feasible, design the requirements implemented pursuant to such paragraph to allow for trade space analysis by the Department in a transparent manner between competing requirements in order to minimize the costs and maximize the benefits of such requirements.
    (B) Weighing costs of slowing down development.--In carrying out subparagraph (A), the Secretary shall weigh the costs of slowing the development and deployment of artificial intelligence and machine learning against the benefits of mitigating national security risks and potential security risks to the Department of Defense from using commercial software, when imposing additional physical or cybersecurity requirements for such systems.
(c) Private Sector Collaboration.--In carrying out the requirements of subsection (a), the Secretary of Defense shall seek to collaborate with industry and academia in the development of the framework under such subsection, using a process for consultation that uses a new or existing mechanism for public-private partnerships.

(d) Implementation Plan.--The framework required by subsection (a)(1) shall include a detailed plan for the implementation of the framework that--
  (1) establishes timelines and milestones for achieving the objectives outlined in the framework;
  (2) identifies resource requirements and funding mechanisms; and
  (3) provides metrics for measuring progress and effectiveness.

(e) Reporting Requirements.--Not later than 180 days after the date of the enactment of this Act, the Secretary shall submit to the congressional defense committees an update on the status of implementation of the requirements of this section.
(f) Definitions.--In this section:
  (1) The term ``artificial intelligence'' has the meaning given such term in section 238(g) of the John S. McCain National Defense Authorization Act for Fiscal Year 2019 (Public Law 115-232; 10 U.S.C. 4061 note prec.).
  (2) The term ``covered artificial intelligence and machine learning technology'' means an artificial intelligence or machine learning system acquired by the Department of Defense or an element of the Department and all associated components involved in the development and deployment lifecycle of such system, including source code, numerical parameters (including model weights) of the trained artificial intelligence or machine learning system, details of any methods and algorithms used to develop such system, data used in the development of such system, and software used for evaluating the trustworthiness of the artificial intelligence or machine learning system during development or deployment.
  (3) The term ``covered entity'' means an entity that enters into a contract or other agreement with the Department of Defense under which such entity engages in the development, deployment, storage, or hosting of one or more covered artificial intelligence and machine learning technologies.