This update highlights key mid-year legislative and regulatory developments related to artificial intelligence (“AI”), connected and automated vehicles (“CAVs”), the Internet of Things (“IoT”), and cryptocurrency and blockchain, and builds on our first quarter update covering these topics.

I. Federal AI Legislative Developments

    In the first session of the 119th Congress, lawmakers rejected a proposed moratorium on state and local enforcement of AI laws and advanced several AI legislative proposals focused on deepfake-related harms.  Specifically, on July 1, after weeks of negotiations, the Senate voted 99-1 to strike a proposed 10-year moratorium on state and local enforcement of AI laws from the budget reconciliation package, the One Big Beautiful Bill Act (H.R. 1), which President Trump signed into law.  The vote to strike the moratorium follows the collapse of an agreement on revised language that would have shortened the moratorium to 5 years and allowed states to enforce “generally applicable laws,” including child online safety, digital replica, and CSAM laws, that do not have an “undue or disproportionate effect” on AI.  Congress could technically still consider the moratorium during this session, but the chances of that happening are low based on both the political atmosphere and the lack of a must-pass legislative vehicle in which it could be included.  See our blog post on this topic for more information.

    Additionally, lawmakers continue to focus legislation on deepfakes and intimate imagery.  For example, on May 19, President Trump signed the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks (“TAKE IT DOWN”) Act (H.R. 633 / S. 146) into law, which requires online platforms to establish a notice and takedown process for nonconsensual intimate visual depictions, including certain depictions created using AI.  See our blog post on this topic for more information.  Meanwhile, members of Congress continued to pursue additional legislation to address deepfake-related harms, such as the STOP CSAM Act of 2025 (S. 1829 / H.R. 3921) and the Disrupt Explicit Forged Images And Non-Consensual Edits (“DEFIANCE”) Act (H.R. 3562 / S. 1837).

    II. Federal AI Regulatory Developments

      The Trump Administration took significant steps to advance its AI policy agenda in recent months through the release of its AI Action Plan and several Executive Orders.  On July 23, the White House released its 28-page AI Action Plan, titled “Winning the Race: America’s AI Action Plan,” with 103 specific AI policy recommendations for “near-term execution by the Federal government.”  The AI Action Plan is organized around three pillars: (1) accelerating AI innovation, (2) building American AI infrastructure, and (3) leading in international AI diplomacy and security.  On July 23, President Trump also signed three Executive Orders – Executive Order 14318 on “Accelerating Federal Permitting of Data Center Infrastructure,” Executive Order 14319 on “Preventing Woke AI in the Federal Government,” and Executive Order 14320 on “Promoting the Export of the American AI Technology Stack” – to implement the AI Action Plan’s key priorities.  See our blog post for more information on these developments.  Additionally, on June 6, President Trump signed Executive Order 14306 on “Sustaining Select Efforts To Strengthen the Nation’s Cybersecurity and Amending Executive Order 13694 and Executive Order 14144,” which, according to the White House, “refocuses artificial intelligence (AI) cybersecurity efforts towards identifying and managing vulnerabilities, rather than censorship.”

      Other parts of the Executive Branch have also taken notable steps to address the development and use of AI by the government and industry.  For example, in April, the White House Office of Management and Budget issued two memoranda that outline, respectively, AI governance and risk management requirements for federal agencies’ use of AI and requirements for federal procurement of AI.  Further, on June 3, the Department of Commerce announced a plan to “reform” the Biden-era U.S. AI Safety Institute and create the “Center for AI Standards and Innovation” (“CAISI”).  According to the Department’s press release, the rebranded CAISI will “serve as industry’s primary point of contact within the U.S. Government to facilitate testing and collaborative research” on commercial AI and will “represent U.S. interests internationally.”

      III. State AI Legislative Developments

        State lawmakers introduced hundreds of AI bills in the first half of 2025, and at least 61 new state AI laws were enacted across 28 states.  Key themes include the following:

        • Comprehensive Consumer Protection Frameworks:  In Texas, the legislature passed, and Governor Greg Abbott (R) signed, the Texas Responsible AI Governance Act (“TRAIGA”) (HB 149), which, among other requirements, will prohibit the development or deployment of AI systems with the “intent” or “sole intent” to engage in certain activities, such as inciting self-harm.  We discuss TRAIGA in this blog post.
        • Frontier Model Public Safety:  The New York legislature passed the Responsible AI Safety & Education (“RAISE”) Act (S6953), a frontier model public safety bill that would establish reporting, disclosure, and risk management requirements for “large developers” of frontier AI models.  Additionally, on June 17, the Joint California Policy Working Group on AI Frontier Models issued its final report on frontier AI policy, following public feedback on the draft version of the report released in March.  More on the final report can be found in this blog post.
        • Chatbot and Generative AI:  Several states enacted laws regulating the development or deployment of AI-powered chatbots and generative AI systems.  For example, Maine’s governor signed LD 1727, prohibiting the use of AI chatbots to “engage in trade and commerce with a consumer” in a manner that may mislead or deceive consumers into believing they are engaging with a human unless the use of AI is disclosed.
        • Synthetic Content and Content Moderation:  State legislatures passed over 30 new laws to regulate the creation, distribution, or use of synthetic or AI-generated content, including, for example, laws prohibiting the creation or distribution of AI-generated CSAM signed in Arkansas, Colorado, and Texas.  Various states, including Connecticut, Tennessee, and North Dakota, also enacted laws prohibiting the nonconsensual distribution of AI-generated intimate imagery.  The Governor of Texas also signed several bills into law imposing age verification, takedown, and consent requirements on online platforms and websites that host prohibited AI-generated content or AI tools for generating such content.
        • AI in Healthcare:  Illinois, Nevada, Oregon, and Texas enacted legislation to regulate the use of AI in healthcare settings.  For example, the Governor of Illinois signed the Wellness and Oversight for Psychological Resources Act (HB 1806) into law, which prohibits licensed therapists from using AI to make independent therapeutic decisions, directly interact with clients for therapeutic communication, generate therapeutic recommendations or treatment plans without licensed professional review and approval, or detect emotions or mental states. 

        IV. Connected & Automated Vehicles

        Federal regulators made several significant announcements related to CAVs in recent months.  For example, on April 24, Secretary of Transportation Sean Duffy announced the National Highway Traffic Safety Administration’s (“NHTSA”) new Automated Vehicle (“AV”) Framework, with the goal of enabling the growth of the AV industry and removing government barriers to innovation.  The first actions under the framework are intended to modernize the Federal Motor Vehicle Safety Standards (“FMVSS”) to enable commercial deployment of AVs.  Relatedly, Secretary Duffy announced that NHTSA will update and streamline its exemption process under Part 555 of NHTSA’s vehicle safety regulations, with the goal of accelerating AV deployment.  Part 555 exemptions allow manufacturers to sell vehicles that do not fully comply with the FMVSS, such as vehicles without traditional steering wheels.  However, because the Part 555 exemption process has historically been challenging and time-intensive for AVs, a streamlined exemption process is expected to promote AV development and adoption.  NHTSA granted its second-ever exemption after announcing the reforms to the Part 555 process.

        V. Internet of Things

        Federal agencies updated guidance related to the Internet of Things (“IoT”) in the first half of 2025.  In May 2025, the National Institute of Standards and Technology (“NIST”) announced the first revision to NIST IR 8259, Foundational Cybersecurity Activities for IoT Product Manufacturers, which proposes guidance on, among other topics, data management across AI components.  Additionally, in June, NIST released an essay describing its proposed updates to NIST SP 800-213, IoT Device Cybersecurity Guidance for the Federal Government: Establishing IoT Device Cybersecurity Requirements, which would address cybersecurity risks associated with IoT product adoption and integration, among other updates.

        VI. Cryptocurrency & Blockchain

        U.S. lawmakers and regulators continued to reshape the cryptocurrency landscape through various legislative and regulatory efforts, underscoring bipartisan momentum toward a robust and durable U.S. framework for digital assets.  Notably, President Trump signed into law the Guiding and Establishing National Innovation for U.S. Stablecoins (“GENIUS”) Act (S. 1582), the first federal digital asset law enacted in the United States.  The Act imposes federal prudential and customer protection standards on stablecoin issuers, authorizes and integrates state regulatory regimes for certain issuers within the federal regulatory framework, and grants regulatory agencies authority over certain aspects of issuer activity.  Congress also continued to advance other cryptocurrency legislation: the House passed two additional digital asset bills, the Digital Asset Market Clarity Act (“CLARITY Act”) (H.R. 3633) and the Anti-CBDC Surveillance State Act (H.R. 1919).

        Federal regulatory developments complemented Congress’s efforts to promote cryptocurrency adoption.  For example, the Department of Justice disbanded its National Cryptocurrency Enforcement Team, and President Trump signed a Congressional Review Act repeal of the Internal Revenue Service’s decentralized finance (“DeFi”) “broker rule.”  Further, the Federal Deposit Insurance Corporation (“FDIC”) replaced prior guidance on crypto-related activities with FIL‑7‑2025, allowing FDIC‑insured banks to engage in a broad array of crypto‑related activities without prior FDIC approval, provided they adequately manage the associated risks.  The Securities and Exchange Commission (“SEC”) continued to reassess its approach to crypto, including through public roundtables held by its Crypto Task Force.  In April, the SEC’s Division of Corporation Finance issued a staff statement clarifying that “covered stablecoins” (stablecoins designed to maintain a one-to-one value with the U.S. dollar, fully backed by low-risk reserves, and redeemable on demand) are not securities under the federal securities laws.  Additionally, the Federal Reserve announced that it was discontinuing the supervisory program that it had created specifically for crypto activities conducted by banks.

        We will continue to update you on meaningful developments in these quarterly updates and across our blogs.

        Jennifer Johnson

        Jennifer Johnson is co-chair of the firm’s Communications & Media Practice Group.  She represents and advises broadcast licensees, trade associations, and other media entities on a wide range of issues, including:  regulatory and policy advocacy; network affiliation and other programming agreements; media joint ventures, mergers and acquisitions; carriage negotiations with cable, satellite and telco companies; media ownership and attribution; and other strategic, regulatory and transactional matters.

        Ms. Johnson assists clients in developing and pursuing strategic business and policy objectives before the Federal Communications Commission and Congress and through transactions and other business arrangements.  Her broadcast clients draw particular benefit from her deep experience and knowledge with respect to network/affiliate issues, retransmission consent arrangements, and other policy and business issues facing the industry.  Ms. Johnson also assists investment clients in structuring, evaluating and pursuing potential media investments.  She has been recognized by Best Lawyers, Chambers USA, Legal 500 USA, Washington DC Super Lawyers, and the Washingtonian as a leading lawyer in her field.

        Nicholas Xenakis

        Nick Xenakis draws on his Capitol Hill experience to provide regulatory and legislative advice to clients in a range of industries, including technology. He has particular expertise in matters involving the Judiciary Committees, such as intellectual property, antitrust, national security, immigration, and criminal justice.

        Nick joined the firm’s Public Policy practice after serving most recently as Chief Counsel for Senator Dianne Feinstein (D-CA) and Staff Director of the Senate Judiciary Committee’s Human Rights and the Law Subcommittee, where he was responsible for managing the subcommittee and Senator Feinstein’s Judiciary staff. He also advised the Senator on all nominations, legislation, and oversight matters before the committee.

        Previously, Nick was the General Counsel for the Senate Judiciary Committee, where he managed committee staff and directed legislative and policy efforts on all issues in the Committee’s jurisdiction. He also participated in key judicial and Cabinet confirmations, including of an Attorney General and two Supreme Court Justices. Nick was also responsible for managing a broad range of committee equities in larger legislation, including appropriations, COVID-relief packages, and the National Defense Authorization Act.

        Before his time on Capitol Hill, Nick served as an attorney with the Federal Public Defender’s Office for the Eastern District of Virginia. There he represented indigent clients charged with misdemeanor, felony, and capital offenses in federal court throughout all stages of litigation, including trial and appeal. He also coordinated district-wide habeas litigation following the Supreme Court’s decision in Johnson v. United States (invalidating the residual clause of the Armed Career Criminal Act).

        Mike Nonaka

        Michael Nonaka is a partner in the firm’s Financial Institutions practice group. He represents banks and other financial institutions on a wide variety of bank regulatory, enforcement, legislative and policy issues.  Mr. Nonaka also is co-chair of the firm’s Fintech Initiative and works with a number of banks, lending companies, money transmitters, payments firms, technology companies, and service providers on innovative technologies such as big data, blockchain and related technologies, bitcoin and other virtual currencies, same day payments, and online lending.

        Jayne Ponder

        Jayne Ponder is an associate in the firm’s Washington, DC office and a member of the Data Privacy and Cybersecurity Practice Group. Jayne’s practice focuses on a broad range of privacy, data security, and technology issues. She provides ongoing privacy and data protection counsel to companies, including on topics related to privacy policies and data practices, the California Consumer Privacy Act, and cyber and data security incident response and preparedness.

        August Gweon

        August Gweon counsels national and multinational companies on data privacy, cybersecurity, antitrust, and technology policy issues, including issues related to artificial intelligence and other emerging technologies. August leverages his experiences in AI and technology policy to help clients understand complex technology developments, risks, and policy trends.

        August regularly provides advice to clients for complying with federal, state, and global privacy and competition frameworks and AI regulations. He also assists clients in investigating compliance issues, preparing for federal and state privacy regulations like the California Privacy Rights Act, responding to government inquiries and investigations, and engaging in public policy discussions and rulemaking processes.

        Conor Kane

        Conor Kane advises clients on a broad range of privacy, artificial intelligence, telecommunications, and emerging technology matters. He assists clients with complying with state privacy laws, developing AI governance structures, and engaging with the Federal Communications Commission.

        Before joining Covington, Conor worked in digital advertising helping teams develop large consumer data collection and analytics platforms. He uses this experience to advise clients on matters related to digital advertising and advertising technology.

        Max Larson

        Max Larson is an associate in the firm’s Washington, DC office. She is a member of the Technology and Communications Regulation Practice Group.

        McCall Wells

        McCall Wells is an associate in the firm’s San Francisco office. Her practice focuses on matters related to technology transactions and technology regulation.

        McCall also maintains an active pro bono practice, with a particular focus on immigration law. She also advises nonprofit companies on corporate governance and IP concerns.

        McCall earned her J.D. from the Georgetown University Law Center, where she was a Global Law Scholar and student attorney in the Communications & Technology Law Clinic. Prior to joining the firm, McCall was a fellow at a trade association focused on developing responsible regulation for digital assets. She has experience advocating on behalf of technology companies before state and federal agencies.

        Grace Howard

        Grace Howard is an associate in the firm’s Washington, DC office. She represents and advises clients on a range of cybersecurity, data privacy, and government contracts issues including cyber and data security incident response and preparedness, regulatory compliance, and internal investigations including matters involving allegations of noncompliance with U.S. government cybersecurity regulations and fraud under the False Claims Act.

        Prior to joining the firm, Grace served in the United States Navy as a Surface Warfare Officer and currently serves in the U.S. Navy Reserve.