Editor’s Note: Europe’s “Digital Omnibus” signals a shift in how compliance may be operationalized—reducing some external process requirements while increasing the need for internal, evidence-ready governance. In their Joint Opinion adopted 10 February 2026, the EDPB and EDPS support the Commission’s aim to cut administrative burden, but they highlight issues that will matter immediately to cybersecurity, privacy, regulatory compliance, and eDiscovery teams: a potentially narrower interpretation of “personal data,” emerging expectations around automated consent signals, and Omnibus-linked adjustments that could influence AI Act timelines and accountability.
For practitioners, the key issue isn’t “simplification,” but defensibility. If identifiability becomes more controller-specific, datasets treated as anonymous or obfuscated at collection could move in and out of scope across the data lifecycle—complicating legal holds, transfer assessments, and downstream production decisions. Meanwhile, standardized breach and DPIA templates, along with any AI-related streamlining, are likely to increase scrutiny of whether your documentation, inventories, LIAs, and preservation practices can withstand supervisory review or litigation—especially where training-data provenance and bias-testing safeguards are in play.
Industry News – Data Privacy and Protection Beat
EDPB and EDPS Weigh In on the Digital Omnibus: Personal Data, Breach Reporting, and AI Governance
ComplexDiscovery Staff
The European Union’s move to modernize its digital legal framework is currently centered on the Digital Omnibus, a legislative package aimed at reducing administrative burdens and enhancing the continent’s economic competitiveness. In early 2026, the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) issued a joint opinion evaluating these proposals. While the regulators support the objective of streamlining compliance, they have identified several areas where the proposed changes could conflict with existing data protection standards, particularly regarding the definition of personal data and the transparency of automated systems.
Redefining the Boundaries of Personal Data
A central point of concern for information governance and eDiscovery professionals is the proposed reinterpretation of “personal data.” The Commission suggests moving toward an entity‑specific test of identifiability, focusing on whether a given controller has means “reasonably likely” to identify an individual, and presents this as a clarification of existing Court of Justice of the European Union (CJEU) case law. In their joint opinion, however, the EDPB and EDPS argue that the proposal does not simply codify CJEU jurisprudence but would significantly narrow the scope of EU data protection law by redefining what constitutes personal data.
For eDiscovery teams, this shift could complicate the execution of legal holds and cross‑border data transfers. If certain datasets—such as anonymized telemetry or obfuscated log files—are reclassified as non‑personal for a given controller at the point of collection, they might bypass standard privacy‑shielding protocols during the early stages of a matter. Practitioners should revisit their cross‑border transfer impact assessments and data‑mapping assumptions to ensure that data categorized as “anonymous” under any new rules is still handled to the stricter GDPR standard in international litigation if identification becomes possible later in the data lifecycle.
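The lifecycle concern described above can be operationalized as a simple re-classification check in a data map: a dataset’s “personal data” status is evaluated against the linkage means a specific controller actually holds, and re-evaluated whenever those means change. The sketch below is illustrative only; all names (`Dataset`, `is_personal_for_controller`, the key labels) are hypothetical and not drawn from any regulation or tool.

```python
from dataclasses import dataclass, field

@dataclass
class Dataset:
    name: str
    # Linkage keys present in the dataset (e.g., hashed device IDs, IP addresses)
    linkage_keys: set = field(default_factory=set)

def is_personal_for_controller(dataset: Dataset, controller_means: set) -> bool:
    """Treat the dataset as personal data for this controller if the controller
    holds means 'reasonably likely' to re-link any of its keys to an individual."""
    return bool(dataset.linkage_keys & controller_means)

# A telemetry set keyed only by hashed device IDs looks non-personal at collection...
telemetry = Dataset("telemetry", {"device_id_hash"})
print(is_personal_for_controller(telemetry, {"crm_email_index"}))  # no overlap

# ...but moves back into scope once the controller acquires a matching lookup table.
print(is_personal_for_controller(telemetry, {"crm_email_index", "device_id_hash"}))
```

The point of the sketch is the re-run: under an entity-specific test, the same dataset can flip classification mid-matter, so the check belongs in ongoing data-map maintenance, not only at intake.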
Streamlining Incident Response and Reporting
The Omnibus also proposes adjustments to the procedural aspects of cybersecurity management. The EDPB and EDPS are in favor of increasing the threshold of risk that triggers an obligation to notify a data breach to supervisory authorities and extending the deadline for such notifications, noting that this can reduce administrative burden without undermining protection for individuals. They also welcome the introduction of common EU‑level templates and lists for data breaches and data protection impact assessments as a positive step toward harmonization.
For cybersecurity incident response teams, the direction of travel points toward greater standardization in how breaches are documented and reported. While the precise legal form of the templates remains under negotiation, organizations should anticipate the need to align forensic logging and reporting so that the “artifacts of discovery”—the logs, headers, and metadata collected during investigations—can be efficiently mapped to these common formats across jurisdictions. Updating internal incident response playbooks to reflect this anticipated structure can help ensure that any extended reporting window does not lead to a degradation in the quality or admissibility of evidence preserved for regulatory or legal scrutiny.
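One way to prepare for common EU-level reporting formats before their final shape is known is to maintain an explicit mapping layer between internal forensic log fields and the anticipated template fields, so that gaps surface early rather than during a live notification window. The field names below are purely hypothetical placeholders, since the actual templates remain under negotiation.

```python
# Hypothetical mapping from internal forensic log fields to an assumed
# EU-level common breach-report template; all field names are illustrative.
TEMPLATE_MAP = {
    "detected_at": "incident.detection_timestamp",
    "affected_records": "impact.data_subject_count",
    "data_categories": "impact.data_categories",
    "containment": "response.containment_measures",
}

def to_common_template(internal_record: dict) -> dict:
    """Project an internal incident record onto the common-template keys and
    flag any fields the template expects but the forensic logs did not capture."""
    report = {tpl: internal_record.get(src) for src, tpl in TEMPLATE_MAP.items()}
    report["_missing"] = [tpl for tpl, value in report.items() if value is None]
    return report

incident = {"detected_at": "2026-02-10T08:00Z", "affected_records": 1200}
print(to_common_template(incident)["_missing"])
```

Running this against historical incident records is a cheap way to test whether current logging practice would populate a standardized report without manual reconstruction.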
Information Governance in the Age of AI Training
Regarding information governance and artificial intelligence, the proposal introduces a specific “legitimate interest” provision for training AI models, intended to recognize AI development more explicitly as a legitimate interest basis under Article 6(1)(f) GDPR. The EDPB and EDPS acknowledge that AI development can constitute a legitimate interest but question the necessity of adding an AI‑specific clause, pointing out that their existing guidance already confirms that AI training may be pursued on this basis, subject to the three‑step test.
For governance teams, this means that even if the Omnibus is adopted, the fundamental requirement for a rigorous legitimate interest assessment (LIA) remains unchanged. Frameworks should be updated to ensure that the provenance of training data is auditable and that opt‑out or objection mechanisms are operational, particularly in scenarios involving large‑scale web scraping or mixed‑use datasets. This documentation will be critical for responding to AI‑related access and transparency requests under the AI Act and the GDPR, especially when automated decision‑making significantly affects individuals.
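Making training-data provenance “auditable” in practice usually means capturing, per source, where the data came from, under what terms, which LIA covers it, and a content fingerprint that can later prove what was actually ingested. A minimal sketch of such a record, with entirely illustrative field names:

```python
import datetime
import hashlib

def provenance_record(source_url: str, license_note: str, lia_ref: str,
                      sample: bytes) -> dict:
    """Build an auditable provenance entry for one training-data source.
    Field names are illustrative, not taken from any regulation or standard."""
    return {
        "source": source_url,
        "license": license_note,
        "lia_reference": lia_ref,  # pointer to the legitimate interest assessment
        "content_digest": hashlib.sha256(sample).hexdigest(),
        "collected_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    }

record = provenance_record(
    "https://example.com/corpus", "CC-BY-4.0", "LIA-2026-014", b"sample text"
)
print(record["content_digest"][:16])
```

The digest and the LIA pointer are the load-bearing parts: together they let a team answer an access or transparency request with evidence rather than recollection.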
Accountability for Low‑Risk AI Systems
The registration and documentation of AI systems remain areas of evolving regulatory focus under the AI Act and its proposed amendments via the Digital Omnibus. Discussions around the Omnibus include proposals to streamline obligations for AI systems that are not classified as high‑risk, potentially reducing external registration or notification burdens for certain categories of tools.
From a governance perspective, this trend underscores the need for robust internal oversight of “shadow AI.” If formal registration requirements are relaxed for lower‑risk systems, the internal burden of proof regarding risk classification and lifecycle controls will increase. Maintaining a comprehensive internal inventory of all AI assets—regardless of perceived risk—will be essential to defending classification decisions in the event of supervisory inquiries or litigation.
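An internal AI inventory of this kind does not need to be elaborate to be defensible; what matters is that every asset carries a risk classification and a recorded rationale, since the rationale is what a supervisory inquiry will probe. A minimal sketch (class and field names are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class AIAsset:
    name: str
    owner: str
    risk_class: str  # e.g. "high", "limited", "minimal"
    rationale: str   # why this classification was chosen

def unclassified(assets: list) -> list:
    """Surface assets whose risk classification lacks a recorded rationale --
    the gap most likely to undermine a classification decision under review."""
    return [asset.name for asset in assets if not asset.rationale.strip()]

inventory = [
    AIAsset("support-chatbot", "customer-ops", "limited",
            "no automated decisions with legal or similar effect"),
    AIAsset("resume-screener", "hr", "minimal", ""),  # rationale never recorded
]
print(unclassified(inventory))
```

Run as a recurring check, this turns “maintain a comprehensive inventory” from a policy statement into a measurable control.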
Data Processing for Bias Detection
Finally, the Omnibus addresses the processing of special categories of personal data—such as health or ethnic information—for the purpose of bias detection and correction in AI models. The regulators recognize that undetected bias in AI systems can pose serious risks, but stress that processing special‑category data remains, in principle, prohibited, and that any exception for bias monitoring must be narrowly circumscribed, strictly necessary, and subject to enhanced safeguards.
For eDiscovery and compliance professionals conducting internal bias audits, this creates a narrow but strategically important legal pathway. Practitioners should ensure that any use of sensitive data for bias testing is siloed from general processing, supported by clear legal justification, and that the data is purged once corrective measures are validated. This purpose‑limited processing demands strict access controls, minimization measures, and a detailed audit trail to prevent sensitive data from being repurposed for unauthorized secondary uses, which could attract heightened regulatory scrutiny and sanctions.
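The siloing, purging, and audit-trail requirements described above can be modeled as a purpose-bound container whose every ingest and purge is logged. The sketch below is illustrative only; real controls live in access management and retention tooling, not in application code, and all names here are hypothetical.

```python
import datetime

class BiasAuditSilo:
    """Purpose-limited holder for special-category data used in bias testing.
    Illustrative sketch: logs every action and supports a verifiable purge."""

    def __init__(self, purpose: str):
        self.purpose = purpose
        self._records = []
        self.audit_log = []

    def _log(self, action: str) -> None:
        self.audit_log.append(
            (datetime.datetime.now(datetime.timezone.utc).isoformat(), action)
        )

    def ingest(self, record: dict) -> None:
        self._records.append(record)
        self._log(f"ingest:{self.purpose}")

    def purge(self) -> int:
        """Delete all records once corrective measures are validated;
        return the count so the deletion itself is evidenced."""
        count = len(self._records)
        self._records.clear()
        self._log(f"purge:{count}")
        return count

silo = BiasAuditSilo("bias-detection")
silo.ingest({"record_id": 1})
print(silo.purge())
```

The design choice worth noting is that the purge returns a count and writes to the same audit log as ingestion, so the lifecycle of the sensitive data is reconstructable end to end.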
Timelines, AI Act Deferrals, and Compliance Burden
As the legislative process enters trilogue negotiations, the focus remains on balancing the pace of digital innovation with the integrity of the European privacy and fundamental‑rights model. Discussions around the Digital Omnibus include proposals to link the application of certain high‑risk AI obligations under the AI Act to the availability of harmonized standards, which could, in practice, push some compliance deadlines beyond 2026, with possible long‑stop dates in the 2027–2028 timeframe still under negotiation. The Commission is therefore pushing for timely adoption so that these adjustments can take legal effect before existing deadlines trigger a concentrated compliance crunch.
For those managing data and AI at scale, the primary takeaway is that “simplification” often shifts the locus of compliance from external reporting toward internal documentation and governance. Even where notification thresholds are raised, or registration obligations are relaxed, organizations can expect greater emphasis on demonstrable accountability—through records of processing, data and model inventories, legitimate interest assessments, and evidence that technical and organizational measures are calibrated to the evolving EU standard.
News Sources
- Digital Omnibus: EDPB and EDPS support simplification and competitiveness while raising key concerns (EDPB)
- EU Regulators Issue Opinion on Revisions of GDPR and Other Data Laws (Inside Privacy)
- EU Privacy Watchdogs Pan Digital Omnibus (BankInfoSecurity)
- EDPB, EDPS detail concerns over personal data definition in joint opinion (IAPP)
- The 2025 European Commission EU digital omnibus package (Kennedys)
Assisted by GAI and LLM Technologies
Additional Reading
- EU’s Preliminary DSA Findings Put TikTok’s Engagement Design in the Regulatory Crosshairs
- Market Reaction or Overreaction? Anthropic’s Legal Plugin and the Facts So Far
- Moltbook and the Rise of AI-Agent Networks: An Enterprise Governance Wake-Up Call
- From One-Eyed Kings to Collective Sight in Enterprise AI
Source: ComplexDiscovery OÜ

ComplexDiscovery’s mission is to enable clarity for complex decisions by providing independent, data‑driven reporting, research, and commentary that make digital risk, legal technology, and regulatory change more legible for practitioners, policymakers, and business leaders.