On February 27, 2025, the Court of Justice of the European Union (“CJEU”) issued a significant decision on the right of data subjects to request access to their personal data under Article 15 GDPR, specifically as it relates to automated decision-making and the appropriate balance between informing data subjects and protecting trade secrets (Case C‑203/22).
Background
The case arose when an individual (“CK”) was denied a mobile phone contract because the mobile network operator determined that they did not have sufficient financial creditworthiness. In evaluating CK’s creditworthiness, the mobile network operator relied upon an automated credit assessment provided to it by Dun & Bradstreet Austria GmbH (“D&B”). After CK complained to the Austrian data protection authority, D&B was ordered to disclose to CK “meaningful information about the logic involved” in the automated decision under Article 15(1)(h) GDPR. D&B responded to the request, but withheld certain information on the basis that it was protected as a trade secret, and as such, disclosure was not required. D&B based this response on a provision in Austrian law that excluded the data subject’s right of access where access would compromise a business or trade secret of a third party.
CK challenged this refusal, arguing that D&B had failed to provide the information that it was required to disclose under Article 15 GDPR. The case reached the Vienna Administrative Court, which made a reference to the CJEU, requesting clarification on a series of questions regarding the extent of a data subject’s right of access under the GDPR in the context of automated decision-making, as well as the circumstances in which a controller can limit its response to an access request where disclosure would reveal trade secrets.
Key Findings of the CJEU
Scope of “Meaningful Information” Under Article 15(1)(h) GDPR
The CJEU ruled that in cases involving automated decision-making (including profiling), data controllers must provide concise, transparent, intelligible, and easily accessible explanations of “the procedure and principles actually applied”, by automated means, to the data subject’s personal data to arrive at a specific result (e.g., as in this case, the generation of a credit profile of the data subject). The CJEU stated that the description must allow the data subject to understand which of his or her personal data were used in the automated decision-making, but did not provide further detail on how controllers should describe “the procedure and principles”.
The CJEU held that any explanations provided to data subjects must clearly outline which personal data was used by the controller and how it was used. In cases involving profiling, it may suffice for a controller to explain how the result would have differed had different types of personal data been taken into account as part of the automated decision-making process.
The CJEU also emphasized that data controllers cannot satisfy Article 15(1)(h) GDPR through the “mere communication of a complex mathematical formula, such as an algorithm, or by the detailed description of all the steps in automated decision-making” to the data subject, as this would not “constitute a sufficiently concise and intelligible explanation”.
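To make the distinction concrete, the sketch below illustrates, in deliberately simplified form, the difference between disclosing a scoring formula itself and providing the kind of counterfactual, data-focused explanation the CJEU describes. Everything in it (the factors, thresholds, function names, and wording) is invented for illustration; it does not reflect D&B’s actual model or any language from the judgment.

```python
# Purely illustrative sketch. The factors, thresholds, and scoring rule below are
# invented for this post; they are not D&B's model and do not come from the judgment.
from dataclasses import dataclass, replace


@dataclass
class Applicant:
    # Hypothetical personal data used by the (invented) automated assessment.
    monthly_income_eur: float
    missed_payments_last_year: int


def automated_assessment(a: Applicant) -> bool:
    """Stand-in for a controller's scoring model. Disclosing this function itself
    would be the 'complex mathematical formula' the CJEU says is not sufficient."""
    return a.monthly_income_eur >= 2000 and a.missed_payments_last_year == 0


def meaningful_explanation(a: Applicant) -> str:
    """A counterfactual-style explanation of the kind the CJEU contemplates:
    it names the personal data that were used and shows how a variation in that
    data would have changed the result, without exposing the formula."""
    if automated_assessment(a):
        return "Result: positive, based on your declared income and payment history."
    if automated_assessment(replace(a, monthly_income_eur=2000.0)):
        return ("Result: negative. Your declared income and payment history were used; "
                "with a monthly income of EUR 2,000 or more, the result would have been positive.")
    return ("Result: negative. Your declared income and payment history were used; "
            "with no missed payments in the last year and a higher declared income, "
            "the result would have been positive.")


if __name__ == "__main__":
    print(meaningful_explanation(Applicant(monthly_income_eur=1500.0, missed_payments_last_year=0)))
```

The point of the contrast is that communicating the scoring function itself would be neither concise nor intelligible to most data subjects, whereas the explanation identifies the personal data used and how a variation in that data would have changed the outcome.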
Balancing Data Subject Rights and Trade Secrets
In addressing the tensions between a data subject’s rights and the commercial interests and proprietary rights of the controller, the CJEU noted that the right to the protection of personal data is not absolute, and must be proportionately balanced against other fundamental rights. This requirement to balance rights cannot, however, justify a blanket refusal by a controller to provide data subjects with access to meaningful information. Instead, the CJEU reasoned that “wherever possible”, controllers should favour any “means of communicating personal data to data subjects that do not infringe the rights or freedoms of others”.
Where a controller believes that the disclosure of information could violate trade secrets, it must submit the allegedly protected information to the competent supervisory authority or court for consideration. The competent authority or court must then balance the rights and interests at issue to determine the extent of the data subject’s rights of access under Article 15 GDPR.
Finally, the CJEU held that the GDPR precludes the creation of national rules, such as the Austrian provision relied upon by D&B in this case, that “exclude, as a rule” the right of access where it could compromise a business or trade secret. The CJEU held that Member States cannot prescribe the result of a requirement under EU law to balance rights and interests on a case-by-case basis.
Conclusion
Companies that process personal data to inform automated decision-making must ensure they can provide clear and meaningful explanations about decisions that have legal or similarly significant effects on data subjects. Ensuring the explainability of automated processes is a crucial governance obligation, which is especially pertinent given the increasing popularity of commercial AI applications. Organizations will not be able simply to invoke trade secrets to deny data subjects access to information; however, they are not required to disclose the algorithm itself to data subjects.
The implications of this decision are likely to extend beyond GDPR compliance. Indeed, the guidance provided by the CJEU offers valuable direction on the explainability requirements outlined in the AI Act. In particular, Article 86 AI Act establishes that deployers of certain AI systems must provide “clear and meaningful explanations of the role of the AI system in the decision-making procedure and the main elements of the decision taken,” and the decision is likely to be taken into account by EU and Member State courts interpreting Article 86 in the future. In any case, companies that deploy AI systems must comply with the GDPR whenever personal data is involved, which should include taking account of EDPB Opinion 28/2024 on the use of personal data for the development and deployment of AI models.
* * *
Covington’s Data Privacy and Cybersecurity Practice monitors CJEU cases closely and reports on relevant Court decisions and Advocate General opinions. If you have any questions about the interaction between data protection and local laws, we are happy to assist.
(This post was written with the assistance of Alberto Vogel).