Connecticut Enacts Significant Amendments to State’s Data Privacy Law

By David Stauss, Shelby Dolen & Marlaina Pinto on June 26, 2025

Keypoint: Connecticut once again moves the needle on state privacy laws while at the same time integrating changes from other state laws.

On June 25, Connecticut Governor Lamont signed Senator James Maroney’s SB 1295 into law. The bill makes several notable changes to Connecticut’s existing consumer data privacy law, including modifying its applicability standard, exemptions, definitions, consumer rights, data minimization provisions, and children’s privacy sections. The bill also significantly modifies the law’s approach to profiling, in ways that will impact the use of artificial intelligence in some contexts.

In the post below, we provide a summary of the more notable changes. For each change, we also provide context, including what the change means, its potential consequences, and how it fits into the larger landscape of state data privacy laws.

Background

The enactment of SB 1295 marks the second time Connecticut’s law has undergone significant revisions in the three years since it was first passed in 2022. In 2023, the law was amended to add consumer health and children’s privacy provisions. Of note, the relevant bill number changed on the final day of the session. Throughout the legislative cycle, we tracked SB 1356, which passed the Senate on May 16. However, SB 1356 required several changes after passing the Senate, and Connecticut bills are not typically amended once they pass a chamber. Therefore, the data privacy provisions were added to SB 1295, which passed the Senate and House on June 3.

Enforcement

Perhaps one of the most significant developments is not contained in the bill at all. As part of the budgeting process, Senator Maroney secured an increase in funding for the Attorney General’s office to hire more enforcers. 

Applicability

What Changed

The law’s applicability standard changes in three ways. First, the consumer threshold for applicability drops from 100,000 consumers to 35,000, or roughly 0.95% of Connecticut’s 3.675 million residents.
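
For those checking the math: 35,000 ÷ 3,675,000 ≈ 0.0095, or approximately 0.95% (using the 3.675 million population figure cited above).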

Second, the bill deletes the applicability threshold for persons that control or process the personal data of at least 25,000 consumers and derive more than 25% of their gross revenue from the sale of personal data. Instead, the law applies to persons that “offer consumers’ personal data for sale in trade or commerce.”

Third, entities that control or process consumers’ sensitive data are subject to the law unless that data is used solely for purposes of completing a payment transaction (a necessary exception given the expansion of the definition of sensitive data discussed below).

What’s the Context

Connecticut was the fifth state to pass a consumer data privacy law, doing so in 2022. When the bill was first introduced, the threshold was set at 65,000 consumers to reflect Connecticut’s smaller population; however, it was raised to 100,000 consumers before final passage. Language also was added to exclude personal data controlled or processed solely for the purpose of completing a payment transaction. These changes were made to address concerns that the law would unnecessarily burden small businesses.

It was not until Montana passed its privacy law the following year that a state adjusted the consumer threshold to account for having a smaller state population. Montana lowered the threshold to 50,000 (and this year adjusted it down to 25,000 consumers). Since Montana enacted its law, Maryland, Delaware, New Hampshire, and Rhode Island have enacted laws with a lower 35,000 consumer threshold.

The extension of applicability to entities that control or process sensitive data or sell personal data is notable and unique. Texas’s and Nebraska’s laws apply to entities that sell personal data and are not small businesses, while Rhode Island’s privacy law has a unique applicability section, as we explain here. No other state has structured its applicability like Connecticut.

Companies and privacy professionals will need to think through what it means to “offer consumers’ personal data for sale in trade or commerce.” For example, would the use of online tracking technologies (a sale under the CCPA) alone trigger the law’s applicability?

Exemptions

What Changed

The bill removes the entity-level exemption for GLBA regulated entities but keeps the data-level exemption (moving it to the next paragraph of the section). The entity-level exemption is replaced with exemptions for insurers, health carriers, insurance-support organizations, banks, credit unions and agents, broker-dealers, and investment advisers or their agents. There is qualifying language for each of those exemptions, so it will be important for organizations to review the specific wording when determining if they are exempt. The bill also adds an exemption for political activities. Of note, earlier drafts of the bill removed the exemption for nonprofits; however, that change was deleted in the final bill.

What’s the Context

The change to the GLBA exemption is similar in kind to the laws in Oregon and Minnesota, as well as to an amendment made to Montana’s law this year. The GLBA entity-level exemption has always been a source of friction. Colorado Senate Majority Leader Robert Rodriguez has repeatedly stated that he regrets including it when drafting the Colorado Privacy Act. California remains the only state with just a data-level GLBA exemption. The trend toward limiting the entity-level exemption to entities subject to state financial laws started with Oregon. The concern there was that car dealerships were entirely exempt just because they had limited operations subject to GLBA (i.e., offering car loans). FinTech companies also are potentially in scope with the change.

The addition of an exemption for political organizations follows a somewhat similar amendment made to Virginia’s law that was also adopted in Texas’s law. The concern from political organizations is that, although they (and their vendors) were never supposed to be subject to consumer privacy laws, that exclusion has become ambiguous over time.

Consumer Rights

What Changed

The bill modifies the consumer rights section in several ways.

First, the right to access is amended to include the right to access “inferences about the consumer derived from personal data.” Such inferences arguably were already covered by the right to access, but the amendment makes that explicit.

The right to access also is amended to include “whether a controller or processor is processing a consumer’s personal data for the purposes of profiling to make a decision that produces any legal or similarly significant effect concerning a consumer.” We discuss this and other changes to the law’s profiling provisions below.

Further, the law was amended to provide that a controller is prohibited from disclosing certain types of personal data in response to an access request and instead must inform the consumer that the controller has collected such personal data. Those data elements are (1) the consumer’s Social Security number; (2) the consumer’s driver’s license number, state identification card number or other government-issued identification number; (3) the consumer’s financial account number; (4) the consumer’s health insurance identification number or medical identification number; (5) the consumer’s account password; (6) the consumer’s security question or answer thereto; and (7) the consumer’s biometric data.

Finally, the bill adds the right for consumers to “obtain from the controller a list of the third parties to which such controller has sold the consumer’s personal data or, if such controller does not maintain a list of the third parties to which such controller has sold the consumer’s personal data, a list of all third parties to which the controller has sold personal data, provided the controller shall not be required to reveal any trade secret.”

What’s the Context

Limiting the right to access by prohibiting the disclosure of certain types of data is necessary to prevent data breaches. This limitation first appeared in the CCPA regulations drafted years ago by the Attorney General’s office and currently can be found at regulation 7024(d). In the rulemaking comments, the Attorney General’s office explained that the risk of an access request being used to perpetrate a data breach outweighed the need to provide this type of information to consumers; instead, a business could simply disclose that it has the information. Stated differently, there really is no need to give consumers their Social Security numbers in response to access requests, given that is information they already have.

This prohibition also was incorporated into the Colorado Privacy Act rules at Rule 4.04E. However, it was not until Minnesota passed its consumer data privacy law last year that a state specifically included it in its statute. This year, Montana and Connecticut joined Minnesota in codifying the prohibition. Over the coming years, hopefully more states will add this common-sense provision, which benefits both consumers and companies.

Finally, the addition of the consumer right to obtain a list of third parties to whom the controller has sold the consumer’s personal data is inspired by similar provisions in Oregon and Minnesota’s laws.

Sensitive Data

What Changed

The bill expands the law’s definition of sensitive data to include (1) disability or treatment; (2) status as nonbinary or transgender; (3) neural data; (4) a consumer’s financial account number, financial account log-in information or credit card or debit card number that, in combination with any required access or security code, password or credential, would allow access to a consumer’s financial account; and (5) government-issued identification number, including, but not limited to, Social Security number, passport number, state identification card number or driver’s license number, that applicable law does not require to be publicly displayed. The bill defines “neural data” as “any information that is generated by measuring the activity of an individual’s central nervous system.”

The amendment also modifies the biometric and genetic information section by removing the requirement that such information be used to uniquely identify an individual and adding the phrase “or information derived therefrom.” That said, the definition of biometric data still requires that it be used to identify individuals. Ultimately, the law’s approach to biometric data remains unchanged.

Finally, the law prohibits controllers from selling sensitive data without a consumer’s consent.

What’s the Context

This is the second time Senator Maroney has expanded the definition of sensitive data in the three years since the law passed. In 2023, Senator Maroney added consumer health data and data concerning an individual’s status as a victim of crime. The addition of consumer health data was in response to the U.S. Supreme Court overturning Roe v. Wade.

The most recent changes update the law to reflect data elements recently added in Oregon, Colorado, Delaware, Maryland, and New Jersey. Of note, while the bill adds financial information, it is more restrictive than New Jersey’s law, which uses the open-ended phrase “financial information, including.” New Jersey’s approach unnecessarily makes all financial information sensitive, whereas Connecticut’s change is more narrowly tailored to apply to specific types of financial information.

The bill’s definition of neural data also avoids the problem created by Colorado’s 2024 amendment to the Colorado Privacy Act, which requires neural data be used to identify individuals. That qualification has rendered the Colorado amendment superfluous.

The continual expansion of the definition of sensitive data is one way in which U.S. state privacy law has outpaced the GDPR, whose Article 9 opt-in requirement is now much narrower than U.S. privacy law. On the other hand, it could be argued that this continual expansion is slowly turning U.S. privacy law into a consent-based regime.

Data Minimization

What Changed

The law’s data minimization language now provides that a controller shall limit “the collection of personal data to what is reasonably necessary and proportionate in relation to the purposes for which such data are processed.” The previous language said that the collection must be limited to what is “adequate, relevant and reasonably necessary” to the purposes.

Further, the law now specifies that, absent consent, controllers cannot process consumers’ personal data for “any material new purpose that is neither reasonably necessary to, nor compatible with, the purposes that were disclosed to the consumer.” Borrowing concepts from California regulation 7002, the law goes on to explain that controllers must take into account “(i) the consumer’s reasonable expectation regarding such personal data at the time such personal data were collected based on the purposes that were disclosed to the consumer . . . , (ii) the relationship that such new purpose bears to the purposes that were disclosed to the consumer . . . , (iii) the impact that processing such personal data for such new purpose might have on the consumer, (iv) the relationship between the consumer and the controller and the context in which the personal data were collected, and (v) the existence of additional safeguards, including, but not limited to, encryption or pseudonymization, in processing such personal data for such new purpose.”

Finally, the law tweaks the sensitive data provision. Controllers still need to obtain consent to collect sensitive data, but their processing of sensitive data also must be “reasonably necessary in relation to the purposes for which such sensitive data are processed.”

What’s the Context

Over the past two legislative sessions, data minimization has become a hot-button issue. Privacy advocates believe consumer data privacy laws place too much responsibility on consumers to read disclosures and, if applicable, consent to processing activities, and not enough responsibility on companies to limit their collection and processing activities to what consumers reasonably expect. Last year, Maryland’s privacy law created a new data minimization structure, requiring (for example) that the processing of sensitive data be strictly necessary for a product or service requested by a consumer and rejecting the prevailing consent-based approach. However, since its passage, that approach has been the subject of extensive criticism.

The Connecticut amendments try to address this issue in two ways. First, the collection language is slightly changed such that collection must be necessary and proportionate to the purposes for which the data are processed, as disclosed to the consumer. The use of the word “proportionate” is notable here, as it means that controllers need to collect only what they need for the purposes identified in their notices, although, arguably, this was already reflected in the existing language. Ultimately, this means that controllers need to be careful and thoughtful about what they say in their privacy notices.

The real change is made to the secondary purposes section. In that regard, if controllers want to process data for “any material new purpose” that is neither reasonably necessary to, nor compatible with, the purposes disclosed to the consumer, they must obtain consent or ensure that the new purpose is, for example, consistent with consumer expectations, the relationship with the consumer, and the context of the collection. In so doing, the Connecticut law borrows concepts from California’s regulation 7002.

The Connecticut amendments also try to address the concern with relying on consumer consent for the processing of sensitive data by adding the requirement that such processing also must be “reasonably necessary in relation to the purposes for which such sensitive data are processed.” In so doing, the amendments attempt to create a standard that controllers can understand, rather than using terms such as “strictly necessary” that are so ambiguous they arguably could never be enforced. It also should be kept in mind that sensitive data is personal data, such that the data minimization requirements discussed above also apply to the processing of sensitive data. In the end, the amendments provide Connecticut’s attorney general with ample language to enforce against controllers the office believes are engaging in inappropriate data collection and processing activities.

Ultimately, however, this issue is certain to linger as other states continue to look at data minimization language. For those wanting even more context, the Future of Privacy Forum’s Jordan Francis recently published an article providing extensive background and analysis on the issue.

Profiling, AI, and Impact Assessments

What Changed

As a starting point, the pre-amended law allowed consumers to opt out of “profiling in furtherance of solely automated decisions that produce legal or similarly significant effects concerning the consumer.” The law defined “legal or similarly significant effects concerning the consumer” as “decisions made by the controller that result in the provision or denial by the controller of financial or lending services, housing, insurance, education enrollment or opportunity, criminal justice, employment opportunities, health care services or access to essential goods or services.”

It also is important to understand the relationship between profiling, automated decision-making, and artificial intelligence. The California Privacy Protection Agency addressed this in its automated decision-making technology (ADMT) rulemaking, stating: “ADMT is technology that makes decisions, or that a person relies upon to make a decision. It includes ‘profiling,’ which generally refers to evaluating consumers by automated means (e.g., using technology to analyze their personality, interests, behavior, or location). Artificial intelligence (AI) can be ADMT, but not all AI is ADMT.” Stated differently, profiling includes (but is not limited to) the use of AI to make decisions concerning, for example, financial lending, employment opportunities, or health care services.

With that background, we turn to the amendments. The first change is that the word “solely” is replaced with the word “any.” In other words, the pre-amended law allowed an opt out only when a human was not involved (i.e., solely automated decisions), whereas the amended law broadens the right to include instances when humans are involved (i.e., any automated decision). This change aligns Connecticut with the Colorado approach to profiling. It also is the reason the Colorado Privacy Act rules create a structure in which the role of human involvement (e.g., no involvement, rubber-stamping an automated decision, or having the ability to reject an automated decision) is tethered to the types of disclosures and rights controllers need to provide to consumers.

Second, the definition of legal or similarly significant effect is revised to include “any decision made by the controller, or on behalf of the controller.” This clarifies that the law applies not only to final decisions made by the controller but also to pipeline decisions, i.e., decisions leading up to the final decision, including decisions made by processors that the controller relies on. Whether the right to opt out applies to pipeline decisions was an issue raised by stakeholders during the Colorado Privacy Act rulemaking, but it was never addressed in the final rules.

Further, the definition removes “access to essential goods or services.” That is a phrase that has traditionally received pushback for being ambiguous. For example, is the selling of clothing or food an essential good or service?

Third, the right to access is expanded to include “whether a controller or processor is processing a consumer’s personal data for the purposes of profiling to make a decision that produces any legal or similarly significant effect concerning a consumer.”

Fourth, if a consumer’s personal data is used for covered profiling, consumers have a new right to, if feasible, “(A) question the result of such profiling, (B) be informed of the reason that such profiling resulted in such decision, (C) review the consumer’s personal data that were processed for the purposes of such profiling, and (D) if the profiling decision concerned housing, taking into account the nature of the personal data and the purposes for which such personal data were processed, allow the consumer to correct any incorrect personal data that were processed for the purposes of such profiling and have the profiling decision reevaluated based on the corrected personal data.”

This language is inspired by language in Minnesota’s privacy law, but the two provisions are not identical. Minnesota’s language says that consumers have the right to be informed of what actions the consumer might have taken to secure a different decision and the actions that the consumer might take to secure a different decision in the future. Minnesota’s language also extends the right to correction to all profiling, not just housing.

Another related change is to the privacy notice provisions, where Senator Maroney added a requirement that controllers provide “a statement disclosing whether the controller collects, uses or sells personal data for the purpose of training large language models.” In prior drafts, this requirement extended to the use of AI generally. However, that language was too broad, so it was limited to large language models.

In addition, controllers that engage in profiling for the purpose of making a decision that produces any legal or similarly significant effect concerning consumers are required to conduct an impact assessment. That impact assessment must consider, among other things, the intended use cases, deployment context, and benefits of the profiling; whether the profiling poses any foreseeable heightened risk of harm to consumers and, if so, the steps taken to mitigate that risk; the inputs the profiling processes and the outputs it produces; and whether the controller discloses the use of profiling to consumers. This language is taken from Senator Maroney’s algorithmic discrimination bill, which sought to impose this obligation on deployers of high-risk AI systems. Further, processors are required to provide the information necessary for controllers to complete impact assessments, and the attorney general can request impact assessments just as it can request data protection assessments. The bill also adds requirements for conducting impact assessments when engaging in profiling involving minors’ personal data.

Finally, the law includes new language allowing controllers to conduct bias auditing and encourages them to do so by allowing them to use such auditing as a defense against claims that they violated state laws prohibiting unlawful discrimination against consumers. These provisions are necessary because the types of data used in bias auditing can be sensitive under the Connecticut privacy law (e.g., race, disability, and sexual orientation) and, therefore, would ordinarily require consent for processing. Consequently, these new provisions allow controllers to use this sensitive data without obtaining consumer consent to ensure that their products and services do not discriminate against individuals.

What’s the Context

At core, these modifications are an attempt to augment the law’s rights and obligations regarding profiling and, functionally, to regulate AI through the state’s privacy law instead of through a separate AI-specific law. That said, the right to opt out of profiling has always been an unfulfilled promise in state privacy laws because the vast majority of entities that engage in profiling, and much of the personal data used in profiling, are exempt from the laws. For example, the Connecticut law allows consumers to opt out of profiling for employment opportunities, but the law exempts employee data. Consumers also can opt out of profiling for financial or lending services, but the law (as previously written) contained both GLBA entity- and data-level exemptions. The same is true for insurance, education, criminal justice, and health care. Housing and essential goods or services are likely the only profiling processing activities that are not exempt (although the amendments remove essential goods or services).

Arguably, with the bill’s change to its GLBA entity-level exemption, there could be more activities engaged in by FinTech companies that are now subject to the right to opt out of profiling. This will heavily depend on how those entities are regulated and what types of data they use in profiling. Housing also remains subject to the right to opt out of profiling and entities operating in that area will need to examine whether they have new obligations. Further, with the broadening of the law to pipeline decisions, it is possible that more types of processing activities will be covered.

Finally, as noted, the right to opt out of profiling also extends to decisions that result in the provision or denial by the controller of “any health care service.” The law contains entity-level exemptions for HIPAA covered entities and business associates, as well as data-level exemptions for HIPAA protected health information and other health data. That said, the law was amended in 2023 to regulate “consumer health data.” Ultimately, the law does not define “health care service” and, therefore, it is possible the right to opt out of profiling could be interpreted more broadly than traditional health-related data covered by HIPAA and extend to adjacent health care services such as those covered by Washington’s My Health My Data Act. This is another gray area that entities in that space will need to explore closely.

Privacy Notices

What Changed

The amendments first clean up the privacy notice disclosure requirements by integrating language that appears later in the law regarding the disclosure of consumer rights in the privacy notice and disclosure of selling and targeted advertising activities. The amendments also add requirements to include the most recent month and year when the privacy notice was updated and, as previously discussed, include a statement disclosing whether the controller collects, uses or sells personal data for the purposes of training large language models.

Further, the amendments remove the word “shares” and replace it with the word “sells.” The word “shares” was undefined and ambiguous and also created interoperability issues with California’s use of the word in the CCPA.

The amendments also make two changes to facilitate interoperability with other state privacy laws. First, language was added to clarify that the privacy notice should be provided through a hyperlink that includes the word “privacy” on the controller’s homepage or, for apps, on the app store and settings menu. The notice also must be provided in each language in which the controller does business and be accessible to individuals with disabilities.

Second, the amendments provide that the law does not require a controller “to provide a privacy notice that is specific to this state if the controller provides a generally applicable privacy notice that satisfies the” law’s requirements.

Finally, the law adds a requirement that, if a controller makes any retroactive material change to its privacy notice or practices, it must notify consumers with respect to any personal data to be collected after the effective date of the material change and, for data collected prior to the change, provide an opportunity for consumers to “withdraw consent to any further and materially different collection, processing or transfer of previously collected data.”

What’s the Context

The interoperability changes are intended to ensure that controllers do not need to provide any special links on their websites or have separate sections of their privacy notices that are specific to Connecticut residents. The language regarding changes to privacy practices is taken from Minnesota’s law, and a similar change was made in Montana’s law this year. However, controllers should keep in mind FTC guidance on making privacy notice changes as well as Colorado’s requirements.

Minors’ Privacy Provisions

Under the pre-amended law, a controller that offered an online service, product or feature to consumers it knows or willfully disregards are 13 to 17 years old needed to obtain the minor’s consent to sell personal data, engage in targeted advertising, or engage in profiling. For minors under 13, the controller needed to obtain parental consent. The amended law now states that a controller cannot sell such personal data or engage in targeted advertising regardless of whether it obtains consent. Controllers still must obtain consent to engage in profiling and also have a new obligation to conduct an impact assessment in addition to a data protection assessment.

The amendments also remove the prohibition on using any system design feature to significantly increase, sustain or extend a minor’s use of an online service, product or feature. They also make changes to the provision addressing how adults can interact with minors on covered platforms.

Finally, the amendments both limit and expand the definition of “heightened risk of harm to minors.” They limit the current definition by adding the word “material” to “financial, physical, or reputational injury to minors” and to “physical or other intrusion upon the solitude or seclusion, or private affairs or concerns of minors.” They expand the current definition by adding any physical violence against minors; any material harassment of minors on any online service, product or feature, where such harassment is severe, pervasive or objectively offensive to a reasonable person; and any sexual abuse or sexual exploitation of minors.

What’s the Context

Changing the law’s approach to the sale of minors’ personal data and targeted advertising from consent to a straight prohibition follows a recent trend in some state data privacy laws. Maryland’s law, passed last year, contains similar provisions. And, this year, Oregon amended its law to prohibit the sale of personal data if a controller has actual knowledge, or willfully disregards, that the consumer is under 16 years of age.

It remains to be seen whether companies will argue that the changes are unconstitutional. For example, the law defines minors as consumers under 18 years of age, which includes those under 13. That creates overlap with COPPA, which allows parents to consent to processing activities involving their children. Although state laws can supplement COPPA, they cannot conflict with its provisions. See Jones v. Google LLC, 73 F.4th 636 (9th Cir. 2023) (“We hold that COPPA’s preemption clause does not bar state-law causes of action that are parallel to, or proscribe the same conduct forbidden by, COPPA”); see also COPPA, 15 U.S.C. § 6502(d) (“No State or local government may impose any liability for commercial activities or actions by operators in interstate or foreign commerce in connection with an activity or action described in this chapter that is inconsistent with the treatment of those activities or actions under this section”).

A Texas court also recently held that limitations on targeted advertising to teens under the Texas SCOPE Act were unconstitutional under the First Amendment. Students Engaged in Advancing Texas v. Paxton, 24-cv-00945-RP, *26 (Feb. 7, 2025) (“Turning to the targeted advertising requirements, the Court finds them likely to fail strict scrutiny for similar reasons. First, Paxton specified no compelling state interest. As with the monitoring-and-filtering requirements, there may exist a compelling state interest in promoting teen mental health by limiting teens’ exposure to certain advertising, but nowhere does the Court see a specific interest articulated for removing teens’ access to targeted advertising of all kinds, much less a compelling one, as is required.”).

That said, these remain debatable constitutional issues and ones that Connecticut elected officials could have decided were worth fighting for.

Data Broker Registration

Earlier versions of the bill contained a data broker registration requirement; however, it was removed from the final version.

David Stauss

David is leader of Husch Blackwell’s privacy and cybersecurity practice group. He routinely counsels clients on responding to data breaches, complying with privacy laws such as GDPR and the California Consumer Privacy Act, and complying with information security statutes. He also represents clients in data security-related litigation. David is certified by the International Association of Privacy Professionals as a Privacy Law Specialist, Certified Information Privacy Professional (US), Certified Information Privacy Technologist, and Fellow of Information Privacy.

Shelby Dolen

Clients and legal teams appreciate Shelby’s passion for the law as it relates to protecting technology and company assets. She regularly monitors and researches fast-changing consumer privacy laws, with the understanding that critical strategy and success for any business includes oversight of data privacy policies and intellectual property portfolios.

Marlaina Pinto

Marlaina Pinto is a summer associate at Husch Blackwell.

