On March 2, 2026, the UK Department for Science, Innovation and Technology (“DSIT”) launched its consultation, titled “Growing up in the online world: a national conversation”. The consultation is open until 26 May 2026, after which the government will publish a summary of responses and its proposed approach. DSIT has indicated that it intends to move quickly on the consultation’s findings, drawing on newly granted powers that allow for accelerated implementation of online safety measures.

The consultation seeks views on a wide range of potential measures to strengthen children’s safety and wellbeing online, including more robust age‑assurance mechanisms, a statutory minimum age for social media, raising the UK’s age of digital consent, restrictions on certain features (such as livestreaming and disappearing messages), and new obligations for AI chatbots and generative‑AI services.

DSIT’s proposals could significantly expand regulatory expectations beyond the Online Safety Act 2023 (“OSA”)—including potential age‑based access limits (including differing safeguards as between teens and younger children), feature‑level restrictions, and enhanced duties for AI‑enabled services. Early engagement will be important to ensure that the government takes account of the views of affected service providers and understands the operational and technical implications of the measures proposed.

Background

DSIT intends to examine whether further statutory measures are needed to strengthen protections for children online. This initiative comes on the heels of recent government reforms—including 48‑hour takedown requirements for the sharing of non‑consensual intimate images and new priority offences relating to self‑harm content, intimate image abuse, cyberflashing and nudification apps.

The consultation is organised around five themes: (i) understanding children’s use of technology; (ii) interventions to promote safer, more positive online experiences; (iii) compliance with and enforcement of online safety rules; (iv) preparing children for a digital future; and (v) support for families. DSIT positions these themes within the existing regulatory framework—the OSA, the ICO’s Children’s Code, and the UK GDPR. DSIT asks whether further statutory guardrails are required, particularly around platform design features, high-risk functionalities, and generative AI tools.

Age Thresholds and Access Controls

DSIT is consulting on a statutory minimum age of access for social media services (ages 13, 14, 15, or 16). It also asks whether the UK GDPR’s age of digital consent (currently set at 13) should be increased, noting that any increase would require more robust parental verification and expanded compliance measures. DSIT notes that strengthened age‑assurance expectations—including distinguishing users in the 13–16 age range—may require providers to adopt layered or adaptive age‑estimation and verification methods that are more robust than those currently used across many services. DSIT also recognizes that age-assurance requirements would require adults to complete age checks so that services can distinguish them from children.

The consultation further raises the possibility of imposing age‑based restrictions for high‑risk functionalities such as livestreaming, disappearing content, location sharing, and interactions with unknown users (also called “stranger pairing”). This signals a potential shift toward feature‑level controls in addition to service‑level access rules.

Persuasive and Compulsive Design

DSIT is also examining design mechanisms that can encourage prolonged use among young people and is seeking evidence on whether defaults or friction‑based interventions are needed to reduce compulsive use patterns. DSIT asks respondents to identify which persuasive features are most influential for children (e.g., infinite scroll, autoplay, ‘likes,’ etc.). DSIT also asks whether platforms should be required to impose time‑based interventions for children—such as defined daily limits or overnight curfews on social media services.

While the ICO’s Children’s Code already discourages nudge techniques that exploit children’s vulnerabilities, DSIT is considering moving towards more prescriptive requirements. These could include age‑based restrictions on persuasive design features and limits on personalised algorithms for teenagers. DSIT’s focus extends beyond harmful content: it is assessing whether content recommendation algorithms or personalised content feeds (regardless of the material surfaced) should be age-restricted because of their potential to drive compulsive use. The consultation also asks whether any such new requirements should apply beyond traditional social media to gaming services, livestreaming, and creative platforms.

Chatbots and AI

The consultation includes a dedicated section on AI chatbots and generative AI tools, which children increasingly use as a search tool, for educational purposes and for creative activities. DSIT highlights several emerging risks associated with these tools, including highly personalised conversational behaviour, the potential for emotional dependency, and exposure to inaccurate or inappropriate outputs that cannot be reliably moderated. DSIT also raises concerns about anthropomorphic design, empathy mimicking and conversational memory – features that may blur the boundaries between human and machine interaction. DSIT says it is particularly concerned about emerging evidence of children forming parasocial attachments to chatbots, which may necessitate limits on design features that mimic friendship, empathy or emotional reciprocity.

DSIT notes that many AI tools fall outside the current scope of the OSA because they do not enable user‑to‑user interactions, prompting DSIT to consider whether additional safeguards, minimum‑age controls or feature‑specific restrictions should apply to AI services. DSIT also notes that the government intends to introduce new powers to bring certain AI chatbots within scope of illegal‑content duties under the OSA, signaling further regulatory expansion.

Compliance, Enforcement and Digital Skills

Alongside substantive proposals, the consultation seeks views on how any new requirements should be enforced, particularly through more reliable, proportionate and privacy‑preserving age assurance. DSIT indicates support for a layered or successive validation approach where age estimation is the default and verification is only escalated when confidence is low. The consultation emphasizes that effective enforcement depends on age assurance that is accurate, trustworthy, user-friendly and robust against circumvention. DSIT is therefore gathering evidence on the ways children bypass existing checks, including through account‑sharing, the use of Virtual Private Networks (VPNs) and non-technical methods.

Beyond compliance and enforcement, the consultation examines how to prepare children for a digital future, including strengthening digital and media literacy, clearer guidance for parents and carers, and greater visibility of high‑quality and age‑appropriate content. DSIT emphasizes that regulatory controls must be paired with education, parental support and platform‑level clarity so that children develop the skills and resilience needed to navigate digital environments safely and confidently.

Next Steps

DSIT has already signaled that it intends to act quickly on the consultation’s findings, drawing on recently granted powers that enable the government to pursue accelerated implementation of new online safety measures via secondary legislation. It is soliciting responses up until 26 May 2026, as well as offering parallel questionnaires for children and parents and a wider programme of structured engagement. Depending on the evidence received, the government’s response may include legislative changes or updates to existing regulatory frameworks such as the OSA or the Children’s Code. Companies likely to be affected by the proposals should assess potential implications early and consider contributing evidence to help shape the direction of the UK’s evolving online safety regime.

In parallel with this consultation, on March 12, 2026, the UK’s Ofcom and the ICO issued coordinated letters to several social media and video‑sharing platforms requesting that they move beyond self‑declaration and implement “highly effective” age checks, enforce their minimum age policies, and strengthen protections against grooming and harmful algorithmic recommendations. The contacted organizations must report to Ofcom by April 30, 2026, on how their approaches align with the OSA, while the ICO expects urgent improvements with respect to age assurance within two months. This intervention underscores the direction of travel set out in the DSIT consultation, with government and regulators converging on a more assertive, design‑focused model of child protection online.

This blog was written with the assistance of Zara Maruna, a trainee solicitor in our London office.

Jadzia Pierce

Jadzia Pierce advises clients developing and deploying technology on a range of regulatory matters, including the intersection of AI governance and data protection. Jadzia draws on her experience in senior in-house leadership roles and extensive, hands-on engagement with regulators worldwide. Prior to rejoining Covington in 2026, Jadzia served as Global Data Protection Officer at Microsoft, where she oversaw and advised on the company’s GDPR/UK GDPR program and acted as a primary point of contact for supervisory authorities on matters including AI, children’s data, advertising, and data subject rights.

Jadzia previously was Director of Microsoft’s Global Privacy Policy function and served as Associate General Counsel for Cybersecurity at McKinsey & Company. She began her career at Covington, advising Fortune 100 companies on privacy, cybersecurity, incident preparedness and response, investigations, and data-driven transactions.

At Covington, Jadzia helps clients operationalize defensible, scalable approaches to AI-enabled products and services, aligning privacy and security obligations with rapidly evolving regulatory frameworks across jurisdictions—with a particular focus on anticipating enforcement trends and navigating inter-regulator dynamics.

Dan Cooper

Daniel Cooper heads up the firm’s growing Data Privacy and Cybersecurity practice in London, and counsels clients in the information technology, pharmaceutical research, sports and financial services industries, among others, on European and UK data protection, data retention and freedom of information laws, as well as associated information technology and e-commerce laws and regulations. Mr. Cooper also regularly counsels clients with respect to Internet-related liabilities under European and US laws. Mr. Cooper sits on the advisory boards of a number of privacy NGOs, privacy think tanks, and related bodies.