The New Standard for YouTube Kids Content: What Dublin’s 2026 Summit Changed

“Growing Up in the Digital Age” Summit: the Takeaway for Kids’ Creators

Reading time: 13 min

Last updated: 30 Mar 2026


On March 11, 2026, Google’s Safety Engineering Center in Dublin hosted the “Growing Up in the Digital Age” Summit. The AIR team was there, spending hours with policymakers, child-safety experts, and the leadership of both Google and YouTube to see exactly what the future of YouTube Kids looks like for creators in 2026. If you’re building a kids’ or teen brand, here’s the full intel.

First things first: at this summit, both Google and YouTube showed, in very concrete terms, how child and teen safety is being built into their products by default. Dublin suggests that the era of kids’ content giving children the digital equivalent of a ‘sugar rush’ is ending.

Prioritizing Children’s Safety Online

The summit brought together experts from many fields: child-safety specialists, educators, policymakers, and the leadership of both Google and YouTube. All of them gathered around one idea: protecting kids without falling back on the easy solution of blanket restrictions.

This might sound simple in theory, but it’s quite tricky in actual practice.

Google’s position, repeated across the summit, is that young people need age-appropriate experiences, which can be achieved through stronger default guidance. As we all know, walling kids off from the digital world altogether works only in theory. In practice, it only incentivizes teens and kids to lie about their age online to gain access to ‘adult’ spaces.

From what was shared at the summit, we can tell where the platforms are heading: a stricter sorting process, in other words, more filtering of content for young audiences.

New Announcements: What You Need to Know

With the general picture out of the way, let’s take a closer look at the most important announcements to see the direction the online world will take to keep children safe:

“Safe by Default”: Strengthening the Baseline Protections

The first major topic at the summit was the “safe by default” baseline for under-18 users. On YouTube, users under 18 will automatically have their uploads set to private. Google and YouTube also said that for under-18 accounts, the SafeSearch feature will be enabled by default.

There will also be tools that remind younger users to take breaks and go to bed on time. These reminders will be turned on automatically as well.

Google also said the Gemini experience for minors will now include non-optional safeguards. For example, if a child talks to the AI, Gemini will recognize that the user is under 18 and avoid language that simulates intimacy, companionship, or human identity.

Tighter Parental Controls

Google is working to both tighten and simplify parental controls. In Dublin, the company presented improvements to Family Link: a simpler control layer where parents can manage device settings, usage summaries, and screen-time limits, all in one place.

On YouTube specifically, parents of teens with supervised accounts can already set limits on Shorts consumption, and YouTube says the ability to set that timer to zero is coming as an industry-first option.

$20 Million To Support Teens' Digital Wellbeing Worldwide

In Dublin, Google and YouTube announced a $20 million global teen digital wellbeing initiative. The funding will support a multilingual, open-source resource center and curriculum, backed by an Ipsos study of more than 9,500 teens across 36 markets.

The stated goal is to help teens, parents, caregivers, and educators deal with digital stress, build healthier habits, and understand how to interact with AI responsibly. YouTube creators are an explicit part of that delivery model, alongside nonprofits including Young Futures, Plan International, and the Center for Public Impact. 

What Counts as Appropriate for Teens

Back in January, YouTube introduced new principles and a creator guide for content that is “fun, age-appropriate, higher-quality, and more enriching.” In Dublin, the company doubled down on those guidelines and formalized the rules.

The strategic point was made clear: those principles inform the recommendation system, which means YouTube is using them to raise the frequency with which certain videos are shown to teens. 

If you want to know more about what is and isn’t appropriate for teens, read the guidelines YouTube has compiled; they’re free.

Here’s the gist of good vs. bad in teen content, pairing each low-quality principle with what to do instead:

  • Narrow body standards and comparisons. Instead, share helpful tutorials, DIY projects, skills, creativity, humor, or positive challenges that build confidence, uplift, and inspire teens.
  • Dangerous acts and negative behaviors. Instead, share clearly scripted skits or light-hearted pranks, as long as everyone’s in on the joke and no one gets hurt or embarrassed.
  • Bullying, hate, and disrespect. Instead, show certain types of criticism or make playful jokes about quirks, habits, or fictional scenarios, as long as they don’t promote, target, or harm an individual’s or a group’s appearance or identity.
  • Wealth obsession and misconceptions. Instead, talk about personal experiences with saving, spending, investing, or managing money responsibly, as long as you set realistic expectations.
  • Aggressive and intimidating behavior. Instead, show fictional violence in video games or scripted content with actors or participants who understand it’s staged, as long as it clearly doesn’t promote real-world harm.

Age Assurance According to the Content Risk

Now, onto the news that proved controversial among viewers. Google is indeed doubling down on a risk-based model for age assurance. The company argues the choice shouldn’t be between weak age gates that kids can lie their way past and invasive ID checks for everyone.

Instead, the level of age assurance should match the level of content risk. Google says it’s supporting interoperable standards and open-sourcing technology to enable more privacy-preserving age checks where needed. 

Age verification is here to stay, and it’s becoming more sophisticated. 

Acknowledging Nuance

Finally, Dublin pushed back against one-size-fits-all digital-restriction policies. Google’s recap, for example, argues that blanket bans can push young users toward less regulated spaces and strip away the parental controls and supervised experiences designed to keep them safer.

What This Means For YouTube Kids Creators

For the longest time, creators in kids’ and family content could say the right things about educational value while still building videos around overstimulation, repetition, frictionless autoplay logic, and thin narrative payoff. That has been getting harder and harder to sustain since COPPA 2.0 and GDPR-K were introduced.

Children’s safety is now a clear priority, and safe monetization practices for kids’ content are becoming central in 2026. The summit changed the framing around kids’ content and its safety, but the core best practices remain the same.

After working with over 3,000 channels, we know how to help kids’ channels adjust and stay on top of every change and shift in YouTube’s algorithms. Reach out to us for your own workshop on kids’ content quality.

Want a clear 34-rule roadmap? 

Download the Full PDF: The 34 Essential Rules for Kids’ Content Success in 2026

Prioritizing the Viewer’s Experience

YouTube puts the viewer’s experience first. Kids’ content is an environment, and it’s judged by whether it delivers good experiences for younger audiences.

That is why the Dublin messaging combined recommendation quality, parental controls, AI guardrails, screen-time tools, and age assurance into one conversation. From the platform’s point of view, these policies are a part of the same ecosystem. And that means creators will increasingly be evaluated not only on single-video compliance, but on whether their content feels aligned with the ecosystem YouTube wants parents and policymakers to trust.

The “Growing Up in the Digital Age” summit suggests the platform is trying to distinguish between content that merely holds a child’s attention and content that supports a child’s development. In practice, that means YouTube Kids will see less manipulative stimulation.

Teens vs. Kids: Differences in Strategy

Teens and kids no longer sit under a single umbrella “kids” category. YouTube’s teen principles are now more explicit, and they sit alongside the existing kids’ quality principles, which means creators in the broad “kids and family” lane need to get more precise about their targeting strategies.

A single video that works for every segment of YouTube Kids’ viewership is practically impossible to make. The platform is segmenting the experience more carefully, which means creators should, too. What’s more, teen-safety experts, educators, and policymakers are actively working to build the kind of online world that protects, respects, and empowers young people.

That means: 

  • Teens’ needs in the digital environment are being prioritized
  • Parental controls are getting stricter
  • There’s an active emphasis on safe generative AI for teens
  • Child sexual abuse and exploitation (CSAE) in generative AI is being targeted more than ever
  • Age verification across Google services is becoming a default
  • AI is seen as a future contributor to education (while not replacing human work)

AI in Kids’ Content: What’s Allowed and What Isn’t

Some people love it, others hate it, but there’s no denying that the future of YouTube Kids will involve AI. However, such content is about to be scrutinized through a whole different lens. The Gemini safeguards discussed in Dublin were specifically about conversational AI, but the principle extends further.

Both this and the latest AI-related updates on YouTube show a clear tendency: platforms are openly worried about minors forming emotional and intellectual dependence on AI systems. Mindless content that leans heavily on AI is already banned from the platform (or demonetized at the very least), which means YouTube Kids is likely to favor human-made, authentic content.

What’s Next: How to Adapt

Look at your channel the way a parent, a policy team, and a recommendation system would, and start asking yourself the important questions:

  • Does your content bring value to kids? 
  • Does the channel communicate age fit clearly? 
  • Does it feel coherent? 
  • Does it contribute something enriching? 

Dublin showed us that YouTube is building a new contract for youth content: more safety by default, more parental control, more explicit recommendation principles, smarter age assurance, and a stronger public argument that better digital experiences matter more than blanket restrictions. 

The new standard is curious, joyful, and real content. 

If you would like to adjust yours to fit the current standard, we are here to help!

