Research-based Curation

Raising digital citizens.

Evidence-based strategies to help your children navigate the digital landscape with confidence and agency.

Core Evidence

01
Joint Media Engagement

Play with your kids

Research from the Joan Ganz Cooney Center shows that children learn significantly more when parents engage with them during screen time. Co-playing fosters 'Joint Media Engagement,' turning passive consumption into a social learning experience.

Learn more
02
The Three C's

Content, Context, Child

Digital safety isn't just about time limits. Lisa Guernsey's framework suggests looking at: Content (Is it high quality?), Context (Does it support social interaction?), and the Child (Does it match their unique needs?).

Read full study
03
Data Agency

Privacy Literacy

A 2024 Harvard study emphasizes that teaching children 'Privacy Literacy' early—understanding that their data is a valuable asset—is the most effective defense against predatory algorithms and data tracking.

Harvard research

The Sandbox Philosophy

Move from gatekeeper to digital mentor.

A digital sandbox is a curated environment where boundaries come not from blocks but from expert selection. By providing only high-quality, vetted applications, you give children room to explore safely.


Digital Milestones

0-3 Years

Sensory Exploration

Focus on high contrast, slow-paced audio, and tactile interaction. No aggressive loops or flashing lights.

4-6 Years

Constructive Play

Storytelling, painting, and basic logic. Screens should be used for 'making,' not just 'watching.'

7-9 Years

Collaborative Logic

Complex puzzles, coding logic, and multiplayer co-operation in strictly moderated environments.

10+ Years

Critical Inquiry

Discussions about algorithms, misinformation, and the ethics of digital identity and data sharing.

Legal newsflash

A turning point for children's online safety

In March 2026, a landmark legal decision shook the tech world: Meta Platforms was found liable for failing to protect children on its platforms. Courts are now starting to say clearly that online safety for children cannot remain an afterthought.

A jury in New Mexico ordered Meta to pay $375 million after finding that the company failed to protect children, misled users about safety, and allowed harmful content and interactions to spread. Together with another U.S. case linking platform design to addiction and mental health harm, these rulings mark a major shift in how responsibility is assigned.


Why this matters

  • This is not just about one company; it is about how the internet works for kids.
  • The court found that platforms prioritized engagement and growth over safety, while algorithms connected minors to harmful content and people.
  • Safety systems were found to be insufficient or misleading.
  • At the same time, a separate Los Angeles case found that platforms such as Instagram and YouTube contributed to addiction and mental health harm, awarding damages to a young user.

What risks the lawsuits highlight

  • Addictive design such as infinite scroll, autoplay, and algorithmic feeds can keep children engaged far beyond healthy use.
  • Children can be exposed to sexual content, violence, unrealistic body ideals, and dangerous trends.
  • Investigations showed that test child accounts were quickly targeted by predators and explicit messages.
  • Cases linked social media use to anxiety, depression, sleep disruption, and lower self-esteem.

What this means for parents in Sweden and Europe

  • Even though these cases happened in the U.S., the impact is global.
  • Europe is already seeing stronger digital safety rules, growing calls for age verification, and more pressure on platforms to be safe by design.
  • That likely means more parental control tools, more restrictions on children's accounts, and clearer warnings about app risks.
  • Parents in Sweden and across Europe should expect these questions to become more visible in both regulation and product design.

What parents can do right now

  • Choose safer apps with no open chat with strangers, stronger moderation, and age-appropriate content.
  • Delay social media platforms like Instagram or TikTok for as long as possible if the child is not ready.
  • Talk openly about online risks, teach critical thinking, and encourage children to report problems.
  • Use parental controls for screen time, content access, and app downloads while laws and platform rules catch up.

Why Safe Apps for Kids exists

  • Our goal is simple: help parents find apps that are actually safe, not just popular.
  • Big platforms are not always designed with children in mind.
  • Free apps often come with hidden risks.
  • Parents need independent, trustworthy guidance.

What happens next?

  • This is likely just the beginning.
  • Thousands of similar lawsuits are already underway.
  • There are growing calls for laws such as the Kids Online Safety Act and more pressure on Big Tech to redesign its platforms.
  • Some experts are already calling this a "Big Tobacco moment" for tech.

Final takeaway

For years, most responsibility was placed on parents. Now courts are beginning to say something bigger: tech companies must share that responsibility.