The Verdict Was Always Coming
A jury just told Meta and Google what parents already knew — their platforms are unsafe for children. We built Kidoodle.TV because we believed it didn't have to be this way.
Six million dollars.
That is what a Los Angeles jury decided Meta and Google owe a young woman named Kaley, who started using YouTube at age 6 and Instagram at 11 — and spent the years that followed battling depression, body dysmorphia, and an inability to put her phone down.
The verdict came down yesterday, March 25th. And a day before that, a separate jury in New Mexico slapped Meta with $375 million for violating child safety laws. Two verdicts in two days. After years of hand-wringing, congressional hearings, and sternly worded letters that changed nothing, a courtroom finally said what millions of parents have been saying all along: these platforms are not safe for children.
“Bring Them In as Tweens”
What made the Los Angeles trial so damning was not just the outcome — it was the evidence. Internal Meta documents revealed a deliberate strategy to hook young users early. One memo, entered into the court record, put it plainly: “If we wanna win big with teens, we must bring them in as tweens.”
Think about that sentence for a moment. This is not a company failing to anticipate harm. This is a company engineering for it. Infinite scroll, autoplay, push notifications, beauty filters, the dopamine loop of likes — plaintiff’s attorney Mark Lanier called it “the engineering of addiction.” The jury agreed. They found that the defective design of these platforms, not the content users posted, caused the harm. That distinction matters. It means Section 230 — the shield Big Tech has hidden behind for decades — did not apply.
One juror, Victoria, told reporters: “We wanted them to feel it. We wanted them to realize this was unacceptable.”
This case is a bellwether. There are roughly 2,000 more lawsuits lined up behind it — from parents, from school districts, from communities that watched a generation of kids disappear into their screens.
We Built the Other Way
I work at A Parent Media Co. We built Kidoodle.TV — and we built it because we believed from day one that children’s media did not have to work this way.
Kidoodle.TV is not an algorithm optimized for engagement. It is a streaming platform where every single piece of content is reviewed by a human being before a child ever sees it. Not flagged by an algorithm after the fact. Not moderated by community reports. Reviewed. In advance. By real people — many of them parents and grandparents — who watch every episode and apply standards rooted in child development, not advertising revenue.
There are no comment sections. No push notifications designed to pull a kid back in. No beauty filters. No likes. No follower counts. No algorithmic rabbit holes that start with a cartoon and end somewhere no child should be.
The platform is COPPA-compliant and carries the kidSAFE+ Seal — one of the most rigorous independent certifications for children’s digital products. It is available in over 160 countries on more than 1,000 devices. And it has been doing this for over a decade, long before the lawsuits, long before the congressional hearings, long before “child safety” became a talking point in every tech company’s quarterly earnings call.
We did not retrofit safety onto a platform built to maximize time-on-screen. We started with safety and built everything else around it.
The Difference Is the Starting Point
The Meta memo — “bring them in as tweens” — tells you everything you need to know about where that company’s design process begins. It begins with acquisition. With engagement. With retention metrics that treat a child’s attention as a resource to be extracted.
Our starting point was different. It was a question: What does a streaming platform look like when the first design constraint is that a child must be safe?
That question changes everything. It changes what content gets on the platform. It changes what features you build — and, more importantly, what features you choose not to build. It changes how you measure success. We do not measure success in time-on-screen. We measure it in trust — the trust of the parents who hand their child a device and walk into the next room.
Yesterday’s verdict will be appealed. Meta and Google will spend years and hundreds of millions of dollars fighting it. But in the court of public opinion, the question was settled long ago: parents know these platforms are not safe for their kids.
The harder question — the one that matters more — is whether the industry is willing to build differently.
We were. We did. And we have been for more than a decade.