

The Kids Aren’t Alright: A Threat Intel Dad’s View of the Internet

September 2, 2025

Alex Gartner

 

A few days ago, my wife and I welcomed our second son into the world. He’s beautiful, healthy, vulnerable in all the expected ways. But as I hold him, I find myself thinking less about wet diapers and hunger cues and more about the world he’s about to inherit.

Because even in the hospital, I was surrounded by internet-connected devices, and humans addicted to them. Myself, nurses, family members. 

What happens when he goes online for the first time? I can’t even open YouTube for his older brother without sifting through AI-generated “readings” of The Very Hungry Caterpillar.

Will I ever feel comfortable removing my self-imposed restrictions on their access?

It’s like the story of Noah’s Ark. A world-ending flood is in their future, and I’m already loading the boat: two video games without microtransactions, two productivity apps without predatory privacy policies…

As someone who works in Threat Intelligence, I can’t shake the concern that we are failing to protect the next generation.

The internet has become a central part of childhood, but it is far from a safe place. Within my team at Kandji, we’ve seen a sharp increase in phishing campaigns specifically targeting kids. Many of these arrive as PDFs promising free currency for games like Roblox or Minecraft. These aren’t generic scams. They’re carefully engineered social lures, tailored to what children care about most.

A sample PDF, advertising free “Pokemon Go” Pokecoins
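
To make that concrete, here is a minimal triage sketch, assuming a defender just wants a quick signal on a suspicious PDF. It is illustrative only, not Kandji’s detection logic: it scans a file’s raw bytes for embedded URLs that appear alongside “free game currency” bait terms. Real lures often hide text inside compressed PDF streams, so a proper parser would be needed in practice, but the shape of the check is the same.

```python
import re
import sys

# Toy triage sketch (illustrative only, not Kandji's detection logic).
# Flags PDFs whose raw bytes contain embedded URLs alongside common
# "free game currency" bait terms seen in kid-targeted lures.
BAIT_TERMS = re.compile(rb"free\s+(robux|v-?bucks|minecoins|poke\s?coins)", re.IGNORECASE)
URL_PATTERN = re.compile(rb"https?://[^\s<>]+", re.IGNORECASE)

def triage_pdf(path: str) -> None:
    with open(path, "rb") as f:
        data = f.read()

    urls = URL_PATTERN.findall(data)
    bait = {m.decode(errors="replace").lower() for m in BAIT_TERMS.findall(data)}

    if urls and bait:
        print(f"[!] {path}: {len(urls)} embedded URL(s), bait terms: {sorted(bait)}")
    else:
        print(f"[ ] {path}: no obvious lure indicators")

if __name__ == "__main__":
    for p in sys.argv[1:]:
        triage_pdf(p)
```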

When a twelve-year-old downloads one of these files and follows the link inside, the outcomes can be serious. They might give up passwords. They might be tricked into installing screen-sharing software. They might be coerced into turning on a webcam. There’ve been reports of threat actors impersonating children in chat apps to collect personal photos or social content, then threatening to share it if the child doesn’t comply. It is extortion, and it is happening quietly in bedrooms and school computer labs.

Kandji EDR’s quarantine detection

Worse still, the line between adult internet and child internet has become thinner than ever. Most children now have access to smartphones or laptops that are, effectively, unmanaged. They use the same apps and platforms as adults, with little to no differentiation in content, privacy protections, or threat modeling. Content that would once have been restricted to adult forums now surfaces in algorithmic feeds with no guardrails.

This is a direct result of Big Tech’s failure to proactively regulate its platforms. For years, they resisted accountability while profiting from engagement at all costs. The resulting flood of disinformation and erosion of public trust should have been the turning point.

But instead, we saw a strange reversal. Just as the world began to recognize the dangers, public pressure shifted. Somehow, platforms were branded as overreaching. Regulators backed off. And what little moderation existed became even more relaxed.

The result is an ecosystem where misinformation, exploitation, and predatory behavior not only persist, but often blend seamlessly into the digital experiences our children navigate every day. 

The tools kids use for homework, gaming, or chatting with friends are not neutral. They’re attack surfaces. And many of them are completely unmonitored.

We need to have an honest conversation about what this means, and who is responsible.

As cybersecurity professionals, we often focus on sophisticated malware, nation-state actors, and corporate compromise. But our responsibility extends beyond those high-value targets. We also have a duty to protect the weakest members of our digital society. That includes children, but it also includes parents, teachers, and the IT staff at underfunded schools who are often the last line of defense.

Security software has to be built with these use cases in mind. It needs to be accessible to schools and families, not just enterprises. But education is just as critical. We need to normalize conversations about online safety. Not just about stranger danger or social media filters, but about phishing tactics, manipulation campaigns, and the mechanics of scams. These are technical subjects, but they can be taught with clarity and compassion.

We should not wait until the damage is done to intervene. A child who has their privacy violated or their trust abused is not simply the victim of a “low-severity” attack. The effects can last for years. Online humiliation can follow a kid into high school or college. The psychological harm of being extorted, impersonated, or publicly exposed is real.

My sons will eventually be old enough to ask for iPads. They will want to play games, join chat servers, and watch videos. 

I will have to make choices about what protections I put in place, how early I talk to them about risks, and when I let go. Every parent will face that balancing act. But they deserve better tools and guidance than what most tech companies are offering today.

This is not a problem that can be solved by one sector alone. It requires cooperation between product teams, educators, policymakers, and the security industry. But threat intelligence professionals are uniquely positioned to spot early patterns, share them with the community, and build defenses that scale.

We must take that role seriously. Because whether or not we think of ourselves as educators or advocates, the work we do has direct implications for the safety of millions of children who don’t understand the risks they face.

If we want a future where kids can grow up exploring the internet instead of fearing it, then the work starts now. Not just in code, but in culture.

This article reflects the personal views of the author and does not necessarily represent those of Kandji.


About the Author

Alex Gartner is a father of two and a Cincinnati-based security professional. He serves as Sr. Engineering Manager of Security Research at Kandji, where he guides the discovery and analysis of emerging threats. Beyond his work in malware analysis and threat detection, Alex writes about the intersection of technology, safety, and family life.