Generation Alpha’s coded language makes online bullying hard to detect
Adults and AI models fail to recognise messages with harmful intent expressed with Gen Alpha slang or memes, raising concerns about youngsters’ online safety
By Chris Stokel-Walker
25 June 2025
Teenagers’ language might make online bullying hard to detect
Vitapix/Getty Images
Generation Alpha’s internet lingo is mutating faster than teachers, parents and AI models can keep pace – potentially exposing youngsters to bullying and grooming that trusted adults and AI-based safety systems simply can’t see.
Manisha Mehta, a 14-year-old student at Warren E. Hyde Middle School in Cupertino, California, and Fausto Giunchiglia at the University of Trento, Italy, collated 100 expressions and phrases popular with Generation Alpha – those born between 2010 and 2025 – from popular gaming, social media and video platforms.
The pair then asked 24 volunteers aged between 11 and 14, who were Mehta’s classmates, to analyse the phrases alongside context-specific screenshots. The volunteers explained whether they understood each phrase, the context in which it was being used and whether that use carried any potential safety concerns or harmful interpretations. Mehta and Giunchiglia also asked parents, professional moderators and four AI models – GPT-4, Claude, Gemini and Llama 3 – to do the same.
“I’ve always been kind of fascinated by Gen Alpha language, because it’s just so unique, the way things become relevant and lose relevancy so fast, and it’s so rapid,” says Mehta.
Among the Generation Alpha volunteers, 98 per cent understood the basic meaning of the terms, 96 per cent understood the context in which they were used and 92 per cent could detect when they were being deployed to cause harm. But the AI models recognised harmful use in only around 4 in 10 cases – ranging from 32.5 per cent for Llama 3 to 42.3 per cent for Claude. Parents and professional moderators were no better, spotting only around a third of harmful uses.