When Penelope Sokolowski died last year at the age of 16, her father believed at first that her death was the tragic result of private struggles he had not fully understood in time. But as he began piecing together her digital life after her passing, he says a far more disturbing picture emerged—one he believes points to prolonged online grooming that began years earlier on Roblox, a platform used by millions of children worldwide.
Penelope’s father, Jason Sokolowski, now speaks publicly not only as a grieving parent, but as someone who believes systemic failures in online safety allowed his daughter to be targeted, manipulated, and ultimately harmed.
Penelope died in February 2025. She was Jason’s only child.
In interviews, Jason describes a girl who was creative, sensitive, and deeply artistic, with a love for drawing, animation, and online games that allowed her to express herself. Like many children her age, she gravitated toward virtual worlds long before adolescence.
Penelope first joined Roblox when she was around seven or eight years old. At the time, Jason says, the platform appeared harmless.
Roblox presents itself as a universe of user-created games and social spaces, where players roam freely, customize avatars, and interact with others through chat. To many parents, it resembles a colorful digital playground rather than a social network with adult users.
Jason recalls sitting beside his daughter while she played, asking questions, and watching her draw anime-style artwork for friends she had met in-game.
He believed he was being attentive. He believed he was doing what responsible parents were supposed to do.
“We kind of thought we were covering all the bases,” he said, explaining that the family also used a third-party monitoring app to track Penelope’s online activity.
What Jason says he did not understand at the time was how Roblox could serve as a gateway rather than the final destination.
According to Jason, his daughter was eventually contacted by someone he believes to be a predator who initiated contact on Roblox and later moved conversations to Discord, where private, unmonitored chats can take place for hours at a time.
Jason says the relationship spanned nearly two years.
During that time, Penelope’s behavior slowly changed.
When she was around 13, Jason separated from Penelope’s mother and moved out of the family home in Vancouver. He believes the emotional disruption may have made his daughter more vulnerable, though he emphasizes he does not see this as the cause.
By age 14, Penelope’s grades began to fall. Jason noticed she was withdrawing socially and becoming increasingly anxious.
Then he noticed the scars.
Penelope had been harming herself, covering the marks on her arms with bracelets and oversized clothing. When Jason confronted her, she told him she had been drawn into a self-harm group she encountered online through Roblox.
She assured him she had left it behind.
Jason wanted to believe her.
He says he sought help, tried to support her emotionally, and believed they were moving forward.
But less than two years later, Penelope took her own life.
After her death, Jason says he opened her phone, hoping to find answers that might help him understand what he missed.
Instead, he says he found what felt like evidence of a prolonged psychological assault.
According to Jason, the phone contained messages, images, and videos exchanged between Penelope and an online contact who encouraged her to hurt herself and reinforced feelings of worthlessness and dependence.
Jason alleges that the individual coerced Penelope into acts of self-harm and demanded proof in the form of images and videos.
He believes the contact was part of a group known as 764, which the FBI has previously described as an extremist online network that targets minors through grooming, coercion, and manipulation.
According to federal authorities, members of 764 and similar networks seek out children on mainstream platforms, then escalate control through psychological pressure, fear, and isolation.
Jason believes his daughter was one of those targets.
“They are grooming girls to do whatever it is they can get a girl to do,” he said. “They break them down over time.”
He describes the process not as a single interaction, but as a gradual erosion of autonomy and self-worth.
In Jason’s account, Roblox was not where the worst of the abuse took place, but it was where it began.
Experts say this pattern is common.
According to attorneys representing families in lawsuits against Roblox, initial contact often takes place on child-friendly platforms where predators can blend in, before conversations move to encrypted or less-regulated messaging apps.
At least 45 percent of Roblox users are under the age of 13, based on company data from 2023, making it one of the largest online spaces where children and adults coexist.
Over the past two years, dozens of families across the United States and Canada have come forward with similar claims.
More than 100 lawsuits have now been filed against Roblox, many of which have been consolidated into a single federal case in the Northern District of California.
The first hearing in that case took place on January 31.
Attorney Matthew Dolman, who represents multiple plaintiff families, argues that Roblox’s design makes it too easy for adults to communicate with minors without meaningful safeguards.
“Nowhere else in society do we accept unrelated adults speaking freely with children,” Dolman said. “But that’s exactly what happens on these platforms.”
According to Dolman, many cases begin with predators offering children Robux, Roblox’s in-game currency, as a form of grooming and control.
Once trust is established, communication often moves off-platform.
In at least 51 lawsuits, Discord is named as a co-defendant. Snapchat and Meta have also been cited in multiple cases.
Dolman says the alleged abuse spans a wide range of behaviors, from coercion and exploitation to threats and physical harm.
“There is no making these kids whole again,” he said.
Roblox has denied wrongdoing but acknowledges the seriousness of the allegations.
In late 2025, the company rolled out new safety measures, including AI-based age estimation for users who engage in communication features, as well as tighter restrictions on chat and content sharing for younger users.
Roblox says it does not allow user-to-user image sharing and employs filters to block personal information.
In a statement, the company said it is “deeply troubled by any incident that endangers any user” and emphasized ongoing efforts to improve safety.
Critics argue those changes came too late.
Jason Sokolowski remains unconvinced.
He believes platforms like Roblox prioritize growth and engagement over child safety, and that moderation tools cannot replace proactive protection.
A year after his daughter’s death, Jason says he is still grappling with grief—but he has also found purpose in speaking out.
He believes that if parents understood how these platforms operate, they would demand stricter controls or limit access altogether.
“Parents think they’re safe because the game looks innocent,” he said. “But predators know exactly where to go.”
Jason now calls for stronger regulation of online platforms used by children, including age verification, limited cross-age communication, and greater accountability when harm occurs.
He also urges parents to treat online spaces with the same caution they would physical ones.
“If you wouldn’t leave your child alone in a room with strangers,” he said, “you shouldn’t assume it’s safe online either.”
Mental health experts say cases like Penelope’s highlight the intersection of adolescent vulnerability, digital exposure, and inadequate safeguards.
They stress that while online platforms can offer creativity and connection, they can also amplify harm when oversight fails.
For Jason, the focus is no longer on blame alone, but on prevention.
He says he cannot change what happened to his daughter—but he hopes her story can help protect others.
“This didn’t happen overnight,” he said. “It happened slowly, quietly, and online.”
If you or someone you know is struggling with thoughts of self-harm, professional help is available. In the U.S., support is available through the Suicide & Crisis Lifeline by calling or texting 988. In the UK & Ireland, Samaritans are available at 116 123. In Canada, call or text 988.