On April 15, 2026, a 14-year-old student walked into Ayser Çalık Middle School in Kahramanmaraş, Turkey, carrying five firearms and seven magazines. By the time it was over, nine people were dead. One of them was a math teacher named Ayla Kara, who died trying to shield her students.
The attacker, İsa Aras Mersinli, also died. He turned the gun on himself.
In the hours that followed, investigators began working through his digital life. What they found was not a mystery. It was a map, drawn in plain sight, that nobody had thought to read.
A Profile Picture as a Manifesto
Mersinli’s WhatsApp profile picture was a photo of Elliot Rodger.
For those unfamiliar with the name: Rodger was a 22-year-old who, in May 2014, killed six people in Isla Vista, California, before taking his own life. Before the attack, he uploaded videos explaining his motivations and left behind a 137-page document he called his manifesto. In the years since, he has become an icon in online communities built around male grievance, rejection, and the glorification of mass violence. He is arguably the most influential figure in the so-called “incel” subculture that has since been linked to multiple attacks across North America and Europe.
A teenager in southern Turkey had made this man’s face his public-facing identity. He did it voluntarily, and he did it where anyone who knew him could see it.
His computer also contained a document, dated April 11, 2026, four days before the attack, describing what he was planning to do.
The Turkish prosecutor’s office confirmed the attack was premeditated. There was no spontaneity here. There was a timeline, a target, and a model.
The Telegram Layer
The attack in Kahramanmaraş did not happen in isolation. The same week, another school in Şanlıurfa was targeted. And in the immediate aftermath, a Telegram group called “C31K,” with approximately 100,000 members, began circulating messages celebrating the attackers and posting specific schools, dates, and locations as the next targets.
Turkish authorities identified 591 social media accounts spreading what they described as disinformation and provocation. Cybercrime units opened investigations. The group had previously been connected to two separate murder cases in Turkey.
This is a pattern that European security analysts should be paying very close attention to.
What we are seeing is not just copycat violence driven by media coverage. It is something more structured: online communities that actively cultivate the mythology of mass attackers, offer belonging to isolated and radicalized young people, and then, after an attack, use it as recruitment material and operational inspiration for the next one. The digital infrastructure of this ecosystem runs primarily through encrypted messaging platforms, gaming communities, and fringe imageboards, most of which operate with minimal oversight.
The Incel Pipeline and Its European Reach
The incel phenomenon is not a Turkish story. It is not an American one either. Since Rodger’s 2014 attack, the ideological framework he helped define has been cited in mass casualty events in Canada, the United Kingdom, Germany, and Finland. In 2018, a van attack in Toronto killed ten people. In 2021, a man in Plymouth, England killed five. In both cases, investigators found significant engagement with incel forums and, in particular, with content venerating Elliot Rodger specifically.
What connects Kahramanmaraş to Plymouth to Toronto is not geography. It is an online ecosystem that operates without borders and targets young men who feel, for whatever reason, invisible and powerless.
In Mersinli’s case, the behavioral indicators were present. Teachers described him as withdrawn and increasingly isolated. He was reportedly spending large amounts of time online. His social media profile displayed an open tribute to a foreign mass killer. His computer contained a pre-attack document. None of this triggered a formal intervention.
What Monitoring Actually Means
There is a legitimate debate in Europe about the limits of social media surveillance, and it is a debate worth having. The GDPR framework, fundamental rights law, and democratic principles all impose real constraints on how states can monitor private communications. Those constraints exist for good reason.
But what happened in Kahramanmaraş was not a question of encrypted messages or private channels. Mersinli’s profile picture was visible to anyone who had his number saved. His behavioral isolation was observable to teachers and classmates. The warning signs were not hidden. They were unread.
This is where the conversation about monitoring needs to be more precise. The choice is not between mass surveillance and willful blindness. There is a middle space that involves:
Training educators and school counselors to recognize the specific behavioral and digital markers associated with radicalization toward mass violence. A student who idolizes a foreign school shooter is not simply “troubled.” That is a specific and documented warning sign with its own established literature.
Building structured reporting pathways between schools and threat assessment teams. Several European countries, including Germany and the Netherlands, have developed multi-agency behavioral threat assessment programs modeled on the US Secret Service’s work in this area. These programs work when they are actually resourced and integrated into school environments.
Monitoring open-source digital signals without requiring access to private communications. What Mersinli displayed on his profile was open-source. A school or a social worker or a platform flagging system could theoretically have caught it. The question is whether any of those systems were in place or whether anyone was looking.
Engaging platform providers on incel content specifically. The Telegram group that celebrated the Kahramanmaraş attack had 100,000 members and had already been linked to two previous murders. That it was still operating at the time of the attack is a failure of platform governance, not an intelligence gap.
The Copycat Problem Is Structural
One of the more uncomfortable findings in mass violence research is that extensive media coverage of attackers, particularly sympathetic or fascinated coverage that focuses on the attacker’s psychology and background, demonstrably increases the probability of subsequent attacks. The so-called “contagion effect” has been documented in peer-reviewed research for decades.
Online communities have essentially weaponized this effect. They do not just report on attacks. They archive them, analyze them, build personas around the perpetrators, and actively market them as role models to young men who are already vulnerable. Elliot Rodger has been dead for twelve years. He has, in that time, inspired more violence than he carried out himself.
Mersinli did not invent his frame of reference. Someone, or more likely some community, handed it to him. That community is still operating. It has 100,000 members in a single Telegram group in Turkey alone.
What Europe Should Be Doing Differently
The EU’s Digital Services Act, fully in force since 2024, creates obligations for very large online platforms to assess and mitigate systemic risks, including risks to public security. Mass violence glorification communities of 100,000 members on major messaging platforms are, by any reasonable reading, a systemic risk. Whether the DSA’s enforcement mechanisms are being applied to this specific category of content with any seriousness is an open question.
The more practical gap, though, is in threat assessment capacity at the local level. National intelligence services are not positioned to monitor every isolated teenager in every European school. But schools, social services, and local police can be. They need better tools, better training, and better coordination structures to do it.
The UK’s Channel program, Germany’s RADAR-iTE risk assessment instrument, and the Netherlands’ integrated approach to neighborhood-level threat assessment all represent the kind of infrastructure that can catch individuals before they cross a threshold. The precondition is that people in daily contact with at-risk youth know what they are looking for.
A boy who makes Elliot Rodger his profile picture is telling you something. The question is whether anyone in his environment had been taught to hear it.
A Note on What This Is Not
This is not a case where better surveillance technology would have made the difference. It is not a case for reading teenagers’ private messages or expanding state access to encrypted communications. The signals that Mersinli sent were not encrypted. They were in plain view on a platform billions of people use every day.
What failed was human awareness, institutional training, and platform responsibility. Those are solvable problems, and solving them does not require trading privacy for security. It requires investment, coordination, and a clearer-eyed understanding of how radicalization toward mass violence actually works in 2026.
