
Radicalization: How “X” Potentially Normalizes Violent Extremism

  • Mar 2

Scrolling through “X”, you get the impression that the only kind of entertainment being pushed lately is video content that grows progressively more extreme and violent. The range of content and themes keeps narrowing: the algorithm decides what you might like in the next video, creating a chain reaction of content inspired by the first one you watched.


Thus, after a few minutes, you find yourself drawn into a series of increasingly radical content, which initially seems reasonable: critical of public figures you may dislike, or opposed to certain economic policies adopted by government institutions. In reality, this is a process driven in part by changes in the architecture of the platforms themselves.


The case of “X”, formerly known as Twitter, is emblematic because what sets the platform apart is the speed, scale, and perceived legitimacy that extreme videos and disinformation can acquire in a very short time. The architecture of the platform systematically rewards content that promotes conflict and ideological escalation, resulting in a normalization of violence and distrust towards mainstream institutions and media. Radicalization is no longer a side effect of digital media; it is a well-established feature of the system.


In the case of X, the change of ownership, carried out in the name of freedom of expression, produced a major side effect: users are exposed to increasingly radical and extreme content, turning digital platforms into echo chambers of hate. Because “X” by default favors the polarization of opinions and content, people are coming to normalize a certain type of language and behavior, which increasingly reverberates in the streets: in harassment campaigns, threats to public officials, and acts of political violence. A far-right neo-Nazi march, Irish men violently bullying a Muslim man at prayer, a U.S. teenager vandalizing a Jewish religious site: these actions are filmed and recirculated on social media, and the algorithmic amplifier does the rest. Within this online parallel universe, the benchmarks for what is deemed “normal” or “mainstream” are shifting ever faster, and the consequences are serious, especially for digitally illiterate adults over 45.


As part of the SMIDGE project (Social Media Narratives - Addressing Extremism in the Middle Age), I was involved in developing a database cataloguing the main characteristics of existing videos that promote extremist narratives online. Confronted exclusively with this type of content, I went through a potentially harmful experience: I exposed myself to over 50 videos a day on X and other digital platforms to collect content expressing extremist narratives related to the far right, religious radicalization, anti-vax sentiment, and conspiracy theories. The experience led me to delete my X profile, as by that point the algorithm was offering me nothing but hateful videos and content expressing strong opposition to institutions, as well as to certain social, religious, and ethnic communities.


Many scholars speak of mutual shaping: the mutual influence between society and the algorithm, with the latter acting as an amplifier of opinions already circulating in society. During the data collection, I often wondered why a certain video would be classified as extremist at all, so thoroughly had I internalized and normalized exposure to such content. My strong foundation in digital literacy, however, allowed me to analyze it critically and minimize its impact on me.


Unfortunately, this is not simply a problem of fake news. Disinformation is rarely the end goal; more often, it serves as a way to introduce narratives that erode trust in institutions, delegitimize democratic processes, and oversimplify complex social realities, framing them in the classic “Us vs. Them” dichotomy. The question is not whether platforms like “X” determine social reality. They do. The most pressing question is whether companies or actors who hold the instruments to change this reality are going to counter this trajectory of normalizing the absurd.


Without a change in the policies and regulation of the system, radicalization will only deepen social divisions and political, social, and ethnic polarization, with the serious risk of irreversible damage to democratic systems as we have always known them. When a dangerous narrative becomes socially entrenched, whether far-right, anti-immigration, anti-Islamic, or anti-Semitic, corrective information will struggle to gain traction. Algorithmic transparency, consistent enforcement of rules, and investment in moderation are not the ultimate solution, but they are important interventions against an urgent problem.

Asllan Zenunaj is a Research Fellow at KCSS. He holds a Master’s in International Relations and European Studies, with a thesis on Kosovo’s security and NATO’s KFOR mission. Interested in security and Kosovo’s EU integration, Asllan joined KCSS through the Erasmus for Young Entrepreneurs programme to gain entrepreneurial skills. Over six months, he contributed to KCSS's programmatic and research activities, including the development of the SMIDGE project’s database featuring key characteristics of online videos associated with extremism (SMIDGE Database).



Grant Agreement Number 101095290

Funded by the European Union. Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union or European Research Executive Agency (REA). Neither the European Union nor the granting authority can be held responsible for them.

UK participant in Horizon Europe Project SMIDGE is supported by UKRI grant numbers 10056282 (De Montfort University).


© 2026 Smidge Project
