Another way social media changed the world
Proud Boys, Antifa, QAnon, ISIS, incels: the list of extremist communities is long and still growing. These groups actively leverage social media to radicalize, recruit, and organize believers. It has never been easier to make an extremist.
How does this happen? After all, social media is just a set of ostensibly neutral communications platforms, widely lauded for the good they do. I set out to answer the question and found that, not surprisingly, the answer is complicated.
Structure to keep us clicking
While there are many paths to radicalization, social media bears the blame for many of them, partly because of its structure.
The first structural issue is flow, which is built into how these applications work. YouTube's associative linking, for example, is designed to promote addictive behavior and keep us clicking and watching. That repetitive cycle of clicking, viewing, and clicking again is ideal for indoctrination and the spread of misinformation.
Then there are the algorithms. On platforms like Facebook, Twitter, YouTube, and even Amazon, algorithms determine what we see and what we don't, amplifying and spreading our existing biases through the content they choose to deliver to our screens.
These algorithms often power recommendation engines built on collaborative filtering. A recommendation engine compares our profile with the profiles of people like us and fills in the gaps. Through this process, the biases we already hold, and the biases of similar people, shape which content appears on our screens.
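The "fill in the gaps" step of collaborative filtering can be sketched in a few lines. This is a minimal illustration with made-up users, content labels, and engagement scores (every name and number below is hypothetical), not how any real platform implements it:

```python
# Minimal sketch of user-based collaborative filtering.
# All profiles and scores are invented for illustration.
import math

# Each user's engagement with content topics (1.0 = heavy engagement).
profiles = {
    "alice": {"politics_a": 1.0, "politics_b": 0.9, "sports": 0.1},
    "bob":   {"politics_a": 0.9, "politics_b": 0.8, "conspiracy": 0.7},
    "carol": {"sports": 1.0, "cooking": 0.8},
}

def similarity(u, v):
    """Cosine-style similarity; dot product taken over shared items."""
    shared = set(u) & set(v)
    if not shared:
        return 0.0
    dot = sum(u[i] * v[i] for i in shared)
    norm_u = math.sqrt(sum(x * x for x in u.values()))
    norm_v = math.sqrt(sum(x * x for x in v.values()))
    return dot / (norm_u * norm_v)

def recommend(user, profiles):
    """Rank unseen items by the engagement of the most similar users."""
    me = profiles[user]
    scores = {}
    for other, theirs in profiles.items():
        if other == user:
            continue
        sim = similarity(me, theirs)
        for item, score in theirs.items():
            if item not in me:  # "fill in the gaps" in my profile
                scores[item] = scores.get(item, 0.0) + sim * score
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("alice", profiles))  # → ['conspiracy', 'cooking']
```

Note what happens: alice has never engaged with conspiracy content, but because her profile closely resembles bob's, his conspiracy engagement is projected onto her feed ahead of anything else. That is the gap-filling mechanism working exactly as designed.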
If algorithmic recommendation brings us content that reflects how we already think, we are not only more likely to engage with that view, we are more likely to spread it. Kris Shaffer explains it best in a TPM article:
This sharing optimization compounds the filter bubble effect. Because it is easier to find information that reflects my existing biases and easier to share it, my contributions to others’ social feeds will reflect my biases even more than if I only shared content that I found elsewhere on the internet. And, of course, the same is true for their contributions to my feed. This creates a feedback loop of bias amplification:
Left unchecked, this feedback loop will continue to amplify the biases already present among users, and the process will accelerate the more people find their news via social media feeds and the more targeted the algorithm becomes. And given the way that phenomena like clickbait can dominate our attention, not only will the things that reflect our own bias propagate faster in an algorithmically driven content stream, but so will content engineered to manipulate our attention. Put together, clickbait that confirms our preexisting biases should propagate at disproportionally high speeds. (Shaffer, 2019, para. 9)
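The feedback loop Shaffer describes can be sketched as a toy simulation: a feed repeatedly serves the post closest to a user's current leaning, and engagement nudges the leaning a little further in its own direction. Every parameter here (step sizes, number of posts, the 20 percent pull) is an illustrative assumption, not a measurement of any real platform:

```python
# Toy model of the bias-amplification feedback loop.
# All step sizes and counts are illustrative assumptions.
import random

def run_feed(opinion=0.1, rounds=50, pull=0.2, seed=42):
    """Return an opinion on a -1..1 spectrum after `rounds` of feedback."""
    rng = random.Random(seed)
    for _ in range(rounds):
        # Twenty candidate posts spread across the opinion spectrum.
        posts = [rng.uniform(-1.0, 1.0) for _ in range(20)]
        # The "algorithm" serves whichever post sits closest to my view.
        served = min(posts, key=lambda p: abs(p - opinion))
        # Engaging pulls my opinion toward the served post, plus a small
        # drift further in my existing direction (confirmation bias).
        opinion += pull * (served - opinion)
        opinion += 0.02 if opinion > 0 else -0.02
        opinion = max(-1.0, min(1.0, opinion))
    return opinion

print(run_feed(rounds=5))   # a few rounds: still a mild leaning
print(run_feed(rounds=50))  # many rounds: markedly more extreme
```

Because the feed only ever serves content near the user's current position, the zero-mean noise of the posts never corrects the drift; a mild initial leaning ratchets steadily toward the extreme, which is the compounding Shaffer's quote describes.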
Added to this mix is the knowledge that discussion among like-minded people radicalizes their average opinion. While it has always been so, social media intensifies the process because groups are larger, with more sources of information and more chances of encountering extremists. It is easier for the discontented to seek out the like-minded and for radical recruiters to find them.
We also know that content that arouses emotion increases viewer engagement, and anger is especially powerful. In the unlikely event that the algorithms show the viewer opposing content at all, anger makes it harder to listen to the other side.
There have always been myriad elements in the making of a radical. However, social media has accelerated the process and now reaches larger numbers than ever before. Even after examining only a few contributing factors, it is easy to see why, through social media, it has never been easier to produce extremists.
So, what do we do? Facebook, Twitter and many other social media platforms are now monitoring content and shutting down accounts. They have added advisories to questionable content. Is this effective? Not yet.
There is no single answer to the question of extremism. Social media companies can police content, but we also need to educate users in analytical viewing: Where did this information come from? What facts back it up? Is it an opinion piece? In particular, we need to teach the young to be critical consumers of social media.
Unfortunately, I believe that neither of these actions is enough to mitigate the problem of radicalization. Are there other solutions I haven't considered, or is extremism simply a human condition?
Facebook: The making of a radical. Another way social media changed the world. Check out my blog: https://bit.ly/32UdSn9
Twitter: How does social media make a radical? Check out my blog: https://bit.ly/32UdSn9
Altman, G. Interaction Social Media [Photograph]. Pixabay.
Bolter, J. D. (2019, May 19). Social Media Are Ruining Political Discourse. The Atlantic. https://www.theatlantic.com/technology/archive/2019/05/why-social-media-ruining-political-discourse/589108/
Cohen, S. (2020, July 3). QAnon Is Disrupting America — Why Every Business Leader Should Be Concerned. Forbes. https://www.forbes.com/sites/sethcohen/2020/07/03/qanon-is-disrupting-america/#5c71d7a534a1
Luckert, S. (2018, January 26). Extremists Are Thriving On Social Media. How Should We Respond? HuffPost. https://www.huffpost.com/entry/extremists-are-thriving-o_b_14390260
Moskalenko, S. (2018, July 6). Why Social Media Makes Us Angrier—and More Extreme. Psychology Today. https://www.psychologytoday.com/us/blog/friction/201807/why-social-media-makes-us-angrier-and-more-extreme
Moskalenko, S. (2018, December 5). Mass Radicalization in the USA. Psychology Today. https://www.psychologytoday.com/us/blog/friction/201812/mass-radicalization-in-the-usa
Open Clip Art Vectors. Anti-fascism-Fight-Hate [Image]. Pixabay.
Shaffer, K. (2019, August 26). How Algorithms Amplify Our Own Biases And Shape What We See Online. TPM. https://talkingpointsmemo.com/cafe/algorithms-bias-internet