Recently, I was watching a short video on Facebook, but something sinister happened after the video ended. The next video on autoplay was a bizarre conspiracy video against some famous mainstream Muslim scholars and speakers, with a number of fallacies carefully mixed in with just enough facts to sneak under the radar of a casual viewer.
I immediately knew what was going on—the “watch next” algorithms were dangling different types of bait to see what I would “bite”. I confess, due to that rubbernecking instinct within all of us (staring at a horrible accident scene whilst knowing you shouldn’t), I was hooked for the first few minutes before I snapped out of it.
Scandals, lies, gossip, conspiracies, claims of “hidden knowledge”, melodrama, controversy… these are just a few of the things that prey on some of the most primal instincts of the nafs. An entire economy exists not only to exploit these instincts but to automate that exploitation at a level of throughput never before imaginable.
This economy is building a new type of Islam, too, with its own madhhab. Or, to be more precise, a billion different madhāhib perfectly tailored to aspects of our personalities that even we were not aware we had. To appreciate why this is different to anything we have ever experienced in Islamic history (and indeed human history), we need to look at a few things.
A new economic order
Human economies have gone through many phases, from utilising animal and human labour, to fossil fuels, and to computing power, all with various underlying logics. Perhaps the most comprehensive exposition of today’s reigning economic logic is what some thinkers like Shoshana Zuboff call surveillance capitalism. In her magnum opus, The Age of Surveillance Capitalism, Zuboff defines this as, among other things:
“A new economic order that claims human experience as free raw material for hidden commercial practices of extraction, prediction and sales.”
One thing this means for us browsing the web and scrolling our endless newsfeeds is that the primary resource of today’s economy is not necessarily information (of which there is no scarcity), but our attention. That is the scarce resource that companies are competing for.
So what? Hasn’t this always been the case?
No doubt, when the radio was invented, people probably complained that it was having an adverse impact on people, families and society at large. When the TV was invented, some probably decried the fact that no one was listening to the radio anymore. The same goes for telephones, cars, and pretty much every new invention. Today’s Internet-enabled devices are only the latest in a long line of such inventions, right?
Wrong. I’m no Luddite; as I type this on my state-of-the-art mechanical-switch Bluetooth keyboard I literally have half a dozen “smart” devices on my desk. But there is one very important difference we need to realise between smart devices today and inventions of the past.
1. They have one job
Behind every successful app, game, social network, and online service are teams of attention engineers who draw on centuries of accumulated wisdom in psychology, neuroscience, and persuasive technology to do one thing: capture an ever bigger slice of your attention, and with it the one metric that matters, time spent on site.
So what? Haven’t lots of people, from advertisers to proselytisers, been competing for our attention forever? The unique thing about today’s attention engineers is that they are unbelievably good at it. This is because they have more insight into your personality—what makes you tick and click—than almost anyone else in history.
2. Vast psychological questionnaire
Psychologist and data scientist Michal Kosinski highlighted this in a very simple experiment utilising only one piece of publicly available information: people’s Facebook likes. It is quite unremarkable that knowing what someone “likes” on Facebook will give you some insight into their personality. However, the results of Kosinski’s experiment were shocking.
Kosinski’s team found that with knowledge of only 10 Facebook likes (which were publicly available), they could predict aspects of an individual’s personality better than that individual’s average co-worker. With knowledge of 150 of those likes, they could predict their personality better than their own mother. And with 300 likes, better than their spouse! Kosinski concluded:
“Our smartphone is a vast psychological questionnaire that we are constantly filling out, both consciously and unconsciously.”
It is important to note that Kosinski’s team barely scratched the surface—this was just publicly available information about you and me. Facebook itself gathers far more than that: what you like, what you dislike, what you watch, how fast you scroll past certain things, the people you know, your location information, your purchasing habits, the contents of messages and comments between your contacts, and much more.
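To make the basic idea concrete, here is a toy sketch in Python. This is emphatically not Kosinski’s actual model; the likes, the “extravert” labels, and the simple per-like scoring below are all invented for illustration of how a handful of likes can carry a surprising personality signal.

```python
# Toy sketch (NOT Kosinski's actual method): predicting a binary
# personality trait from Facebook-style "likes" by scoring each like
# against labelled examples. All data below is invented.

from collections import defaultdict

def fit_like_scores(users):
    """users: list of (set_of_likes, trait) pairs, trait in {0, 1}.
    Returns, per like, the fraction of its likers who have trait == 1."""
    counts = defaultdict(lambda: [0, 0])  # like -> [total likers, trait==1 likers]
    for likes, trait in users:
        for like in likes:
            counts[like][0] += 1
            counts[like][1] += trait
    return {like: t1 / total for like, (total, t1) in counts.items()}

def predict(scores, likes):
    """Average the scores of a user's known likes; 0.5 if none are known."""
    known = [scores[l] for l in likes if l in scores]
    return sum(known) / len(known) if known else 0.5

# Invented training data: trait 1 = "extravert" in this toy example.
training = [
    ({"parties", "festivals", "stand-up"}, 1),
    ({"parties", "travel"}, 1),
    ({"chess", "libraries", "poetry"}, 0),
    ({"chess", "documentaries"}, 0),
]
scores = fit_like_scores(training)
print(predict(scores, {"parties", "chess"}))  # a mixed profile scores 0.5
```

Real models use hundreds of likes and far richer statistics, but the principle is the same: each like nudges the estimate, and with enough of them the estimate becomes eerily accurate.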
3. Humans need not apply
The icing on the this-is-like-nothing-else-in-history cake is that this entire process is now run almost independently by artificial intelligence systems, with machine learning algorithms that teach themselves to get ever better at one thing: increasing time spent on site.
What kind of world does this create? Companies compete for our attention and are willing to do whatever it takes to get more of it. Those who set up these services may have had good intentions, but those responsible for serving us content on mainstream online services and social networks are no longer human beings restricted by certain values, ethics, and human empathy. They are instead automated algorithms. These systems have access to information about our most intimate personality details that we ourselves probably are not aware of, and these systems are geared towards making money out of you—the product—by teaching themselves to get better over time.
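As a rough illustration (not YouTube’s or Facebook’s actual system), here is a minimal Python sketch of a recommender whose only objective is watch time, in the style of a greedy multi-armed bandit. The content categories and viewing times are invented; the point is what is absent from the objective.

```python
# Minimal sketch of an engagement-maximising recommender: a greedy
# "bandit" that learns which category of video keeps a viewer watching
# longest. Categories and watch times below are invented.

class WatchTimeMaximiser:
    def __init__(self, categories):
        self.totals = {c: 0.0 for c in categories}  # accumulated watch seconds
        self.plays = {c: 0 for c in categories}     # times each was served

    def recommend(self):
        # Try each category once, then exploit the best average so far.
        for category, n in self.plays.items():
            if n == 0:
                return category
        return max(self.totals, key=lambda c: self.totals[c] / self.plays[c])

    def observe(self, category, watch_seconds):
        # The only feedback signal is time spent watching. Truth, benefit,
        # and long-term wellbeing never enter the objective.
        self.plays[category] += 1
        self.totals[category] += watch_seconds

rec = WatchTimeMaximiser(["lecture", "controversy", "scandal"])
# Simulated sessions: the viewer lingers longest on outrage content.
for category, seconds in [("lecture", 60), ("controversy", 300), ("scandal", 240)]:
    rec.observe(category, seconds)
print(rec.recommend())  # the system converges on "controversy"
```

Notice that nothing in the code knows or cares what a “lecture” or a “scandal” is; it simply learns that one of them holds attention longer and serves more of it.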
We have written about some of the impact this is having on our mental health and particularly that of the younger generation. However, to appreciate how this is fundamentally changing how Islam is understood and spoken about—hence a “new madhhab”—two more experiments provide a great illustration.
1: “How many refugees committed crimes in Europe?”
Filmmaker Max Stossel asked two of his friends this question. Each of them did what most of us would probably do—they Googled it. Despite both having the same input, they got radically different outputs from Google. One friend got a mixture of reputable news outlets and surveys stating that refugees pose no specific threat to them statistically. The other friend got a radically different set of results, ignoring statistical significance and over-representing the threat that refugees pose. Same input, different output. Stossel remarked:
“These people are essentially living in two different worlds. They have different versions of what reality is.”
When a Muslim Googles something about Islam, it’s no surprise that their previous background and school of thought play a part in what results they get. If anything, one might find the inherent personalisation in these search engines useful in that regard. However, there is a more sinister consequence than this algorithmically controlled space simply reinforcing your previously held views.
2: “You are never extreme enough for YouTube”
The second experiment is what the technosociologist Zeynep Tufekci discovered about YouTube’s highly successful proprietary algorithm for its “watch next” feature, which has kept even the most disciplined of us glued to the service for hours on end at some point in our lives. During the 2016 US presidential election campaign, Tufekci searched for a Donald Trump video on a blank machine and let the Autoplay feature run to its heart’s content. The first video was followed by, unsurprisingly, another Trump video, then another, then another. Tufekci noticed, however, that the videos became progressively more right wing and more extreme, until only a few videos later she found herself watching full-blown far-right fascist videos.
She tried the same with Hillary Clinton on a blank machine. The first video was followed by another on Clinton, then one on Bernie Sanders, then another. The videos became progressively more left wing, until she ended up in absurd left-wing conspiracy theory videos. She even mentioned that after watching a few videos about vegetarianism, she was offered videos on veganism! She remarked:
“It’s almost as though you’re never extreme enough for YouTube’s algorithm.”
Tufekci found that YouTube’s famous recommendation engine will not simply serve you content that you like, but more specifically content that you will watch.
What’s the difference? We will watch things that we don’t like, which we in fact hate. Things that make us angry tend to increase that one desired metric, “time spent on site”, like nothing else. We will watch things that push our personal boundaries and make us explore things for no logical reason other than that they are new and shocking, or give us a sense of having access to some special “insider” knowledge that gives us some kind of social capital to feel superior to others.
If any of this sounds familiar, it is because, in short, these algorithms have “learnt” how to exploit every weakness of the lower primal nafs that we have been warned against in our dīn.
Algorithmic Islam – the “new madhhab”
Ask yourself what your most memorable recent experience of “Islamic” content or conversations online was. Chances are it was not the content most beneficial to your character, your worship, or the purification of your nafs, nor the most fruitful for the Hereafter. Instead, it might have been something shocking, controversial, or downright infuriating.
It was likely not the same discourse and emphasis that Allāh and His Messenger (sall Allāhu ‘alayhi wa sallam) gave us, but a narrow, algorithmically curated selection of Islamic topics that happened to go viral. These are often topics of debate and refutation, or the small differences that distinguish “our group” from “their group”, rather than the overwhelming body of Islamic character, rulings, and spiritual growth that is common to all Muslims. If we are slightly more fortunate (or careful), it may have been content that gives us an instant “Īmān boost” but does not necessarily direct our attention towards a structured programme of study, spiritual growth, and personal development.
It could have been refutations against person X, or gossip about the alleged sins of person Y, or the deviant person Z who disagrees with your opinion about X and Y. Or it could have been those “juicy” Islamic topics that we are disproportionately shown and more likely to click on rather than the “less exciting” topics pertaining to long-term hard work that doesn’t give us an instant dopamine hit. These things add up and end up—if we are not careful—replacing Islam’s own discourse concerning where to direct our attention and the manners involved in understanding that discourse.
YOU have a duty
All of this is not to say that the people involved in these industries are evil, from the attention engineers and social media magnates to the conspiracy theorists and millenarian vloggers who may genuinely have a messiah complex about saving the Muslims from Dajjālic plots. It could simply be a perfect storm: all the “right” nafs problems combined with sophisticated technology and automation within our new economic logic.
We each have a duty to be very careful, because we are not simply consumers of content; we are also actively involved in promoting anything we give our attention to. Anything you watch in algorithmically controlled spaces is more data to feed back into the automated system. If you ignore something that is important but not “hot” enough, the algorithms will show it to fewer and fewer people. If you give your attention to harmful and controversial content, the algorithms will show it to more and more people; you are helping it go viral with every angry comment, view, and refutation.
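This feedback loop can be sketched in a few lines of Python. The 10% boost and decay rates, and the two item names, are invented for illustration; real ranking systems are vastly more complex, but the rich-get-richer dynamic is the same.

```python
# Sketch of the engagement feedback loop: items that get views (even
# angry ones) are boosted; ignored items sink. The 10% boost and decay
# rates and the item names are invented for illustration.

def step(exposure, engaged):
    """One ranking round: items engaged with are boosted, the rest decay."""
    return {
        item: weight * (1.10 if item in engaged else 0.90)
        for item, weight in exposure.items()
    }

# Both items start with equal exposure.
exposure = {"scandal clip": 1.0, "tafsir series": 1.0}

# Ten rounds in which viewers click only the scandal clip,
# even if just to refute it in the comments.
for _ in range(10):
    exposure = step(exposure, engaged={"scandal clip"})

print(exposure["scandal clip"] > exposure["tafsir series"])  # prints True
```

After ten rounds the scandal clip has roughly 2.6 times its original exposure while the beneficial series has withered to about a third, with no human having decided anything; every click, including the outraged ones, voted for more of the same.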
Be careful of anything that the algorithms serve up to you (or even the content you actively go looking for yourself). Question why you are fascinated by it. Is it good for your long-term benefit and development, or is it stirring up hatred, outrage, or scandal in you?
Be particularly cautious of anything edging you anywhere near those limits sanctified by Allāh: the blood and reputation of the Muslims. Alhamdulillāh, most of us are not in a position where going down this route will lead us to shed blood, unlike rival groups in war zones, for example, that sadly turn on each other now and then. However, all of us are in a position where the honour and reputation of Muslims could become cheap. You would therefore do well to steer clear of anything taking shots at Muslims by name, even when it comes wrapped in a sophisticated apologia about refuting “deviants”. You do not want to be responsible for having that inviolability on your neck on the Day of Resurrection. It is not worth the risk.
Always remember that the algorithms serving you that content to keep you hooked care only about increasing “time spent on site” in the immediate future. The algorithm does not care about truth per se, nor about your long-term success in the dunya and ākhira. Make sure you commit to the “boring” and “difficult” yet important topics and development programmes chosen by you yourself or by those who give you tarbiya, not by automated algorithms. This is how we can collectively preserve the Islam that was handed down to us by human beings from the Best human (sall Allāhu ‘alayhi wa sallam), and prevent these artificial intelligence systems from creating competing versions of Islam to sell ads.
Use the technology at your fingertips, but don’t let it use you.
Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (2019).
Salman studied Biochemistry at Imperial College London followed by a PhD in Chemical Biology, carrying out research into photosynthesis. During his years at university he became involved in Islamic society da’wah and activism, and general Muslim community projects. He is the Chief Editor and a regular contributor at Islam21c, and also has a blog on the Huffington Post.