Social media: time to turn around the weapons

by Manal al-Sharif | Nov 6, 2021 | Business

For all the benefits wrought by the information revolution, social media has become a tool of dictators. Its profit-driven, surveillance-based business model favours those with deep pockets and deadly motives. In the first of two articles, cybersecurity expert and human rights activist Manal al-Sharif suggests ways to reinvent the digital world for the better.

In 2021, two women shaped the public discourse around social media. Frances Haugen, the ‘‘Facebook whistleblower’’, gave the world evidence of what researchers had long been warning about. Then, in the Philippines, Maria Ressa, a journalist who has spent years investigating how politicians weaponise Facebook to undermine democracies, won the Nobel Peace Prize. They and others present an urgent case for reclaiming agency over our digital lives.

According to Freedom House, only one in three of us worldwide live under a democratic political system – meaning that two-thirds of the world are either living under authoritarian conditions or moving towards them. 2020 marked the 15th consecutive year of declining global freedom and rising authoritarianism. 2020 also marked 14 years since the opening of Facebook to the public and 13 years since the launch of the iPhone. Is this a mere coincidence? While the proliferation of smartphones and social media is often lauded for simplifying our access to information, a closer look reveals a more complex picture.

Digitised lives without digital rights

During the Covid-19 lockdowns in 2020 and 2021, the world spent a vast amount of time online. Almost every aspect of our lives was digitised, from the classroom to the doctor’s clinic. Around this time, a close family member of mine announced that dinosaurs were a conspiracy to sell cinema tickets and that flat-earth theories were worthy of consideration. To be clear, this was not a young child but a grown adult engineer who has lived in five countries and speaks four languages. Unfortunately, in the ‘‘fake news’’ era, it cannot be assumed that anyone has the tools to tell truth from lies.

Around that same time, one of my close friends in the US told me she was losing her father to political conspiracy theories and misinformation. “He’s been radicalised,” she said. “I don’t recognise him anymore. He’s been consumed by anger and hate.”

“A lie told a million times becomes a fact”

It is not just my friends and family who are succumbing to falsehoods. Maria Ressa, a Filipino-American journalist, uses the above quote, attributed to Joseph Goebbels, to describe how some politicians are abusing social media in the Philippines, where some 97% of internet users are on Facebook.

Through her investigative journalism, Maria exposed how savvy operators manipulate social media platforms to undermine democracies, harass opponents, spread misinformation, and fake the popularity of politicians. She is best known for her criticism of Facebook, which she has dubbed “the murderer of democracy”.

“Facebook was weaponised, and the first people who were attacked were journalists. I watched democracy crumble in about six months in the Philippines.”

Maria Ressa – Nobel Peace Prize winner

In 2021, Maria was awarded the Nobel Peace Prize in recognition of her efforts to safeguard freedom of expression. As a fellow activist, I consider this an important acknowledgment by the global community of what it costs today to stand against tech-facilitated oppression. You risk sacrificing your reputation, income, friends, and freedom, not to mention your mental health.

The science of a viral post

Research has found that whether content goes viral is partly driven by physiological arousal. Content that evokes high-arousal positive emotions (awe) or negative ones (anger or anxiety) is more likely to go viral. Conversely, content that evokes low-arousal or deactivating emotions (e.g. sadness) is less likely to do so.

As such, a post that provokes our lizard brains to react is far more likely to be shared. When we add in the tendency to hit ‘‘Share’’ before fact-checking, we see the ‘‘social proof’’ phenomenon – the assumption that others’ actions represent the correct response to a given situation – play out before our very eyes. And when we throw in platforms that prioritise growth – of users and of time spent on them – we end up with a recipe for disaster.

Social media platforms have an undeniable conflict of interest because they prioritise growth over all else. Facebook’s engagement-based ranking amplifies misinformation, hate speech, and even incitements to ethnic violence. The machine-learning models that maximise engagement also favour controversy, misinformation, and extremism.

Sock puppets and propaganda wars

Unfortunately, the current profit-driven, surveillance-based business model of social media platforms gives the advantage to those with the deepest pockets and the greatest understanding of how to exploit this nefarious potential.

Facebook knows us better than our mothers do. Of all the data a business can legally collect about us, Facebook collects 80%. Separately but no less worryingly, about one in five US adults say they get their political news primarily through social media.

Although Facebook requires a verified identity – i.e. a real name and email address – to open an account, it has a loophole that can be successfully exploited. Any account can create ‘Facebook pages’ and use those pages as if they were personal profiles. This means they can use those pages to ‘‘like’’, ‘‘share’’ and ‘‘comment’’, expanding the reach of their preferred content. 

As part of her effort to expose how social media has been weaponised, Maria Ressa published a series of articles on her news site Rappler entitled ‘Propaganda War: Weaponizing the Internet’. She and her team analysed thousands of Facebook accounts and concluded that a ‘‘sock puppet’’ network of 26 fake accounts was influencing nearly three million other Philippines-based accounts. Behind the sock puppets were three so-called super trolls. These accounts form the perfect virtual mob: they can spread rumours, ignite hate speech, target opponents, and drown out legitimate content by flooding the network with nonsense trends. Such mobs can also create an illusion of popularity for specific individuals by posting praise, liking, and sharing.

Some say that the Facebook Files revealed nothing new, but this is not true. Previously, independent research warning that Facebook was being used to interfere in political systems or to harm mental health was largely dismissed, and Facebook was able to shut down many of those independent researchers without repercussion. Now, the leaked documents confirm what researchers have known for years. They also expose how Facebook executives sat on internal research that proved the harms of their platform and chose to fix only the parts that could damage their reputation. One of the most shocking revelations was that 87% of Facebook’s misinformation budget was dedicated to English-speaking content, even though English speakers make up merely 9% of its users, leaving countries such as the Philippines, Myanmar and Ethiopia to combat the problem alone.

Today, humanity is facing not only a climate crisis, but a digital crisis about which advocates, activists, and independent researchers have long been sounding the alarm. But, so far, they have mostly been met with doubt and disbelief. Why?

Part of the reason is the normalisation of the “privacy is dead” rhetoric by tech giants. This commodifies the human experience and leaves us feeling that resistance is futile. Tech is both the largest global industry and the least regulated, allowing tech companies to shape our future and our planet with little to no input from citizens and policymakers. With almost all of us now using social media, we are at a critical juncture for asserting our digital freedoms. I know better than most how crucial these freedoms can be.

A digital future in which we are stripped of our agency must be resisted at all costs, or we all risk living in a world very much like the one Maria describes in her own words: “our dystopian present is your dystopian future.”

Check out the Tech4Evil podcast at tech4evil.com.

Manal al-Sharif is an author, speaker, human rights activist and a regular contributor to international media. She has written for Time, The New York Times and The Washington Post. Her Amazon-bestselling memoir, Daring to Drive: A Saudi Woman’s Awakening, is an intimate story of her life growing up in one of the most masculine societies in the world.

Manal is a cybersecurity expert and host of the tech4evil.com podcast, which discusses the intersection of technology and human rights.

