From: mk_thisisit

Journalist Max Fisher, nominated for the Pulitzer Prize for his 2019 investigative reporting on how social media algorithms fuel global chaos, discusses the profound and often fatal impact of social media in his book, “The Chaos Machine” [01:13:26]. He challenges the initial skepticism regarding social media’s capacity for harm, asserting that after years of study he has come to believe that social media can lead to people’s deaths [02:07:06].

Social Media as a Catalyst for Violence and Hatred

Fisher initially viewed technology as a less serious beat than international conflicts [06:58:24]. However, while covering events like the genocide in Myanmar and communal conflicts in various countries, he consistently observed social media fueling racial hatred and driving political polarization [07:06:21].

He likens social media’s impact to that of drugs: it does not kill people directly, the way cigarettes cause cancer, but it leads to death indirectly through the events it sets in motion [01:54:19].

“I started to believe, even though I was initially very skeptical about the idea, that social media can lead to the death of people through what it causes” [02:01:23]

Fisher points to specific instances where a significant number of people died as a result of racist or other violence triggered on social media, which he believes would not have occurred without these platforms [02:49:15].

Examples of Social Media’s Harmful Impact

  • Myanmar Genocide: Facebook’s own researchers reportedly warned about the platform’s role in the genocide in Myanmar [03:58:19]. They received repeated warnings from people on the ground that Facebook was promoting specific conspiracy theories, hate speech, and religious and racial incitement that would not have spread without the platform [04:17:15].
  • Sri Lanka Violence: Violence in Sri Lanka is cited as another case where social media contributed to significant harm [02:44:11].
  • Communal Violence in India: Similar to Myanmar and Sri Lanka, communal violence in India was fueled by social media [02:45:51].

The Role of Algorithms and Human Psychology

Max Fisher explains that social media platforms are designed with algorithms that select what users see, how they see it, and in what order [05:08:18]. These systems are specifically designed to maximize user time on the platforms to generate more revenue [05:16:34].

“Because of how our minds work, how human nature works, in certain contexts the things that are most stimulating to us and cause us to spend the most time on the platform are fear of others, a sense of us versus them, conspiracy theories, hate, conspiracy, fear, a sense of social outrage, collective outrage” [05:28:01]

This design intentionally promotes content that evokes fear, “us vs. them” narratives, conspiracy theories, and social outrage, as these emotions are highly engaging and keep users active [05:39:15].
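To make the mechanism concrete, here is a minimal sketch (not any platform’s actual code; the field names, signals, and weights are hypothetical) of how a ranker optimized purely for engagement can end up favoring outrage-heavy posts simply because they predict more time on the platform:

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_dwell_seconds: float  # model's guess at how long the user will linger
    predicted_interaction_prob: float  # probability the user reacts, comments, or shares

def engagement_score(post: Post) -> float:
    # The objective is purely "time plus interactions" — nothing in it
    # measures accuracy, civility, or downstream harm.
    return post.predicted_dwell_seconds + 30.0 * post.predicted_interaction_prob

def rank_feed(posts: list[Post]) -> list[Post]:
    # Posts that provoke strong reactions (outrage, us-vs-them framing,
    # conspiracy theories) tend to score higher on both signals, so they
    # rise to the top without anyone explicitly choosing them.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Calm local news update", 8.0, 0.02),
    Post("Outrageous claim about 'them'", 45.0, 0.35),
])
print([p.text for p in feed])  # the outrage post ranks first
```

The point of the sketch is that no malicious rule is needed: once the objective is engagement, emotionally charged content wins the ranking by default.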

The News Feed Case Study

The introduction of Facebook’s News Feed in 2006 serves as an early example of how these mechanisms work [10:59:17]. Initially, some users disliked the News Feed due to privacy concerns, leading to the formation of “anti-News Feed” and “anti-Mark Zuckerberg” groups [12:40:24]. When someone joined these groups, it appeared in their friends’ News Feeds [12:53:14]. Because social outrage is a powerful emotion that grabs attention, many people clicked “like” or joined these groups, creating an “illusion of the majority” that everyone shared the same opinion [13:00:23]. This phenomenon led to real-world protests in front of Facebook offices [14:19:35].
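The “illusion of the majority” can be read as a simple feedback loop: every visible act of joining is surfaced to friends, and outrage makes some of those friends join in turn. A toy simulation (all probabilities and numbers are made up for illustration, not drawn from Fisher’s reporting) shows how a small seed group can snowball into something that looks like a consensus within a few rounds:

```python
import random

def simulate_cascade(users: int = 10_000, friends_reached: int = 50,
                     join_prob_when_seen: float = 0.15, seed_joiners: int = 20,
                     rounds: int = 5, rng_seed: int = 42) -> list[int]:
    """Toy model: each joiner's action is shown to a batch of friends, who may join too."""
    rng = random.Random(rng_seed)
    joined = set(range(seed_joiners))
    history = [len(joined)]
    for _ in range(rounds):
        newly_joined = set()
        for _member in list(joined):
            # The feed surfaces the "joined the group" story to some friends.
            for friend in rng.sample(range(users), friends_reached):
                if friend not in joined and rng.random() < join_prob_when_seen:
                    newly_joined.add(friend)
        joined |= newly_joined
        history.append(len(joined))
    return history

print(simulate_cascade())  # membership per round: a tiny seed quickly looks like "everyone"
```

The design choice that matters is the broadcast step: because every join is itself content shown to others, the group’s apparent size compounds, even if most users never held the opinion to begin with.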

Mark Zuckerberg recognized that this outrage drove traffic to the site, leading to News Feed’s permanence and a significant increase in Facebook’s engagement [14:34:03]. This early experience demonstrated how social media could generate massive disinformation and false perceptions of widespread anger by exploiting human psychology [15:14:50].

Corporate Responsibility and Lack of Action

Max Fisher argues that while the creators of these platforms, like Mark Zuckerberg, may not intend to cause harm, their companies bear responsibility for the deaths that occur due to their platforms [03:11:06]. He believes Zuckerberg genuinely thought he was creating something helpful [03:36:34]. However, the people managing these companies had repeated evidence from their own researchers that the platforms trigger behaviors leading to violence and death [03:42:07].

“The fact that Mark Zuckerberg knew about it, had evidence, that people warned him about it, even in specific cases saying that it would kill thousands of people, and he did nothing: that doesn’t make him a murderer, but in my opinion it makes the companies responsible for the deaths of the people that they didn’t save” [06:01:21]

Social Media’s Impact on Elections and Politics

Social media has had a significant impact on presidential elections, although it is impossible to accurately quantify the number of votes these platforms influenced [17:19:15]. The effects are dispersed, subtly shifting individual users’ tendencies towards polarization, conspiracy theories, or other views [17:33:04].

The Cambridge Analytica scandal and the 2016 US presidential election highlighted the role of algorithms [16:52:16]. Studies after the 2016 election showed that platforms like Facebook and Twitter artificially amplified far-right news sources such as Breitbart News, which focused on conspiracy theories and supported Donald Trump [18:04:10]. When the algorithms were changed, Breitbart News’ audience on these platforms dropped significantly, demonstrating that its reach had been artificially inflated by algorithms designed to maximize engagement [18:24:26].

Addiction and the Power of AI

Social media applications are intentionally designed to be difficult to put down [09:33:07]. Companies like Facebook and Google (which owns YouTube) have hired top minds in artificial intelligence and computer programming to make users addicted to their platforms, a goal at which they are highly effective [09:35:05].

Regulation and the Future

In Europe, there is an effort to introduce regulations like the GDPR and new rules on artificial intelligence [21:01:13]. In the United States, however, there is no comparable federal regulation of social media [21:14:48].

Fisher notes a growing consensus that current regulations, often focusing on content moderation, are insufficient [21:38:09]. The problem lies in the inherent design of platforms that artificially promote divisive and hateful content [21:58:10]. The only effective way to regulate, he believes, is to address the design of the systems themselves, particularly the algorithms and content promotion mechanisms [22:08:48].

While social media has exacerbated democratic crises and political polarization, Fisher emphasizes that these trends started before social media became prevalent [29:59:04]. However, social media undeniably accelerates and worsens these issues by shaping public perception and identity [30:08:24].