From: jimruttshow8596

Daniel Schmachtenberger defines sense-making as the basis of large-scale or collective choice-making, and its importance is tied to the ability to communicate effectively [02:00:00]. Humanity’s current inability to coordinate on issues like climate change or wealth inequality stems from a “peak bad sense-making” environment [01:57:00]. This deficiency in shared understanding across consequential issues leads to widespread disagreement, civil tension, and societal breakdown [03:09:00].

Evolution of the Communications Ecosystem

Historically, communication was characterized by a small number of broadcast channels, such as the three major TV networks in the United States, from which a large share of the population received its news [06:47:00]. While these traditional media outlets still involved manipulation, they provided a shared basis of information, giving people common ground for agreement or disagreement regardless of distortion [10:21:00].

The advent of the internet and social technologies transformed this landscape. Initially, the internet was seen as a tool for democracy, removing the monopoly of broadcast media and allowing the “best ideas” to emerge [12:13:00]. However, this proved to be a “naive hopeful thing” [12:28:00]: the resulting explosion of content made algorithmic curation central [12:34:00].

A significant phase change in the economics of online information occurred around 2004-2005 [17:41:00]. Prior to this, most quality online information was paid for, aligning the interests of service providers with users by delivering value efficiently and getting them offline quickly [18:00:00]. With the rise of platforms funded solely by advertising, the business model shifted to maximizing user engagement and time on site [18:27:00].

The Role of Algorithms and Psychographic Models

Modern social media platforms like Facebook and YouTube employ powerful machine learning algorithms to curate content [12:43:00]. These algorithms build detailed “psychographic models” of individual users by tracking clicks, shares, and hover times, and can predict what a user will engage with better than the user’s own spouse could [13:14:00]. The primary goal is to maximize “time on site,” which is often achieved by appealing to emotional triggers and cognitive biases [13:55:00].

This process leads to:

  • Increased Bias: Content curation reinforces what users already know, doubling down on existing biases [13:33:00].
  • Emotional Hijacking: Information that evokes fear or anger is prioritized, as it naturally commands more attention for “evolutionary protective reasons” [14:42:00].
  • Fractured Narratives: Users are exposed to increasingly fragmented “narrative camps” with less shared understanding [15:09:00]. This results in individuals becoming “more certain and more outraged while simultaneously being more wrong” [15:18:00].
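
To make the curation loop concrete, here is a minimal, purely illustrative sketch of an engagement-maximizing ranker. The Item and UserModel structures, the feature names, and the weights are assumptions made for this example, not a description of any platform’s actual system; the point is only that an objective of “rank by predicted engagement” mechanically favors emotionally arousing, bias-confirming content.

```python
# Illustrative sketch only: a toy "maximize time on site" feed ranker.
# All structures, feature names, and weights are hypothetical.
from dataclasses import dataclass


@dataclass
class Item:
    topic: str                  # e.g. "politics", "gardening"
    emotional_arousal: float    # 0..1: how much fear/anger the item evokes
    novelty: float              # 0..1: how new the item is to this user


@dataclass
class UserModel:
    # Crude stand-in for a "psychographic model": observed engagement
    # rate per topic, learned from clicks, shares, and hover times.
    topic_affinity: dict


def predicted_engagement(user: UserModel, item: Item) -> float:
    """Score how likely this user is to engage with this item."""
    affinity = user.topic_affinity.get(item.topic, 0.1)
    # Familiar (bias-confirming) topics and emotionally arousing content
    # both raise the score, so the ranker preferentially surfaces them.
    return 0.6 * affinity + 0.3 * item.emotional_arousal + 0.1 * item.novelty


def rank_feed(user: UserModel, candidates: list) -> list:
    # "Maximize time on site": sort candidates by predicted engagement.
    return sorted(candidates, key=lambda it: predicted_engagement(user, it),
                  reverse=True)


if __name__ == "__main__":
    user = UserModel(topic_affinity={"politics": 0.8, "gardening": 0.2})
    feed = rank_feed(user, [
        Item("politics", emotional_arousal=0.9, novelty=0.4),   # outrage bait
        Item("gardening", emotional_arousal=0.1, novelty=0.9),  # calm, novel
    ])
    print([item.topic for item in feed])  # ['politics', 'gardening']
```

In this toy example the outrage-heavy, bias-confirming item outranks the calmer, more novel one, which is the feed-level expression of the three effects listed above.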

Dopamine Hijacking and Addiction

The constant optimization for user engagement leverages dopamine, a molecule associated with motivational networks and “feels good, do it again” dynamics [22:06:00]. Just as hyper-palatable “fast food” exploits evolutionary preferences for salt, fat, and sugar, social media extracts “hypernormal stimuli” from human connection and information, leading to digital “addiction” [23:20:00].

“what fast food is to food and nutrition is the same thing that porn is to sexuality and relationship… take something that has a normal dopaminergic process associated with something that has evolutionary advantage for raising kids and bonding and etc and just extract the hypernormal stimuli parts devoid of any of the things that would actually be relevant to human life and the same is true for what social media relationships are to social dynamics and relationships in general” [23:56:00]

The business incentive to maximize the “lifetime value of the customer” means addiction is a “straightforwardly profitable thing” [24:30:00]. This is compounded by the fact that these “photon-mediated” dopamine hits are personalized and introduced to children from a young age, without the controls applied to regulated substances like alcohol or tobacco [24:58:00]. The prevalence of addiction, whether chemical or digital, is an “inverse index” of a society’s health [26:28:00].

Consequences for Societal Coherence

The current information ecosystem fosters:

  • Lack of Shared Reality: Individuals can scroll their feeds for hours without encountering a single piece of common news, undermining any shared basis for conversation or democracy [11:09:00].
  • Internal Enmity and Tribalism: Increased polarization leads to in-group/out-group dynamics and “internal enmity” that is often stronger than animosity towards external forces [11:50:00]. This fragmentation fuels “civil breakdown” [11:42:00].
  • Erosion of Authority: The cacophony of voices has replaced traditional media gatekeepers, destroying any consensus about authority [07:01:00]. This decentralization can be beneficial, preventing the “best and brightest” from leading society astray as they did in the Vietnam War [27:27:00], but it also leaves a void: the old authority has, so far, been replaced by nothing [27:58:00].
  • Spread of Virulent Misinformation: The platforms’ affordances, initially designed for mundane advertising, are now exploited by state actors, non-state actors, and political actors to spread “fake news” that is intentionally high-impact and can travel “five or six times farther than true news” [21:10:00]. Examples include “anti-vaxxer” narratives and “QAnon,” which proliferate without the “gatekeepers” of previous eras to keep “absolute nonsense out of widespread public circulation” [11:46:00].
  • Confirmation Bias: Amid overwhelming information chaos, individuals resort to “tribalism,” aligning their beliefs with their chosen “team” rather than assessing claims objectively [12:45:00].

Addressing the Challenge

Solving the challenges posed by technology requires a multi-faceted approach.

Individual Level

  • Dubiousness of Certainty: Individuals should cultivate a “direct relationship with reality” [21:15:00] and be “dubious of being over-devoted to any models of reality” [21:26:00]. This means questioning one’s own biases, emotional triggers, and group identities [21:40:00].
  • Epistemic Commitment: Develop a “bias checker” within oneself and learn how “narrative warfare,” “Russell conjugation,” “Lakoff framing,” and “cherry picking of data” manipulate perception [22:11:00].
  • Seek Dissenting Views: Actively seek out earnest dissenting views, even from those one disagrees with, to facilitate dialectical thinking and gain a “parallax” understanding of complex issues [23:01:00].
  • Mindful Media Consumption: Remove social media apps from phones to avoid continuous micro-targeting, and intentionally curate feeds to follow diverse perspectives rather than being passively used by the platform [24:01:00].
  • Responsible Sharing: Before sharing content, ask “is this actually good for the world to share this?” [25:10:00], recognizing that individuals are actors in shaping the epistemic commons [25:20:00].

Institutional and Systemic Level

  • “Sense-Making Institutions”: Create and support institutions that facilitate “dialectic conversations between the best thinkers on topics” [27:50:00] who are earnest and willing to engage in non-rhetorical, authentic sense-making [27:57:00]. These institutions would aim to clarify what is known, what is unknown, and the varying interpretations, without institutional bias [28:27:00].
  • Meta-News Analysis: Develop processes to assess the landscape of dominant narratives on highly polarized and consequential topics [30:21:00]. This includes:
    • Steel-manning: Making the best possible argument for each narrative to foster understanding across differing viewpoints [30:45:00].
    • Propositional Analysis: Breaking down narratives into individual propositions, evaluating evidence, and identifying verifiable, falsifiable, and conjectural elements [31:19:00]; a minimal data-model sketch follows this list.
    • Transparency: Showing the process of data analysis and epistemic models used to arrive at conclusions, empowering individuals to learn and apply these methods themselves [33:43:00].
  • Optimized Public Education: Integrate compressed “epistemic models” into education, teaching concepts like game theory, political theory, the scientific method, and the Hegelian dialectic to empower civic engagement and critical thinking [34:28:00]. This includes understanding “regulatory capture” and how power dynamics influence information [43:02:00].
  • Addressing Multipolar Traps: Implement “governance at the level that we’re having effects” [33:55:00]. This might involve mechanisms like “massive tariffs on implicit carbon” [35:35:00] to encourage global coordination without necessarily creating a single, corruptible world government [35:12:00].
  • Reconciling Free Speech in the Digital Age: While upholding free speech is crucial, the unprecedented viral spread of “not true and dangerous” information through algorithms necessitates rethinking boundaries, potentially treating “bad faith discourse” (speech that does not correspond to the speaker’s actual belief) as a form of “pollution to the memetic sphere” [36:26:00]. However, policing intent is difficult, which suggests a shift towards increasing “collective intelligence” so that such discourse can be recognized and filtered [37:59:00].
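
The “Propositional Analysis” step above lends itself to an explicit, auditable data model. The following is a minimal sketch under assumed names (Narrative, Proposition, and Status are inventions for this example, not an existing tool): each narrative carries its steel-manned argument and a list of propositions whose evidentiary status is tracked and can be summarized transparently.

```python
# Hypothetical data model for meta-news propositional analysis.
# Class and field names are illustrative, not an existing library's API.
from dataclasses import dataclass, field
from enum import Enum


class Status(Enum):
    VERIFIED = "supported by checkable evidence"
    FALSIFIED = "contradicted by checkable evidence"
    CONJECTURE = "interpretation or value claim; not directly testable"


@dataclass
class Proposition:
    text: str
    status: Status
    evidence: list = field(default_factory=list)    # sources consulted


@dataclass
class Narrative:
    name: str
    steelman: str                                    # best-case argument for this camp
    propositions: list = field(default_factory=list)

    def summary(self) -> dict:
        """Count propositions by status: a transparent, auditable output."""
        counts = {}
        for p in self.propositions:
            counts[p.status.name] = counts.get(p.status.name, 0) + 1
        return counts


if __name__ == "__main__":
    camp = Narrative(
        name="Example narrative",
        steelman="The strongest good-faith case this camp would make.",
        propositions=[
            Proposition("Claim backed by primary data.", Status.VERIFIED,
                        ["dataset or study cited here"]),
            Proposition("Claim about the motives of other actors.", Status.CONJECTURE),
        ],
    )
    print(camp.summary())   # {'VERIFIED': 1, 'CONJECTURE': 1}
```

Publishing the propositions, their statuses, and the evidence trail alongside the conclusion is one way to realize the transparency goal described above.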

Ultimately, the goal is to foster a “cultural enlightenment” that encourages better individual and collective sense-making, higher-quality conversations, and a “memetic immune system” against manipulation [39:33:00]. This recursive process, in which better individuals foster better systems and vice versa, is seen as the only way to counteract the current trajectory towards autocracy and ensure the continued viability of participatory governance [43:50:00].