Misinformation, Part 3

Algorithms, Righteousness, Truth, Addiction, and Rabbit Holes

Just a heads up—this is the third segment of an essay on Misinformation. In case you missed them, here’s the introduction, Part 1, and Part 2.

In Part 2, I spoke about my second mind-blowing moment—AKA this quote by Yuval Noah Harari: “When you flood the network with information, the truth sinks to the bottom.”

It spurred a lot of questions. Namely: why?

**

Thread 6: Algorithms

First, what's an algorithm? For me, the word evokes math formulas (yikes).

In social media land, an algorithm is a set of rules and instructions written in code. So when reading Harari's new book, I was surprised to see these algorithms categorized as Artificial Intelligence (AI)—a very primitive AI, but AI nonetheless. Here's a story to explain:

In 2013, Facebook gave its algorithm a clear mandate: increase engagement. (“Sounds great, doesn't it?” Harari quipped in one interview. “Who doesn't want to be engaged?”)

Here's where the AI part comes in: AI is software that can learn and make decisions by itself. So when Facebook's algorithm got the brief, it went to work: running millions and millions of tests on millions and millions of users to figure out how.

Apparently, if you're a passive social media user (like me), the algorithm notes how many milliseconds you spend on every post.

But what worked—aka increased engagement—wasn't necessarily true.

In Myanmar (formerly Burma), Facebook's algorithm promoted a conspiracy theory blaming the Muslim minority for the country's ills. The theory was false, but it pressed people's fear, hate, and outrage buttons—prompting them to watch, comment, and share—so the algorithm spread it far and wide.

No one at Facebook spoke Burmese, and they had no idea what was being promoted. True to its AI nature, the algorithm made these decisions all on its own—with disastrous consequences. Tensions had existed in Myanmar for years, but the promotion of this particular conspiracy set fire to the kindling—resulting in genocide.

So why does “truth sink to the bottom”? Part of it has to do with social media algorithms. Facebook—and other companies—want to keep people on their platforms. More time on site equals more profit (in the form of advertising dollars), which means whatever “increases engagement” spreads—regardless of whether it's true.
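To make that concrete, here's a toy sketch (in Python) of what an engagement-first ranking rule might look like. This isn't Facebook's actual code, and the posts, fields, and weights are all invented; it just illustrates the key point that nothing in the ranking asks whether a post is true.

```python
# Toy feed-ranking sketch. Everything here is made up for illustration;
# the point is that the score rewards attention and sharing, not accuracy.

posts = [
    {"topic": "local news update",   "predicted_seconds_viewed": 5,  "predicted_shares": 0.1},
    {"topic": "outrage-bait rumour", "predicted_seconds_viewed": 40, "predicted_shares": 2.5},
    {"topic": "careful fact-check",  "predicted_seconds_viewed": 8,  "predicted_shares": 0.2},
]

def engagement_score(post):
    # Weighted mix of predicted attention and predicted sharing.
    # Note what's missing: any measure of whether the post is true.
    return post["predicted_seconds_viewed"] + 10 * post["predicted_shares"]

# "Increase engagement": show the highest-scoring posts first.
feed = sorted(posts, key=engagement_score, reverse=True)

for post in feed:
    print(f"{post['topic']}: {engagement_score(post):.1f}")
```

The outrage-bait rumour tops the feed, which is the dynamic Harari describes in Myanmar: not because anyone chose it, but because the rule optimizes for engagement alone.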

So, what does increase engagement?

**

Thread 7: Righteousness

“Social media looks like a window, but it's actually a mirror.”
—Josh Szeps

Reading about algorithms reminded me, once again, of the elephant and the rider (a metaphor for how we make decisions), which I talked about in Part 1.

In short, the elephant is the older, more intuitive part of the brain, and the rider is the newer, more rational part. Though they work together, generally speaking, the elephant is in charge.

The rider—our ability to rationalize—evolved, in part, to justify the elephant’s decisions to other people: to “justify ourselves and defend the teams we belong to.” A team, in this case, is any group of people with something in common, like family, country, or ideology (e.g., liberals and conservatives).

In Part 2, I hypothesized that this was part of our preference for partisan news. If our ability to rationalize is tied to our tribal nature, why wouldn't we prefer news that defends our team?

Sidebar: Following Vancouver's decision re: natural gas, I found an article saying this would make people poorer, despite the evidence shown at city hall. The author's rider seemed to be defending their team's position—no matter what.

This preference, in turn, prompts the algorithm to show us more of the same. "The thing about [...] social media bubbles is that they aggressively feed and amplify repetition," says Elif Shafak. And repetition can easily be mistaken for truth.

**

Thread 8: Truth

There are other reasons “the truth sinks to the bottom.”

#1—It's Expensive

If you want to write a (truthful) book about the Roman Empire, you’d need to invest a lot of time and resources. But if you're less concerned about accuracy, you can start with your preconceptions and fill in the rest.

When looking into climate-related misinformation online—a topic I'll dig into in another essay—I noticed much of it contained a kernel of truth. Often, the author had gotten part of the story right but filled in the rest with their preconceptions—leading to questionable conclusions.

**

#2—It's Complicated

Harari defines the truth as "that which does a pretty good job of representing reality." But reality is complicated, and our brains prefer simpler stuff.

I struggled with this when working on the first part of my DIY master's “Changing Minds at City Hall.” Even my best attempt at a one-sentence summary (“whether using natural gas for heat and hot water in new, yet-to-be-built buildings would reduce the cost of homes”—whew!) was a mouthful.

Social media is built for soundbites, and, to top it off, attention spans may be getting shorter. Recently, I got diagnosed with ADD (not once, but twice—in Canada and the Netherlands); the more time I spend on attention-fracturing sites like social media, the worse it gets. In an attempt to manage it, I use an app to make it impossible to keep scrolling.

Even if we manage to push through and spend time reading and researching, we may never see the full picture. As mentioned in Part 1, global issues—like the ecological crisis—require mounds of data to be fully understood. Thinking about all this humbled me, and I'm going to try to memorize this phrase for the future: “I don't know enough to have an opinion.”

**

“It's expensive” and “it's complicated” are two reasons that “the truth sinks to the bottom.” But they’re not the only ones.

**

Thread 9: Addiction

When I first started reading about dopamine, I thought it had to do with rewards. Not quite: it's the “more” molecule, as in, “I want more.”

In a world of scarcity, a dopamine drip kept us going when we looked for food or water. Every time we came across a clue—say, the flower of a plant with edible roots—we got a bit more, which kept us engaged until we found it. We also get dopamine when learning something or seeking a solution, both of which help with survival.

But nowadays, in a world of plenty, our effort/reward balance is off. We get a ton of dopamine we didn’t work for from things like junk food, gambling, and social media. And the more dopamine (relative to effort), the more likely something is to become addictive.

A quick but fascinating side note—after a dopamine hit, our brains run a kind of “post-mortem” to evaluate the effort-to-reward ratio. This becomes a benchmark for how much effort we’re willing to invest in the future. If we frequently engage in low-effort, high-dopamine activities (like scrolling), it can make higher-effort, lower-dopamine tasks (like cooking, studying, or cleaning) feel even harder.

Earlier, I mentioned Myanmar. In that (tragic) case, what got people engaged—from a social media perspective—was fear, hate, and outrage. These emotions also give us dopamine, which keeps us scrolling in search of a solution.

To top it off, some of the world's best minds in behavioural psychology and cognitive neuroscience have been working on making social media even more addictive.

“Slot machines leverage [...] psychological weakness to incredible effect,” says Max Fisher in The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World. “The unpredictability of payout makes it harder to stop. Social media does the same.”

**

Thread 10: Rabbit Holes

Look around in your own life. Have you ever noticed that those with the most extreme views tend to spend a lot of time online?

Much like a drug, what presses our buttons on social media stops working after a while, and the algorithm has to up the ante to keep us hooked. It’s called “The Rabbit Hole Effect.”

My husband's cousin is a boxer. "When I look up fights on YouTube," he said, "I'm three clicks away from the most sexist bullshit you've ever seen."

Learning this reminded me of Elon Musk. Ages ago, I'd seen an old interview (pre-hair transplant) where he described his drive to do good. Not much later, he took on Tesla. Go Elon!

Since I'm not online much (see: ADD, above), I missed the memo on Musk. A couple of years ago, a Tesla owner told me he wanted to sell his car for fear of being associated with the brand. I was totally confused. “Why?”

From a far (far) distance, it seems Musk has gotten hooked on Twitter (now X). His biographer, Walter Isaacson, has said the same—that when Musk was in “dark, demon-mode, late night, on Ambien and Red Bull,” he'd embrace “some wacky [...] conspiracy theory.”

According to the Rabbit Hole Effect, someone can start out mildly skeptical about, say, climate policy, and end up thinking climate change is a conspiracy altogether. Not good.
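If it helps to see the escalation spelled out, here's a tiny simulation (again in Python, with invented numbers, not any platform's real logic) of the "up the ante" dynamic: the user habituates to each level of provocation, so a recommender that only tracks engagement keeps stepping toward more extreme content.

```python
# Toy simulation of the rabbit-hole dynamic. The numbers are invented;
# the point is the ratchet: habituation plus engagement-chasing escalates content.

intensity = 1      # how provocative the recommended content is (1 = mild)
tolerance = 1.0    # how much intensity it now takes to hold this user's attention

for session in range(8):
    engaged = intensity >= tolerance
    print(f"session {session}: intensity {intensity}, engaged: {engaged}")
    if engaged:
        tolerance += 0.5   # habituation: the same content works less well next time
    else:
        intensity += 1     # the recommender escalates to win back attention
```

Run it and the intensity only ever climbs; there's nothing in the loop that pulls it back toward mild.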

**

Conclusion (For Now)

Slowly but surely, what happened at city hall started to make more sense. In short, people are vulnerable to misinformation because:

  • We have biological tendencies to “justify ourselves and defend the teams we belong to.”

  • The places we go to learn about the world—like social media—are set up to exploit these (and other) tendencies to “increase engagement.”

When I first started on the topic of Media Misinformation, I suspected fossil fuel companies were to blame. But though they—and others who benefit from the status quo—fund groups that counter the climate science, it’s more complicated than that. Many “elephants” (to borrow from “the elephant and the rider” metaphor) resist change. And the riders will do whatever they can to find a reason not to.

**

There was so much more that I wanted to cover in this essay—for example, the possible solutions.

But “Media Misinformation” was too tough a topic to wrangle in one (even two) months’ time. I’ll have to save my notes for another day.

On the plus side, there’s a lot I learned—about writing and life—this time around.


Part 4

Reflections on Writing, Path, and Purpose

What do you think?

Thoughts, ideas, and suggestions welcome.