Media Misinformation, Part 2

Information Networks
& Inaccuracies

A heads up—this is the third segment of an essay on misinformation. In case you missed it, here’s the introduction and Part 1.

In Part 1, I shared what I’d learned about “the elephant and the rider,” a metaphor for how we make decisions. The elephant is the older, more emotional part of our brains; the rider is the newer, more rational part.

Though the rider can forecast future events and list pros and cons, the elephant is mostly in charge. The rider—rationalization—evolved, in part, because it does something useful for the elephant: justifying ourselves and the groups we belong to.

These days, partisan news does just that. When reading about the rider, I wondered if this was part of the reason people can be blind to bias. Being part of a team like the liberals or conservatives, Haidt says, “binds and blinds.” We have an evolutionary drive to defend the teams we’re a part of—and this blinds us to the good people (and arguments) on the other side.

Which brings us to…

**

Thread 4: Information Networks

The next mind-blowing moment I came across was from “Nexus” by Yuval Noah Harari. I can’t wait to tell you more—but first, some background.

Imagine societies are like information networks. In a democracy, there’s a back-and-forth, ping-pong of information; in other words, a dialogue.

Here, it’s important to note: information isn’t truth. It can also include misunderstandings, delusions, and outright lies. Truth is just a small subset of information—one Harari defines this way: information that does a pretty good job of representing reality.

Sidebar: Harari says no information is truly up to the task. He uses a map as an example—if you set out to create a map that was 100% truthful, you’d end up with a life-sized version at a 1-to-1 scale.

The invention of new information technologies—like the internet and social media—gave everyone the ability to create and share information. All of a sudden, we had access to more info than ever before.

Personally, I’m grateful to live in this age. If I want to learn how to do something—which, at the moment, is making friendship bracelets—someone has posted about it somewhere. And as someone with addictive tendencies, I can never get enough: when Google Earth came out, I spent a whole evening exploring the Sahara.

But all this information has a downside. Here’s the mind-blowing part:

“When you flood the network with information,” says Harari, “the truth sinks to the bottom.”

Many think that the cure for misinformation is more information. But in practice, Harari says, this just isn’t the case.

**

Thread 5: Inaccuracies

Harari’s quote reminded me of a conversation. Inspired to join a protest against fossil fuel subsidies, I was standing on the sidelines when a man passed me by.

“I hope you’re not listening to AOC,” he said, still walking.

I was confused. “You know, Alexandria Ocasio-Cortez,” he said.

“I know who she is,” I replied, “but I don’t get it.”

“Then why are you protesting?” he asked. I told him about the subsidies, and we chatted politely for five minutes.

“Well, talk to me in a few years,” he said, turning to leave, “when the world comes to an end.”

“Wait—what do you mean?” I asked, practically running to catch up.

“It’s what AOC said,” he replied. “We have to cut emissions in half, or we only have a few years left.”

His earlier comments suddenly made sense. “Not quite,” I replied.

**

Once back home, I looked it up. In January 2019, AOC spoke at MLK Now—here’s what she said:

“Millennials, Gen Z, and all these folks that come after us are looking up, and [they’re] like ‘the world will end in 12 years if we don’t address climate change, and your biggest issue is how are we gonna pay for it?’”

The clip spread like wildfire. Many condemned her: AOC was clearly hysterical, delusional, or downright stupid. “It’s hard to imagine that [AOC] literally thinks the world is going to end, Armageddon style—even if that’s how the story has been framed by her critics,” one article countered.

I was familiar with the target of roughly halving emissions by 2030—set out in the IPCC’s 1.5°C report, which underpins the goals of the Paris Agreement. But this wasn’t because the world would burst into flame, New Year’s Eve 2029, if we missed the mark. Instead, it’s because once greenhouse gases are in the atmosphere, they stay—some for a long time. And the more greenhouse gases, the more likely we are to hit tipping points: dominoes that, once knocked over, create irreversible, cascading effects—a scenario known as “runaway climate change.”

Who knows whether AOC knew what she was talking about. If she did, her phrasing was an unfortunate misstep. The clip went viral and served as evidence that she—who later championed a Green New Deal—was delusional. In the comment threads, I saw a couple of people try to correct the mix-up. But their counterarguments didn’t take, and the truth sank to the bottom.

When it comes to social media, it's not easy to issue corrections. Leading up to the debate at Vancouver city hall, one of the councillors said, "Vancouverites deserve to choose how to cook their food." Vancouver wasn't going to ban natural gas for cooking—just for heating and hot water in new, yet-to-be-built buildings—but the comment stuck. It was one of many inaccuracies that—once released into the wild—multiplied like rabbits.

**

To me, all this clicked with one of Haidt’s concepts, which I talked about in Part 1.

“If you think that reasoning is something that we do to figure out the truth, you’ll be constantly frustrated by how foolish, biased, and illogical people become when they disagree with you. But if you think about reasoning as a skill that humans evolved to further our social agenda—to justify our actions and defend the teams we belong to—things will make a lot more sense.”

More and more news—much of it shared on social media—serves to “defend the teams we belong to.” The clip of AOC is no different; over and over, it was re-posted with an air of: “Those people are wrong, and we’re right.”

When it comes to climate change, two teams—liberals and conservatives—have taken strong positions. And much of what’s shared online serves to justify those positions, not to uncover the truth.

The truth has a lot working against it, so it’s not a fair fight. More on this in Part 3.


Part 3

Algorithms, Righteousness, Truth, Addiction, and Rabbit Holes

What do you think?

Thoughts, ideas, and suggestions welcome.