In a world of 24/7 cable news, round-the-clock Amazon deliveries, and infinite social media notifications, it’s been difficult not to notice how the ways we interact with each other and with information are changing with each passing day. Chief among these changes, of course, is the problem of fake news (#fakenews, I guess), or the spread of misinformation in an online environment. Some people blame Facebook and Twitter, which are currently in the news for the role their platforms played in Russian trolling of the 2016 US election. Their involvement notwithstanding, it seems to me that while the platforms had a role to play in the spread of fake news, it was more a passive role in which they took advantage of human frailty, rather than an active one in which their algorithms spread fake news because it makes for more exciting content or something. Allow me to explain.
Human beings are, as we know, social animals. As such, we want to be part of an in-group; without one, early humans would have died. In fact, studies suggest that loneliness can still kill you, even without hungry predators around. Since we need our groups to survive, we adjust our behaviors to accommodate the preferences of the groups that surround us. We’re also notorious for creating mental shortcuts. This had an evolutionary advantage too, originally: if a hungry tiger jumps out from behind a tree to try to eat you, you don’t want to think too hard about whether to run away. We create mental shortcuts because thinking is a resource-heavy endeavor, and it’s much more advantageous to make decisions efficiently.
These two products of our evolutionary biology, which have allowed us to do so well, have also made it possible for fake news to spread, and to spread widely. Group identification and mental shortcuts are two key reasons why, if we see information that corresponds with something we already feel to be true, we’re more likely to share it with others, who will in turn pass it on, without fact-checking it. Unfortunately for democratic participation, this is especially the case for the type of political information that Russian trolls were able to spread so handily.

Marshall McLuhan warned of a global village in which information would be available instantaneously from around the world. He said that this type of information environment would create tribes of people who would come into conflict with one another. Though he didn’t specifically mention the spread of misinformation, he did highlight how too much information makes it more difficult for people to see eye to eye.
McLuhan spoke about these trends before the rise of the popular internet and social media, but even in radio and television he saw the rise of identity politics as a discourse and its associated tribalization. Now we can also see how these tendencies contribute to the spread of fake news. If I identify with a group, I’ll trust that group and be more likely to share information that affirms it. The mental shortcut I’m making is that if something feels true, then it is true. The Facebook and Twitter (and also Google) algorithms take advantage of this tendency by favoring content that I’m more likely to share. In other words, if I like or share something, I get to see more of it, further cementing my in-group identification and confirming to me that my mental shortcut is correct. Walter Ong wrote that electronic media return us to a type of oral culture. I would go even further and say that social media work directly with our “caveman” brains. That’s why they’re so good at spreading misinformation, and so hard to give up.
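To make the feedback loop concrete, here is a toy simulation of that dynamic. This is purely illustrative: the scoring rule, the group labels, and the share probabilities are my own assumptions, not any platform’s actual algorithm. It sketches how a feed that ranks content by past engagement, combined with a user who shares what already feels true, can lock in one group’s content.

```python
import random

def rank_feed(posts, affinity):
    # Engagement-biased ranking (hypothetical rule): posts from groups
    # the user has engaged with before get boosted to the top.
    return sorted(posts, key=lambda p: affinity[p["group"]], reverse=True)

def simulate(rounds=50, seed=0):
    rng = random.Random(seed)
    # A slight initial in-group preference stands in for "feels true".
    affinity = {"in_group": 1.2, "out_group": 1.0}
    shares = {"in_group": 0, "out_group": 0}
    for _ in range(rounds):
        posts = [{"group": "in_group"}, {"group": "out_group"}]
        top = rank_feed(posts, affinity)[0]
        # The user shares content in proportion to existing affinity;
        # each share feeds back into the ranking for the next round.
        p_share = affinity[top["group"]] / sum(affinity.values())
        if rng.random() < p_share:
            shares[top["group"]] += 1
            affinity[top["group"]] += 0.5  # sharing cements the preference
    return shares
```

Even starting from a small initial tilt, the out-group content never reaches the top of the feed, so it is never shared, while in-group shares keep raising the in-group’s ranking score. The loop amplifies a mild preference into a one-sided diet of confirming content.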