We are living in what some call an era of unprecedented global information flows. Participatory online communication technologies such as social media allow anyone with internet access to upload information for anyone else to see. Though not everyone participates as an active prosumer of information, enough people do that we are overwhelmed with information: 300 videos are uploaded to YouTube every minute, five new Facebook profiles are created every second, and about 6,000 tweets are posted to Twitter every second. The numbers are mind-boggling.
In this environment of information overload, the framework of media ecology can help explain current tendencies toward misinformation, disinformation, and increased polarization. Specifically, ideas first introduced by Neil Postman in his books Technopoly and Amusing Ourselves to Death help us understand our current epoch.
Postman comes from a school of thought known as media ecology. Media ecology takes as its central premise the idea that the way we communicate, and the forms and platforms that structure that communication, play a central role in human culture and society. For Postman, this means that the dominant communication technologies (in his time, television and computers; now computers, the internet, and social media) shape the information people interact with, and in turn shape how we perceive ourselves, our role in the world, and our interactions with one another.
In Technopoly, Postman critiqued the rise of “context-free information”: information directed at no one in particular. In our current information environment, we can think of participatory media as enabling the proliferation of context-free information. Postman suggested that in a world of context-free information, traditional filters no longer work, and people become overwhelmed. Overwhelmed people are more likely to accept automated solutions: in this case, algorithmic filtering and datafication. When this happens, computers begin to undermine other ways of knowing and remembering, leaving us less able to engage in the critical thinking needed to make sense of the information around us.
In Amusing Ourselves to Death, Postman showed how the popular media of the day shape public discourse. Today's dominant popular medium, social media, is shaped entirely by an entertainment frame. It needs clicks to survive, so content engineered for social media is engineered to be as compelling as possible, with high levels of intrigue, emotionality, and celebrity. Postman worried that an entertainment frame would create a Huxleyan Brave New World in which we are so busy distracting ourselves that we willingly submit to oppression.
I think both of these things are happening. There is so much context-free information available that people have a hard time figuring out what to pay attention to. At the same time, our lizard brains are driven and rewarded by the dopamine hits of emotional, compelling, or scandalous content. We amuse ourselves to death because we are overwhelmed with information that we don’t know how (or lack the capacity) to process.
So what can we learn here? Postman’s critique is instructive, but a critique without a way forward is just hand-wringing. I think the main lesson is this: participatory content creation is not inherently democratic, and it can actually create challenges for democratic communication if we cannot figure out how best to filter it. The current filters applied by social media companies, filters that prioritize clicks, do not work in the best interests of human society, because important information is not always (or often) compelling enough to spread online. We therefore need new, and I would argue transparent, filtering practices if we are going to maintain something resembling democratic communication in an information age.
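To make the filtering argument concrete, here is a toy sketch in Python. It is entirely hypothetical (no real platform's ranking logic, and all posts and scores are invented): it contrasts an engagement-only ranking, which buries important but low-click content, with a transparent ranking whose weighting is disclosed and adjustable.

```python
# Hypothetical illustration of the filtering argument. All posts,
# click counts, and "importance" scores are invented for the example.

posts = [
    {"title": "Celebrity scandal", "clicks": 9000, "importance": 1},
    {"title": "Local election guide", "clicks": 300, "importance": 9},
    {"title": "Cute animal video", "clicks": 7000, "importance": 1},
    {"title": "Public health advisory", "clicks": 500, "importance": 10},
]

def engagement_rank(items):
    """Rank purely by clicks, as an engagement-driven feed might."""
    return sorted(items, key=lambda p: p["clicks"], reverse=True)

def transparent_rank(items, w_clicks=0.3, w_importance=0.7):
    """Rank by a disclosed weighted score the reader can inspect and tune."""
    def score(p):
        return w_clicks * (p["clicks"] / 10000) + w_importance * (p["importance"] / 10)
    return sorted(items, key=score, reverse=True)

print([p["title"] for p in engagement_rank(posts)])
# The engagement feed puts the scandal and the animal video first.
print([p["title"] for p in transparent_rank(posts)])
# The transparent feed surfaces the advisory and election guide first.
```

The point of the sketch is not the particular weights but that the second function's criteria are visible and contestable, whereas an engagement-only filter hides its single criterion behind "what people clicked."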
As technologies develop and problems such as deepfakes, greater content personalization, and personal bots continue to grow, these challenges will only get worse. It is therefore in society’s best interest to demand more from our technologies, and to decide for ourselves how we should access and filter online information.