
Chapter II: The Media Malware Machine


In 1994, media theorist and documentarian Douglas Rushkoff said the media is ‘the extension of a living organism; [media and communication technology] is a circulatory system for today’s information, ideas and images.’

The media as an environment in 1994, a decade on from 1984, seems quaint by today’s standards.

In the average Western household, you could likely find a television set, a radio, a telephone, and of course, paper, pens, and envelopes to send letters. Many households also subscribed to print media: newspapers, periodicals, magazines.

If you considered yourself “tech savvy”, you may have had a computer of some kind: an IBM PC compatible, a Commodore Amiga, an Atari ST, a Macintosh, or even an Acorn RISC PC. If you worked in the C-suite, you may have had a laptop or a cellular portable telephone. If you were on the geeky side, it’s possible you used a dial-up modem to call into Bulletin Board Systems (BBSes) to trade files, or even had a connection to Internet (as it was referred to back then). Internet had only a scant handful of web pages; the World Wide Web Consortium (W3C) was only formed that year. Most users connected through portals such as CompuServe, Prodigy, or AOL (in the US) to interact with Usenet newsgroups or read their email.

In today’s world, we have a single standard processor architecture (the x86-64), found in Windows, Linux, and Mac-based PCs and laptops alike. Virtually all web traffic depends on the Domain Name System (DNS), coordinated by the Internet Corporation for Assigned Names and Numbers (ICANN), which operates the Internet Assigned Numbers Authority (IANA). IANA was overseen by the United States Department of Commerce until as recently as 2016. In 1994, the circulatory system had many hearts.
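To make that dependency concrete, here is a minimal sketch in Python (with example hostnames only) of the resolution step that precedes every web request; the operating system’s resolver defers, ultimately, to the DNS hierarchy whose root zone IANA coordinates.

```python
# Minimal sketch: resolving hostnames through DNS before any web request.
# The hostnames are examples only; any public domain would do.
import socket

for hostname in ("example.com", "wikipedia.org"):
    # getaddrinfo consults the OS resolver, which in turn queries the
    # DNS hierarchy rooted in the IANA-managed root zone.
    results = socket.getaddrinfo(hostname, 443, proto=socket.IPPROTO_TCP)
    addresses = sorted({info[4][0] for info in results})
    print(f"{hostname} -> {addresses}")
```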

Today we have one.

Content is the blood pumped around by that heart, transmitted and duplicated at frightening speed. In 1994, there existed 2,738 websites. If every tweet is counted as a webpage (which it very well could be, considering each tweet is given its own URL), over 6,000 webpages are created each second.

Each of those websites likely had its own web server: a beige box sitting under a desk, usually in a university physics or computer science department. In the time it took you to read that sentence, roughly 24,000 more webpages came into existence, perhaps with images, audio, or video attached. The servers that store and transmit all this content form the internet’s “backbone”.
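The arithmetic behind those figures is simple enough to check; the sketch below assumes the quoted rate of 6,000 new pages per second and a reading time of roughly four seconds for the preceding sentence.

```python
# Back-of-the-envelope check of the page-creation figures quoted above.
PAGES_PER_SECOND = 6_000        # rate quoted in the text
READING_TIME_SECONDS = 4        # assumed time to read one sentence

print(f"While reading: {PAGES_PER_SECOND * READING_TIME_SECONDS:,} pages")  # 24,000
print(f"Per day: {PAGES_PER_SECOND * 86_400:,} pages")                      # 518,400,000
```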

According to the Synergy Research Group, Amazon Web Services controls one-third of the world’s public cloud computing capacity, more than Microsoft, IBM, and Google combined. Launched in 1995 as an online book retailer, Amazon.com has exploded into a global conglomerate that not only sells goods online but facilitates the sale of those goods by providing the infrastructure they are sold on. In 2016, Amazon.com captured $1 out of every $2 spent on retail goods online. Amazon has a vested interest in facilitating content to create more opportunities to sell its products. In 2013, Amazon founder and CEO Jeff Bezos bought The Washington Post for $250 million.

As mentioned earlier, the volume of information we can encounter is immense, and growing exponentially. If we treat content as distinct from information, that is, from data we can interpret and use in some meaningful way, then the web-enabled media environment has not been “hijacked” by fake news or disinformation, as many thinkpieces and hot-takes insist.

The current media ecology simply enables ever larger quantities of this content to be created, shared, and consumed. In the words of the Swiss Renaissance physician Paracelsus, “sola dosis facit venenum”: the dose makes the poison.

His thesis was that any substance, at a high enough dose, is a poison. One can die from drinking too much water. One can also be harmed by breathing molecular oxygen at increased partial pressures. With no map to guide us on which content is “useful” and which is not, we succumb to overwhelming content overload. The well itself is not poisoned, but drink enough from it and you will inevitably become sick.

We now have untold media power, both as consumers and as potential producers. Anyone can connect with virtually anyone else across the globe in real-time. But for everything we gain from new media technologies, we also lose something.

It would seem the side-effect of this is polarisation: a binary, two-valued orientation. This is not the cyberneticist Norbert Wiener’s sincere wish of using technology to enhance human beings for human use; it is technology shaping our conception of reality, and perhaps even our consciousness, to better fit machine algorithms. Rushkoff touched on this nearly a decade ago in his “contract” with technopoly: we either program, or we get programmed.

Now in the electronic media world, linear narratives are unimportant; we can tune into a tweet, watch three different YouTube clips at a time and use RSS readers to aggregate thousands of articles, picking and choosing the few that are worthy of our dithering attention. 15% of the world’s internet traffic is dedicated to streaming content from Netflix.
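The aggregation habit itself is mechanically simple; a toy sketch, using only Python’s standard library and placeholder feed URLs, might look like this (a real reader would use a dedicated RSS parser):

```python
# Toy sketch of feed aggregation: pull several RSS feeds, merge their
# items, and attend to only a handful. Feed URLs are placeholders.
import urllib.request
import xml.etree.ElementTree as ET

FEEDS = [
    "https://example.com/feed-one.xml",  # placeholder
    "https://example.com/feed-two.xml",  # placeholder
]

def fetch_titles(url):
    """Return the <title> of every <item> in an RSS 2.0 feed."""
    with urllib.request.urlopen(url, timeout=10) as response:
        root = ET.parse(response).getroot()
    return [item.findtext("title", default="(untitled)")
            for item in root.iter("item")]

all_titles = [title for url in FEEDS for title in fetch_titles(url)]

# The "picking and choosing": of everything aggregated, read only a few.
for title in all_titles[:5]:
    print(title)
```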

In any cybernetic system, as Wiener defined it, information control is integral to the health of that system. In his treatise The Human Use of Human Beings, he describes a power station where information flows both ways between man and machine, as status reports and as commands issued in response to those reports. However, a communication system requires a filter against erroneous or non-useful information. The filter was obvious in Postman’s time: TV news editors, newspaper editors, and so on. The prevailing critique was that those editors might impose biases based on political or corporate diktats, restricting the “authenticity” of what was being presented.
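As an illustration (a toy loop, not Wiener’s own formalism), the sketch below captures the two-way flow of his power-station example: status readings come in, a filter rejects erroneous ones, and a corrective command goes back out. All the numbers are invented.

```python
# Toy cybernetic loop: two-way information flow with a filter on
# erroneous status reports, loosely after Wiener's power-station example.
TARGET_OUTPUT = 100.0            # desired station output (invented)
PLAUSIBLE_RANGE = (0.0, 200.0)   # readings outside this are "erroneous"

def filter_reading(reading):
    """The filter: discard information that cannot be useful."""
    low, high = PLAUSIBLE_RANGE
    return reading if low <= reading <= high else None

def command_for(reading):
    """Feedback: a correction proportional to the observed error."""
    return 0.5 * (TARGET_OUTPUT - reading)

# Status reports from the machine, including one corrupted reading.
for raw in [92.0, 104.0, -999.0, 97.5]:
    reading = filter_reading(raw)
    if reading is None:
        print(f"status {raw:>7}: rejected by filter, no command issued")
    else:
        print(f"status {raw:>7}: adjust output by {command_for(reading):+.2f}")
```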

The inherent problem with the overabundance of content is that filtering happens at the source (the publisher) yet not at the point of consumption (the reader). The messages themselves are not filtered; only the information contained within each message is. The media malware is inherent in the lack of filtering for bogus or “fake” news at the consumption level. Is there a non-invasive filter for such information, one that does not rely on Orwellian tactics? As Postman, quoting Thoreau, observed, information for the sake of information may not be useful; there is no value in knowing that Princess Adelaide has the whooping cough.

Publishers can publish skewed perspectives, lie by omission, or flat-out create fictions which consumers may or may not have the inclination to root out. The ultimate filter would be to deny these publishers ad revenue and let them go out of business. In January and February 2019, sites such as BuzzFeed, VICE, and Huffington Post (HuffPost) laid off approximately 2,200 journalists in the US, with BuzzFeed Australia signalling similar cutbacks to its editorial division. Perhaps the poisoned adrenaline pumping through the media circulatory system will wane; the market, it seems, has decided.

Content filtering is draconian and ought to be given the disdain it deserves. What our media environment lacks is a transparent, decentralised, and self-correcting information filter: one that is unbiased, impartial, and robust enough to counteract the terror of disinformation. Aggregation, positioning oneself at a higher level of abstraction, as found in apps like NewsVoice, may be a partial fix: it offers a filter for the disinformation crisis. However, it does not solve the content crisis, and at present there may be no logical solution. The cybernetic feedback goes both ways, but seems to benefit the system and not the user.
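One way to picture such an aggregation layer (a toy sketch only, and not NewsVoice’s actual algorithm) is to rank stories by how many independent outlets carry them, surfacing corroboration rather than suppressing anything; the outlets and headlines below are invented.

```python
# Toy aggregation filter: rank stories by independent corroboration.
# Nothing is removed; the reader simply sees how well-sourced each story is.
from collections import defaultdict

reports = [
    ("outlet-a", "Dam project approved"),      # invented examples
    ("outlet-b", "Dam project approved"),
    ("outlet-c", "Dam project approved"),
    ("outlet-b", "Celebrity spotted on Mars"),
]

sources_by_story = defaultdict(set)
for outlet, story in reports:
    sources_by_story[story].add(outlet)

for story, outlets in sorted(sources_by_story.items(),
                             key=lambda kv: -len(kv[1])):
    print(f"{len(outlets)} source(s): {story}")
```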

It would seem the information system on which we have come to rely is programming us, and we're being injected with malicious binary code.