Angering Ourselves To Death - Postman's Brave New World Re-Re-Visited - Chapter 5

Chapter V: Angry Reacts Only – Harvesting Cash from the Media Ecology

The "old" Facebook like button, as it appeared in 2009.

“Researchers have found, for example, that the algorithms running social media platforms tend to show pictures of ex-lovers having fun. No, users don’t want to see such images. But, through trial and error, algorithms have discovered showing pictures of our exes having fun increases our engagement. We are drawn to click on those pictures and see what our exes are up to, and we’re more likely to do it if we’re jealous they’ve found a new partner. The algorithms don’t know why it works, and they don’t care. They only maximise whatever metric we’ve instructed them to pursue.” – Douglas Rushkoff, Team Human

In September 1993, Global Network Navigator (GNN), then a division of O’Reilly & Associates (now O’Reilly Media), sold the first ever clickable advertisement on the World Wide Web to a law firm based in Silicon Valley. By the late 1990s, during the dot-com boom, internet advertising in the form of clickable icons and Graphics Interchange Format (GIF) banners that flashed enticements across the screen was ubiquitous.

Advertising alongside search engine results pages (SERPs) became the standard in October 2000, when Google launched its AdWords service. Companies and brands would pay for sponsored links that appeared at the top of search results, bidding for the top spot via automated auctions. When a user clicked on the ad, the company paid for the privilege – what’s known as “Pay Per Click” (PPC) advertising.
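To make the mechanics concrete, here is a minimal sketch – in Python, with invented advertisers and bids – of how a pay-per-click auction can work: the highest bidder takes the sponsored slot, and money only changes hands when someone actually clicks. Real systems, Google Ads included, are far more elaborate (quality scores, predicted click-through rates), so treat this as an illustration rather than the actual mechanism.

```python
# Toy sketch of a pay-per-click (PPC) auction. Advertisers and figures are
# invented; real ad auctions also weigh ad quality and predicted clicks.

def run_auction(bids):
    """bids: advertiser -> maximum bid per click. Returns (winner, price per click)."""
    ranked = sorted(bids.items(), key=lambda item: item[1], reverse=True)
    winner = ranked[0][0]
    # Second-price flavour: the winner pays the runner-up's bid, not their own.
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

def amount_owed(price, clicked):
    # The advertiser pays nothing for an impression – only for a click.
    return price if clicked else 0.0

bids = {"law_firm": 2.50, "rival_firm": 1.80, "directory_site": 0.90}
winner, price = run_auction(bids)
print(winner, amount_owed(price, clicked=True))   # law_firm pays the runner-up's bid
print(winner, amount_owed(price, clicked=False))  # no click, no charge
```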

Advertising wares and services on a PPC basis might connect customers to products they want or need, but it was hurting the revenue of the previous arbiters of advertising and commerce – newspapers and magazines. Newspapers could no longer rely on revenue from advertisers, since many of them had found cheaper and more effective alternatives online.

According to the Pew Research Center, newspaper advertising revenue topped US$49 billion in 2004 – it now sits at about US$18 billion for all newspapers in the United States combined. By contrast, Google’s US$110.8 billion in annual revenue derives mostly from Google AdWords, now Google Ads.

Newspapers had to adapt not just to the medium but to the media ecology at large. Newspapers that migrated online – and the digital-only start-ups alongside them – needed to realise that providing nuanced, balanced reporting was not the way of the future. To make real money, they needed their content to surface in search engine results and get clicked. Calls to action, not calls to thought.

The people reading their content needed to be as digital as the machines that host it.

All Positions Contested

In the week of August 28th, 2014, major video games publications including The Escapist, Gamasutra, Kotaku, The Daily Beast, Vice, Destructoid, and even conventional masthead The Guardian proclaimed that the identity and fandom of video gaming was toxic and rooted in a culture of homophobia, sexism, and racism. Appearing like a coordinated attack on the state of gaming culture, this led to a small yet significant backlash against the games press known as #GamerGate, a hashtag calling for a restoration of ethical standards in video games journalism.

Proponents of #GamerGate were lambasted for conducting a relentless harassment campaign against critics of gamer culture, notably feminist and left-wing critics such as independent game developer Zoe Quinn; fellow developer Brianna Wu; and Anita Sarkeesian, host and founder of the not-for-profit feminist cultural studies organisation Feminist Frequency.

At the time, feminist writer Jessica Valenti commented that "the movement's much-mocked mantra, 'It's about ethics in journalism'" was seen by others as "a natural extension of sexist harassment and the fear of female encroachment on a traditionally male space." Sarkeesian and her adherents still contend that sexist “tropes” or elements in video games contribute to real-world attitudes against women, though media theorists have long debunked notions that players of video games are more predisposed to violence, calling such claims a “moral panic” in the wake of mass shootings.

In mid-September of that year, provocateur and then-Breitbart editor Milo Yiannopoulos published discussions from a closed mailing list known as GameJournoPros, which detailed the coordinated publishing of “end of gamers” op-eds and the mockery of certain developers, presenting this as evidence of collusion in the gaming press.

Much like President Bush’s declaration – “Either you are with us, or you are with the terrorists” – the video games press declared that you were either a toxic “gamer” or part of the new vanguard of “woke” identitarian games consumption, which favoured left-wing ideological bias in place of “conservative” ideas. The culture war sorted itself into binary groups, choosing a digital battlefield and leading a charge to preserve or change a purely digital media ecology.

Both sides used the medium as mass surveillance to stake out their fronts, fire salvos in each other’s direction, and gin up support for their cause. It is fitting that their respective “causes” were video games, as social media uses gamification to ensure people use it and keep using it; “rewards and incentive systems determine usage,” as Israeli-Macedonian psychologist Sam Vaknin describes it.

140 Characters of Hate

Vaknin says that the truncated nature of communication in social networks is more conducive to hateful responses; it only takes a few words to express hatred, e.g., “go to hell! Fuck you!” Love and compassion, by contrast, require many more words to convey – perhaps an order of magnitude more, such as a letter or a “real world” gesture – lest they feel insincere.

Orwell’s prescient “Two Minutes Hate” – a ritualised eruption of malice toward Party enemies in Oceania – mirrors this assessment. A continuous “twenty minutes hate” would lose steam before long; it could even give rise to the realisation among participants that their hate is manufactured, and that they are indeed being manipulated.

Vaknin also argues that the social media/mass surveillance circuit is built on ambiguity and the fear of the other. “The only way to disambiguate something is to get to know it,” he says. Intimacy, he adds, reduces the need for addiction and dependency.

Former Facebook engineers have admitted that the medium was built around “continuous partial attention,” as Justin Rosenstein – inventor of the “like” button – described to The Guardian. Nir Eyal, author of Hooked: How to Build Habit-Forming Products, writes that “feelings of boredom, loneliness, frustration, confusion and indecisiveness often instigate a slight pain or irritation and prompt an almost instantaneous and often mindless action to quell the negative sensation” – a vulnerability mass surveillance media is built to exploit.

Rolled out in February 2009, replacing the star rating and “awesome” button, the “like” button gives users a fleeting dopamine boost – the neurotransmitter central to the brain’s reward pathways – and allows Facebook and its partners to track user behaviour and preferences for later advertising and remarketing efforts. The like button is also ambiguous; does “liking” a post that describes a user’s misfortune mean revelling in their misery? Providing sympathy? Simply showing the message has been read and understood?

This ambiguity, paired with gamification, gives rise to a media ecology that conditions users to stay on the system. Ambiguity is also biased towards amygdala-driven, “lizard brain” reactions of hatred, upset, outrage, and terror. More nuanced or reasoned content is of little value to the medium, as it encourages neocortical, higher-brain reasoning and analysis – and it requires full, rather than partial, attention. As mentioned earlier, a sustained “twenty minutes hate” would not work, as the hatred would (presumably) give way to self-reflection.
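As a toy illustration of that bias – and of Rushkoff’s point in the epigraph that the algorithm simply maximises whatever metric it is given – consider the sketch below. The posts, reaction counts, and weights are all invented; nothing in the code asks for outrage or forbids nuance, yet once the feed is sorted by measured engagement alone, the inflammatory items float to the top and the reasoned ones sink.

```python
# Minimal sketch of an engagement-maximising feed ranker (all data invented).
# The "instructed metric" is a weighted sum of interactions – nothing more.

posts = [
    {"title": "Nuanced 3,000-word policy analysis", "clicks": 40,  "angry_reacts": 2,   "shares": 5},
    {"title": "OUTRAGEOUS take on the other side",  "clicks": 900, "angry_reacts": 450, "shares": 300},
    {"title": "Photos of your ex having fun",       "clicks": 700, "angry_reacts": 120, "shares": 60},
]

def engagement_score(post):
    # No notion of accuracy, nuance, or wellbeing – only interactions are counted.
    return post["clicks"] + 2 * post["shares"] + 3 * post["angry_reacts"]

for post in sorted(posts, key=engagement_score, reverse=True):
    print(engagement_score(post), post["title"])
# The outrage post ranks first; the nuanced analysis lands at the bottom.
```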

Vaknin says that suicide rates among youth have jumped 31% in the last decade, which he attributes to social media’s bias toward anxiety- and depression-inducing content. Harvesting cash from this media ecology is achieved through constant “reward” activation and by treating real human intimacy as a threat.

It’s working – Facebook generated $55 billion in advertising revenue in 2018. That’s $37 billion more than the intimate, unambiguous, nuanced, and reasoned reporting of all US newspapers combined.

If one’s life is mediated through Facebook and other mass surveillance media, the conditioning to be harvested for cash is almost limitless.

My Apology to Starhunter Redux

 

Michael Paré as Dante Montana.

I owe low-budget sci-fi series Starhunter Redux, which I've been watching on Amazon Prime, an apology.

I'm sorry for calling you "total ass", for saying your sets look like total amateur shit, and for saying you're worse than Star Trek: Voyager on a bad day. You are none of those things.

Well, maybe a few of those things. The obscure, 44-episode series was definitely shot on a shoestring. Not a "Doctor Who in the 70s" tin-foil-and-cellophane budget, but pretty close. This "Redux" version, released in 2017, updates the CGI shots and visuals... I can only imagine what Video Toaster monstrosity came before it.

As for the show itself, it's fairly simple as sci-fi premises go. It's the future, 2275. A ragtag, motley crew of bounty hunters traipses frontier star systems hunting escaped criminals. They fly about in their retrofitted cruise liner, the "Tulip." It's crewed by a stoic, cowboy-looking, anti-heroic captain (Michael Paré as Dante Montana); an impulsive yet tortured ex-soldier first mate (Claudette Roche as Lucretia Scott); and Dante's adopted niece (Tanya Allen as Percy Montana), serving as chief engineer, comic relief, and a childlike, naive foil to the hard-boiled command crew. Oh, and there's a floating AI called Caravaggio (Murray Melvin), who's a cross between Batman's Alfred and Mother on the Nostromo (Alien). I wish I was kidding.

Spoilers ahead.

Having a look through Amazon Prime, I thought I'd give it a go. I fucking loved Space Precinct 2040, another streaming hidden gem.

Oh boy. It did not start well.

The CGI was something out of a student project; sets held together with bubblegum and balsa wood. Red Dwarf could have pulled this off...but it was supposed to be funny. Then the story began.

I was laughing my ass off in the first episode; a leader of the shadowy organisation "The Orchard" kept referring to "genes" over and over. Eventually my mind shut down and I started thinking of denim jeans... it was all downhill from there.

However, it revealed a crucial piece of lore: The Orchard is tasked with sequencing the last unknown parts of the human genome, unlocking humanity's elevation to a higher plane of existence – "The Divinity Cluster." There's a bit of shakiness to the stories, but it's rather well-executed science fiction nonetheless.

An entire backstory for Lucretia as a soldier details her liberating a cruel medical experimentation facility on Callisto. The experiments, conducted by a Dr. Mengele-style character, sparked a genocide of "pure" humans against cybernetic or genetic "augments"; the story is particularly heartbreaking, especially when she confronts the Mengele character head-on.

All this sounds like they're ripping off Cowboy Bebop, Firefly, and Deus Ex: Mankind Divided, except Starhunter predates the latter two by a number of years (it debuted in 2000 – so two years in the case of Firefly). Even so, similarities to these and other shows definitely don't end there.

The series thus far has a Doctor Who vibe; adventurers unbound by a "hero's code" or Prime Directive. They're thrust into solving other people's problems while trying to make a quick buck. They're usually navigating some gnarly moral grey areas and duking it out in some bargain-basement action sequences.

Oh yeah, it's done on the very cheap. Which is fine - science fiction can be as cheap as it wants and still be good science fiction.

Cheap Science Fiction is Still Science Fiction

Science fiction as a visual medium requires so much of the viewer's intellectual attention and imagination to fill in the gaps. Yes friends, the Doctor's TARDIS - a blue police box - is bigger on the inside. It also goes anywhere in time and space. If it looks shoddy and made out of plastic, who cares? The fact that it exists at all requires an extreme suspension of disbelief in the first place.

The better the visuals, the more the imagination suffers. The shitty cave in the Doctor Who serial The Pirate Planet, in which Tom Baker's Doctor paces back and forth because they've only got about three feet of green screen to work with, would look like complete shit in a crime series or adventure film (just go to a regular cave??). The fact they're walking on the destroyed surface of a planet sandwiched inside a planet-crushing spaceship makes it feel like it was plucked whole from Douglas Adams' mind and committed to film. It's real enough to tell the story; that's what matters. Those au fait with sci-fi know the score.

My beloved Babylon 5 looks like shit. Really, really bad. It'll only look worse as time goes on and production values go up. If Babylon 5 were a book reliant on imagination alone, it would be one of the most compelling and wondrous science fiction books ever written. It's outstanding, especially considering creator and writer J. Michael Straczynski went through an ordeal to realise his vision at all.

In modern storytelling, we can conjure near-real computer-generated, well, anything you can think of. We don't really have to imagine an "eldritch terror" like a Cthulhu or a Godzilla, it's there. We made one. See? The less imagination goes in, the less imagination we invite from the viewer. It's no wonder so many people are turned off by special effects-driven science fiction like the latest Star Wars or Marvel films.

What really killed the sci-fi imagination factor for me was Star Trek: Discovery (in more ways than one). In one episode, hundreds of repair droids started fixing the Enterprise's phaser-scarred hull mid-battle. Captain Pike ordered "damage control" – and we saw damage being controlled. In minute detail.

I would have been satisfied if he just said the line and moved on. (Hopefully as rocks fell around him and sparks flew.) Every other Star Trek incarnation had a captain or flag officer barking orders for "damage control." In our minds, the damage is being controlled. It's the 23rd/24th Century, of course it's being controlled. They have a device or a computer program called "damage control" and its function is to control damage. The end.

Please accept my apology, Starhunter Redux. I lost sight of science fiction being an intellectual, imaginative exercise more so than a visual feast handed to me on a platter. It's the writer's job to strike the right balance between what's believable and how much we can "imagine" the story up ourselves. In this case, I think they've nailed it. For the most part.

Some of your story elements were a bit hokey, but your ideas feel rich and engrossing. I'm only part way through, so it could fall over like the second season of Andromeda or something. I hope not.

I'll promise not to judge a TV series by the quality of its sets and CGI in future. Maybe.

Angering Ourselves To Death – Postman’s Brave New World Re-Re-Visited - Chapter 4

Chapter IV: Analog People, Binary Computers

Composite image of Abraham Lincoln and Stephen Douglas.

According to polling and analytics firm Nielsen in their 2018 Total Audience Report, adults over the age of eighteen in the United States use electronic devices for over ten and a half hours a day. More time is spent using electronics than sleeping. Use of information and communications technology (ICT) – computers, smartphones, TVs, tablets, etc. – is an inescapable part of modern life. The widespread adoption of the Internet in the late 1990s was as revolutionary to civilisation as the mechanical clock or the printing press. There is no going back to a state prior to ICT – it is not an additive invention; it is a transformative one.

In Postman’s and French sociologist Jacques Ellul’s view, the bargain we make with new technology is Faustian; whatever problems a new technology solves, it seems to take something away at the same time. In the late 1990s and early 2000s, some magazines and TV programs seemed to pine for a pre-ICT era of face-to-face interaction and community, while the counter-argument was found in the widespread adoption of Usenet groups, BBSes, forums, and nascent forms of social media such as LiveJournal, Xanga, and, in the latter part of the 2000s, MySpace, Twitter, and Facebook.

As discussed in the previous three chapters, our uncritical flocking to smartphones and mass communication media has created a culture of paranoia, distrust, and mass surveillance. Each day, hour, and minute we’re inundated with terabytes of information, standing at a crossroads every time we scroll a screen – do you like this, or don’t you?

Wiener’s conception of man using machines to improve man has been turned on its head; we’re now using machines to gain feedback from man to improve the machines. The almighty algorithm processes data in binary terms – PageRank for Google, News Feed for Facebook. In effect, it has transformed how we interpret and respond to information. The new mass media has purposely excluded the middle; its entire architecture erases it. There is nothing in between 0 and 1 in binary – it either is, or it is not.

Amid the heightened emotion and shock of the September 11, 2001 Terrorist Attacks on the United States, President George W. Bush addressed the nation, saying "Our grief has turned to anger, and anger to resolution. Whether we bring our enemies to justice, or bring justice to our enemies, justice will be done…Every nation, in every region, now has a decision to make. Either you are with us, or you are with the terrorists.”

The post-Cold War era of a “multipolar” or “unipolar” world order had come to an end. In the age of the binary computer came binary politics and binary culture. There is no nuance. There will be no negotiation, no terms. You’re either with us, or you’re against us.

The Aristotelian law of the excluded middle is not new; it was used to great effect in totalitarian dictatorships across time, Nazi Germany and Soviet Russia being the prime examples. All things deemed acceptable were “Aryan” or “Bolshevik,” while whatever fell outside this purview was “degenerate,” “Jewish,” or “bourgeois.” Like 1984, this required doublethink to pull off with any degree of success; for example, Germany’s military pact with Japan required Nazi lawmakers to declare the Japanese people “honorary Aryans” to preserve their binary worldview.

Many of today’s “thinkpieces” by the identitarian Left and Right seem to subscribe to a two-valued orientation – either you are on our side, or you are a “Nazi”, a “white supremacist,” or a “Socialist.” These tags are used with wild abandon – they were applied to moderate Republican Mitt Romney in the 2012 Presidential Election and resurrected in 2016 during the ascendancy of Donald J. Trump’s candidacy. The Trump candidacy may have been the first to use the two-valued orientation to its advantage – either you wanted to Make America Great Again, or you wanted America to fail. The same identitarian trope backfired on the Hillary Clinton campaign when she called “half” of Trump’s supporters “a basket of deplorables” on September 9, 2016 – almost 15 years after President Bush made his rallying speech:

“They're racist, sexist, homophobic, xenophobic – Islamophobic – you name it. And unfortunately, there are people like that. And he has lifted them up. He has given voice to their websites that used to only have 11,000 people – now have 11 million. He tweets and retweets their offensive hateful mean-spirited rhetoric. Now, some of those folks – they are irredeemable, but thankfully, they are not America.”

Clinton’s two-valued orientation is clear – you’re either With Hillary as an American, or an irredeemable “deplorable.”

The ensuing tribalism is a feature (not a bug) of our artificial-intelligence-driven “siloing” of people along racial, gender, political, and other lines. Whereas mass-market advertising broadcast messages such as buying jeans to “look sexy,” only a sub-section of a sub-section of the population may have been receptive to the message; to the rest, it passed undetected. By giving up our preferences and dislikes to an algorithm, we have been sorted into self-selecting groups that are binary in nature: you are either for “spicy memes,” or you are a “soyboy cuck” or “white supremacist patriarch”, to use two opposing invectives.

Rushkoff calls this AI-driven, agenda-adopted polarisation of civil society an “exploit” of our human emotions: these memes and narratives are designed to trigger the amygdala into fight-or-flight, kill-or-be-killed responses – bypassing the rationality and logic granted by the neocortex. Postman, in Amusing Ourselves To Death, remarked that we were, in 1985, living in a soundbite culture, where a complex idea or nuanced debate was chopped up into bite-size, three-to-four-second clips. This is distilled even further into repeating GIF images, image macros, screenshots of tweets, and other internet errata. The algorithm serves to reinforce long-held viewpoints, shut out debate, and exclude the middle. Worse still, it is learning new methods of exploiting us, as Facebook's AI learned to speak its own created language in 2017.

Even search engines are engineered to cater to our biases. For example, if one searches for the “benefits of stevia,” one may never see results for the “drawbacks of stevia” unless one searches with terms that criticise stevia as a sugar substitute. There is money to be made spruiking both sides of the argument. This is harmless in debates of taste, but not of fact; it can lead to disasters such as the January 2019 measles outbreak in the continental United States, partly blamed on pockets of unvaccinated people.
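The dynamic can be sketched with a toy keyword-matching “search engine” over an invented handful of pages – nothing like a real engine’s ranking, which is vastly more sophisticated – but it shows how results can simply mirror the framing of the query: a search for benefits never volunteers the drawbacks.

```python
# Toy keyword search over an invented corpus: results echo the query's framing.

corpus = [
    "The benefits of stevia: a natural, zero-calorie sweetener",
    "Ten benefits of stevia for people cutting sugar",
    "Drawbacks of stevia: aftertaste and possible gut effects",
]

def search(query, documents):
    # Return pages containing every query word – no balance, no counter-evidence,
    # no fact-checking, just literal matching of whatever the user asked for.
    terms = query.lower().split()
    return [doc for doc in documents if all(term in doc.lower() for term in terms)]

print(search("benefits of stevia", corpus))   # only the affirming pages
print(search("drawbacks of stevia", corpus))  # only the critical one
```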

The binary orientation does not require context, nor does it require extraordinary evidence to back up its often extraordinary claims. To the identitarian left, the Mueller investigation into the Trump campaign’s alleged collusion with Russian agents is a “smoking gun” or “did not go far enough.” To the identitarian right, the President was subjected to a “witch hunt” and “dirty politics” by his opponents. The excluded middle may not be the factual truth, but it may lead us to further questions and increased (though not total) accuracy in reporting and in coming to a conclusion.

In April 2019, podcaster and entertainer Joe Rogan of The Joe Rogan Experience invited Twitter CEO Jack Dorsey and Twitter’s Legal, Policy and Trust & Safety Lead Vijaya Gadde to debate journalist and YouTuber Tim Pool on whether Twitter’s free speech policy is biased against conservative and libertarian voices. Prominent right-leaning personalities such as Alex Jones and Milo Yiannopoulos had been banned from the platform, for example.

The entire episode runs for three hours and twenty-five minutes – longer than a single one of the 1858 “great debates” between Republican Abraham Lincoln and Democrat opponent Stephen Douglas. Of course, the worlds the podcast and that oratory inhabit are alien to one another. Commentators chopped Rogan’s podcast into soundbites, often decontextualised to fit identitarian, binary narratives. Setting aside three hours of screen time to focus on one debate seems folly, considering the sheer amount of content on offer at any given time.

The computers are using us to profit from us; with the soil fertile for commodifying dissent, how does one cash in using the modern mass surveillance, binary-oriented media ecology?

 Next Chapter: Angry Reacts Only – Harvesting Cash from the Media Ecology