My Apology to Starhunter Redux


Michael Paré as Dante Montana.

I owe low-budget sci-fi series Starhunter Redux, which I've been watching on Amazon Prime, an apology.

I'm sorry for calling you "total ass", for saying your sets look like total amateur shit, and for claiming you're worse than Star Trek: Voyager on a bad day. You are none of those things.

Well, maybe a few of those things. The obscure, 44-episode series was definitely shot on a shoestring. Not a "Doctor Who in the 70s" tin-foil-and-cellophane budget, but pretty close. This "Redux" version, released in 2017, updates the CGI shots and visuals... I can only imagine what Video Toaster monstrosity came before it.

As for the show itself, it's fairly simple as sci-fi premises go. It's the future, 2275. A ragtag, motley crew of bounty hunters traipses across frontier star systems hunting escaped criminals. They fly about in their retrofitted cruise liner, the "Tulip." It's crewed by a stoic, cowboy-looking, anti-heroic captain (Michael Paré as Dante Montana); an impulsive yet tortured ex-soldier first mate (Claudette Roche as Lucretia Scott); and Dante's adopted niece (Tanya Allen as Percy Montana), who serves as chief engineer, comic relief, and a childlike, naive foil to the hard-boiled command crew. Oh, and there's a floating AI called Caravaggio (Murray Melvin), who's a cross between Batman's Alfred and Mother, the Nostromo's computer in Alien. I wish I were kidding.

Spoilers ahead.

Having a look through Amazon Prime, I thought I'd give it a go. I fucking loved Space Precinct 2040, another streaming hidden gem.

Oh boy. It did not start well.

The CGI was something out of a student project; the sets held together with bubblegum and balsa wood. Red Dwarf could have pulled this off... but Red Dwarf was supposed to be funny. Then the story began.

I was laughing my ass off in the first episode; a leader of the shadowy organisation "The Orchard" kept referring to "genes" over and over. Eventually my mind shut down and I started thinking of denim jeans... it was all downhill from there.

However, it revealed a crucial piece of lore: The Orchard is tasked with sequencing the last unknown parts of the human genome, unlocking humanity's elevation to a higher plane of existence known as the "Divinity Cluster." There's a bit of shakiness to the stories, but it's rather well-executed science fiction nonetheless.

An entire backstory for Lucretia as a soldier details her liberating a cruel medical experimentation facility on Callisto. The experiments, conducted by a Dr. Mengele-style character, sparked a genocide by "pure" humans against cybernetic or genetic "augments"; the story is particularly heartbreaking, especially when she confronts the Mengele figure head-on.

All this sounds like they're ripping off Cowboy Bebop, Firefly, and Deus Ex: Mankind Divided, except Starhunter predates the latter two by a number of years (it debuted in 2000, so two years in the case of Firefly). Even so, similarities to these and other shows definitely don't end there.

The series thus far has a Doctor Who vibe; adventurers unbound by a "hero's code" or Prime Directive. They're thrust into solving other people's problems while trying to make a quick buck. They're usually navigating some gnarly moral grey areas and duking it out in some bargain-basement action sequences.

Oh yeah, it's done on the very cheap. Which is fine - science fiction can be as cheap as it wants and still be good science fiction.

Cheap Science Fiction is Still Science Fiction

Science fiction as a visual medium requires so much of the viewer's intellectual attention and imagination to fill in the gaps. Yes friends, the Doctor's TARDIS - a blue police box - is bigger on the inside. It also goes anywhere in time and space. If it looks shoddy and made out of plastic, who cares? The fact that it exists at all requires an extreme suspension of disbelief in the first place.

The better the visuals, the more the imagination suffers. The shitty cave in the Doctor Who serial The Pirate Planet, in which Tom Baker's Doctor paces back and forth because they've only got about three feet of green screen to work with, would look like complete shit in a crime series or adventure film (just go to a regular cave??). The fact that they're walking on the destroyed surface of a planet engulfed by a planet-crushing spaceship makes it feel like it was plucked whole from Douglas Adams' mind and committed to film. It's real enough to tell the story; that's what matters. Those au fait with sci-fi know the score.

My beloved Babylon 5 looks like shit. Really, really bad. It'll only look worse as time goes on and production values go up. If Babylon 5 were a book reliant on imagination alone, it would be one of the most compelling and wondrous science fiction books ever written. It's outstanding, especially considering creator and writer J. Michael Straczynski went through an ordeal to realise his vision at all.

In modern storytelling, we can conjure near-real computer-generated, well, anything you can think of. We don't really have to imagine an "eldritch terror" like Cthulhu or Godzilla; it's there. We made one. See? The less imagination goes in, the less imagination we invite from the viewer. It's no wonder so many people are turned off by special effects-driven science fiction like the latest Star Wars or Marvel films.

What really killed the sci-fi imagination factor for me was Star Trek: Discovery (in more ways than one). In one episode, hundreds of repair droids started fixing the Enterprise's phaser-scarred hull mid-battle. Captain Pike ordered "damage control" - and we saw damage being controlled. In minute detail.

I would have been satisfied if he just said the line and moved on. (Hopefully as rocks fell around him and sparks flew.) Every other Star Trek incarnation had a captain or flag officer barking orders for "damage control." In our minds, the damage is being controlled. It's the 23rd/24th Century, of course it's being controlled. They have a device or a computer program called "damage control" and its function is to control damage. The end.

Please accept my apology, Starhunter Redux. I lost sight of science fiction being an intellectual, imaginative exercise more so than a visual feast handed to me on a platter. It's the writer's job to strike the right balance between what's believable and how much we can "imagine" the story up ourselves. In this case, I think they've nailed it. For the most part.

Some of your story elements were a bit hokey, but your ideas feel rich and engrossing. I'm only part way through, so it could fall over like the second season of Andromeda or something. I hope not.

I'll promise not to judge a TV series by the quality of its sets and CGI in future. Maybe.

Angering Ourselves To Death – Postman’s Brave New World Re-Re-Visited – Chapter 4

Chapter IV: Analog People, Binary Computers

Composite image of Abraham Lincoln and Stephen Douglas.

According to polling and analytics firm Nielsen in their 2018 Total Audience Report, adults aged eighteen and over in the United States use electronic devices for over ten and a half hours a day. More time is spent using electronics than sleeping. Use of information and communications technology – computers, smartphones, TVs, tablets, etc. – is an inescapable part of modern life. The widespread adoption of the Internet in the late 1990s was as revolutionary to civilisation as the mechanical clock or the printing press. There is no going back to a state prior to ICT – it is not an additive invention; it is a transformative one.

In Postman’s and French sociologist Jacques Ellul’s view, the bargain we make with new technology is Faustian; whatever problems a new technology solves, it seems to take something away at the same time. In the late 1990s and early 2000s, some magazines and TV programs seemed to pine for a pre-ICT era of face-to-face interaction and community, while the counter-argument was found in the widespread adoption of Usenet groups, BBSes, forums, and nascent forms of social media such as LiveJournal, Xanga, and, in the latter part of the 2000s, MySpace, Twitter, and Facebook.

As discussed in the previous three chapters, our uncritical flocking to smartphones and mass communication media has created a culture of paranoia, distrust, and mass surveillance. Each day, hour, and minute, we’re inundated with terabytes of information and stand at a crossroads every time we scroll a screen – do you like this, or don’t you?

Wiener’s conception of man using machines to improve man has been turned on its head; we’re now using machines to gain feedback from man to improve the machines. The almighty algorithm processes data in binary terms – PageRank for Google; the News Feed for Facebook. In effect, it has transformed how we interpret and respond to information. The new mass media purposely excludes the middle; its entire architecture has erased it. There is nothing between 0 and 1 in binary – it either is, or it is not.

Amid the heightened emotion and shock of the September 11, 2001 terrorist attacks on the United States, President George W. Bush addressed the nation, saying: “Our grief has turned to anger, and anger to resolution. Whether we bring our enemies to justice, or bring justice to our enemies, justice will be done… Every nation, in every region, now has a decision to make. Either you are with us, or you are with the terrorists.”

The post-Cold War era of a “multipolar” or “unipolar” world order had come to an end. At the dawn of the binary computer came binary politics and binary culture. There is no nuance. There will be no negotiation, no terms. You’re either with us, or you’re against us.

The Aristotelian law of the excluded middle is not new; it was used to great effect in totalitarian dictatorships across time, with Nazi Germany and Soviet Russia being the prime examples. All things deemed acceptable were “Aryan” or “Bolshevik,” while whatever fell outside this purview was “degenerate,” “Jewish,” or “bourgeois.” Like 1984, this required doublethink to pull off with any degree of success; for example, Germany’s military pact with Japan required Nazi lawmakers to concede that the Japanese people were “honorary Aryans” to preserve their binary worldview.

Many of today’s “thinkpieces” by the identitarian Left and Right seem to subscribe to a two-valued orientation – either you are on our side, or you are a “Nazi,” a “white supremacist,” or a “Socialist.” These tags are used with wild abandon – they were applied to moderate Republican Mitt Romney in the 2012 Presidential Election and resurrected in 2016 during the ascendancy of Donald J. Trump’s candidacy. The Trump candidacy may have been the first to use the two-valued orientation to its advantage – either you wanted to Make America Great Again, or you wanted America to fail. The same identitarian trope backfired on the Hillary Clinton campaign when Clinton called “half” of Trump’s supporters “a basket of deplorables” on September 9, 2016 – almost 15 years to the day after President Bush made his rallying speech:

“They're racist, sexist, homophobic, xenophobic – Islamophobic – you name it. And unfortunately, there are people like that. And he has lifted them up. He has given voice to their websites that used to only have 11,000 people – now have 11 million. He tweets and retweets their offensive hateful mean-spirited rhetoric. Now, some of those folks – they are irredeemable, but thankfully, they are not America.”

Clinton’s two-valued orientation is clear – you’re either With Hillary as an American, or an irredeemable “deplorable.”

The ensuing tribalism is a feature (not a bug) of our artificial-intelligence-driven “siloing” of people along racial, gender, political, and other lines. Whereas mass-market advertising broadcast messages such as buying jeans to “look sexy,” only a subsection of a subsection of the population may have been receptive to the message. To the rest, it passed undetected. By giving up our preferences and dislikes to an algorithm, we have been sorted into self-selecting groups that are binary in nature: you are either for “spicy memes,” or you are a “soyboy cuck” or a “white supremacist patriarch,” to use two opposing invectives.

Rushkoff calls this AI-driven, agenda-adopted polarisation of civil society an “exploit” of our human emotions: these memes and narratives are designed to trigger our amygdala into fight-or-flight, kill-or-be-killed responses, bypassing the rationality and logic granted to us by the neocortex. Postman, in his Amusing Ourselves To Death, remarked that we were, in 1985, living in a soundbite culture, where a complex idea or nuanced debate was chopped up into bite-size, three-to-four-second clips. This is distilled even further into repeating GIF images, image macros, screenshots of tweets, and other internet errata. The algorithm serves to reinforce long-held viewpoints, shut out debate, and exclude the middle. Worse still, it is learning new methods of exploiting us, as when Facebook’s AI agents developed their own language in 2017.

Even search engines are engineered to cater to our biases. For example, if one searches for the “benefits of stevia,” one may never see results for the “drawbacks of stevia” unless one searches for terms that criticise stevia as a sweetener. There is money to be made spruiking both sides of the argument. That is harmless in debates of taste, but not of fact. It can lead to disasters such as the January 2019 measles outbreak in the continental United States, partly blamed on pockets of unvaccinated people.

The binary orientation does not require context, nor does it require extraordinary evidence to back up its often extraordinary claims. To the identitarian left, the Mueller investigation into alleged collusion between President Trump’s campaign and Russian agents is either a “smoking gun” or “did not go far enough.” To the identitarian right, the President was subject to a “witch hunt” and “dirty politics” by his opponents. The excluded middle may not be the factual truth, but it may lead us to further questions and increased (though not total) accuracy in reporting and in coming to a conclusion.

In April 2019, podcaster and entertainer Joe Rogan of the Joe Rogan Experience invited Twitter CEO Jack Dorsey and Twitter Legal, Policy and Trust & Safety Lead Vijaya Gadde to debate journalist and YouTuber Tim Pool as to whether Twitter’s policy on free speech is biased against conservative and libertarian voices. Prominent right-leaning personalities such as Alex Jones and Milo Yiannopoulos were banned from the platform, for example.

The entire episode runs for three hours and twenty-five minutes – longer than any single one of the 1858 “great debates” between Republican Abraham Lincoln and his Democratic opponent Stephen Douglas. Of course, the worlds the podcast and nineteenth-century oratory inhabit are alien to one another. Commentators chopped up Rogan’s podcast into soundbites, often decontextualised to fit identitarian, binary narratives. Setting aside three hours of screen time to focus on one debate is folly, considering the sheer amount of content on offer at any given time.

The computers are using us to profit from us; with the soil this fertile for commodifying dissent, how does one cash in on the modern mass-surveillance, binary-oriented media ecology?

 Next Chapter: Angry Reacts Only – Harvesting Cash from the Media Ecology

Angering Ourselves To Death – Postman’s Brave New World Re-Re-Visited – Chapter 3

Chapter III: The Medium Is The Mass Surveillance

 An Amazon Alexa-enabled device.

In March 2018, a whistleblower told the Observer newspaper that UK-based political consulting firm Cambridge Analytica had harvested over 50 million Facebook profiles in a breach of data and privacy. Christopher Wylie, who worked with an academic at Cambridge University to gather the data, told the Observer: “We exploited Facebook to harvest millions of people’s profiles. [We] built models to exploit what we knew about them and target their inner demons. That was the basis the entire company was built on.”

The data was collected through an app called thisisyourdigitallife, which posed as an online personality test. Exploiting various weaknesses in Facebook’s application programming interface (API), it collected profile information not only from those who authorised the app, but also from their friends and their friends’ friends.
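To make the mechanics concrete – this is a minimal, entirely hypothetical sketch, not Cambridge Analytica’s actual code and not Facebook’s real API – the reach of a quiz app that is allowed to walk a friend graph can be illustrated with a toy breadth-first harvest: one consenting user exposes every profile within a couple of hops.

```python
# Hypothetical sketch: how one opted-in user can expose a much larger
# network of profiles when an app may walk the friend graph.
# The graph, profiles, and field names are invented for illustration only.
from collections import deque

FRIEND_GRAPH = {              # toy stand-in for a social graph
    "alice": ["bob", "carol"],
    "bob": ["alice", "dave"],
    "carol": ["alice", "erin"],
    "dave": ["bob"],
    "erin": ["carol"],
}

PROFILES = {name: {"name": name, "likes": [f"page_{name}"]} for name in FRIEND_GRAPH}

def harvest(consenting_user: str, max_depth: int = 2) -> dict:
    """Collect every profile reachable within max_depth hops of one consenting user."""
    collected, seen = {}, {consenting_user}
    queue = deque([(consenting_user, 0)])
    while queue:
        user, depth = queue.popleft()
        collected[user] = PROFILES[user]      # the app keeps this profile
        if depth < max_depth:
            for friend in FRIEND_GRAPH[user]:
                if friend not in seen:
                    seen.add(friend)
                    queue.append((friend, depth + 1))
    return collected

# One "personality test" participant yields five harvested profiles here.
print(sorted(harvest("alice").keys()))
```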

The information was used to target voters during the 2016 United States Presidential Election and the 2016 UK referendum on remaining in or leaving the European Union.

At least Cambridge Analytica had the courtesy to allow users to opt in. The invasive XKeyscore and Boundless Informant programs used by the NSA to collect signals intelligence and conduct mass surveillance on US and foreign citizens afforded users no such luxury.

As mentioned earlier, Facebook and other social media do not sell products or services directly but provide a platform for marketers and advertisers to do so. The well-worn aphorism “the product they are selling is you” is a misnomer. If we take the semanticist Korzybski’s maxim to heart – the word is not the thing – they are not selling you specifically, but a one-to-one simulacrum of you that extends beyond your own consciousness. Human consciousness is also tempered by human unconsciousness; we forget, misplace information, and have moments of complete unawareness of our own behaviours.

Computers don’t.

A computer has perfect memory, perfect algorithms, perfect recall. It can know you better than you know yourself. Thus we, as humans, employ computers to learn more about our habits, wishes, frustrations, and desires. This is not ill or good in and of itself but can be used by humans in either fashion.

If we are being programmed, we are also being labelled, sorted, objectified, and tabulated. Facebook and its ilk have shifted human consciousness into accepting computers as a wholesale extension of our senses. For instance, it has become acceptable for activists to comb through large data sets such as Twitter feeds for politically incorrect comments: in December 2018, US entertainer and comedian Kevin Hart was ousted from his position as host of the 2019 Academy Awards over anti-gay slurs made on his Twitter account between 2009 and 2011.

Whereas a human being may struggle to remember specific comments uttered by anyone almost a decade prior, computers enhance our collective memories by providing a library of instant storage and retrieval of anything and everything we have said or posted online. This leads to another aphorism: this is a feature, not a bug.

It is possible that the original founders of Facebook, Mark Zuckerberg and Eduardo Saverin, had no intention of creating a mass surveillance medium the likes of which the world had never seen. According to after-the-fact reports, Zuckerberg created “FaceMash” in his Harvard University dorm room in 2003 as an application to rate the relative attractiveness of girls on campus. The concept was relaunched as “TheFacebook” in 2004, later renamed simply “Facebook.” The original site was limited to colleges in Boston, then expanded to all university-level institutions, and eventually, in September 2006, to all people with a valid email address (and over the age of 13).

Facebook exploited our desire for convenience and our want for human interaction. People could add “friends” on Facebook and share their opinions, photos, videos, and other content with one another. They could also join in on games. They could express their desires via an “opt-in” – the Facebook “like” button. “Liking” topics or webpages built up a profile of your preferences and interests, albeit manually. As of 2019, this is achieved via machine learning and artificial intelligence.
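As a rough illustration of that manually built, opt-in profile – the pages, topics, and scoring below are my own assumptions, not Facebook’s documented method – simply tallying explicit likes per topic is already enough to rank a user’s interests for targeting:

```python
# Hypothetical sketch: an explicit, opt-in interest profile built from
# "likes" alone. The pages, topics, and scoring are invented for illustration.
from collections import Counter

def build_profile(liked_pages):
    """Tally liked pages by topic to produce a crude interest profile."""
    profile = Counter()
    for page in liked_pages:
        profile[page["topic"]] += 1
    return profile

likes = [
    {"page": "Starhunter Fans", "topic": "sci-fi"},
    {"page": "Babylon 5 Appreciation", "topic": "sci-fi"},
    {"page": "Denim Outlet", "topic": "fashion"},
]

print(build_profile(likes).most_common())  # [('sci-fi', 2), ('fashion', 1)]
```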

Facebook acquired photo-sharing app Instagram in 2012 and instant-messaging service WhatsApp in 2014. It launched its own proprietary messaging platform, Messenger, in 2015. According to the End User Licence Agreements, Facebook could use these applications on your phone to harvest data about your habits, including your location. In 2016, Facebook strenuously denied eavesdropping on conversations or using one’s smartphone camera or microphone to pick up vision or audio. Facebook has spent millions of dollars on PR to counteract these claims, saying that advertising popping up in feeds is a result of “frequency bias” or just plain coincidence.

It was confirmed in April 2019 by Bloomberg that human technicians in the employ of Amazon listen to voice searches and other audio picked up by Alexa-enabled devices. This mix of contractors and employees based around the world is tasked with refining the voice search algorithm to produce better results. However, the nature of the medium is to have an “ear” out for keywords and phrases at all times. According to the article:

“Sometimes they [employees] hear recordings they find upsetting, or possibly criminal. Two of the workers said they picked up what they believe was a sexual assault. When something like that happens, they may share the experience in the internal chat room as a way of relieving stress. Amazon says it has procedures in place for workers to follow when they hear something distressing, but two Romania-based employees said that, after requesting guidance for such cases, they were told it wasn’t Amazon’s job to interfere.”

The media we consume and produce for is the mass surveillance; the Faustian bargain we’ve made with technology is coming back to haunt us in myriad ways. Mass surveillance by private entities is chilling enough; however, the panopticon effect of moral busybodies and invective-slinging do-gooders has also cost people their livelihoods. This public shaming by internet mob was made most famous in 2013, when corporate communications director Justine Sacco tweeted just as she departed for Cape Town: “Going to Africa. Hope I don’t get AIDS. Just kidding. I’m white!” A joke in poor taste, it nevertheless became the number one worldwide trending topic on Twitter, creating a storm of controversy before Ms. Sacco had even stepped off the plane.

Because of these interconnections, both public and private, the mass surveillance nature of media is inescapable. The mass surveillance is having a profound effect on the way we parse language and the meaning of that language, and it breaks down the tacit disconnect between language as action and language as thought in action. Nuance becomes impossible; tribal and identitarian sentiments are rising. We are analogue people being programmed to think in binary ways.