Resorting to the clever-smarty-pants-academic titles here, I know. There’s so much you can do if you can say in two phrases what you should really only say with one. Or, I guess — not say at all.
But I said it.
Just a couple of remarks germane to the design fiction trope in this installment of the Design Fiction Chronicles — short reviews of quirky, mostly middling, sometimes good science fiction that exhibits some of the unwritten principles of design fiction.
The first film is called The Final Cut (2004, with Robin Williams, not the one with Jude Law), which Younghee mentioned while we were talking about data and death — two great dinner topics that somehow go together. It's the story of a guy who is the editor of films that document a person's life, where the principal photography (I guess you'd call it?) comes from audio-video implants their parents have gleefully had implanted into them as unborn embryos. So, their entire lives are being recorded, in a kind of logical cautionary conclusion to Gordon Bell's half-cocked idea of wearing a camera/microphone everywhere/always.
Let's forget debating whether or not it's a "good movie" — there are some wonderful design fiction-y things going on in here, the most obvious being the story's insights into the possible implications of this kind of first-person continuous continuous attention.
There are two positions on this technology as delivered in the context of the story itself.
The first is that it is a wonderful way to share one's life at these memorials, which only happen after you have died. People get to see the happy moments in "Rememory" ceremonies, which are like video wakes or something. You can also have these edited "best of" moments rolling on a flat screen at your fancy mausoleum, etc.
The alternate position is that it is invasive; that it is entirely subjective, because decades are edited down and all kinds of things are edited out; and that it is morally wrong, because once the implants are in, you can't get them out. You can only attempt to block them through (this is cool) underground tattooing procedures that create disruptive blocking patterns, rendering the technology inside your head useless or hobbled.
Another intriguing thing, besides the way the drama works through the implications of these implants and of this tattooing culture and practice, is the somewhat retro style of the special editing gear used by "Cutters" — the specially trained, small clique of people who do the editing. Their machines are wooden computational editing consoles, the portable edition basically a giant laptop that looks like it is configured for Avid editing. The desktop edition takes up an entire standing desk with three wooden monitors. I thought it was curious — this use of wood materials is something you don't often see in science fiction.
"The Final Cut" also shows these data banks — like money banks in their production design — in which all the memories of people are stored. They sit in a proper, marble-floored bank with tenders who retrieve the cartridges of memories from safety-deposit-style vaults, which seems about right. This is the "raw" footage — everything that was captured — and, as such, it needs to be protected in a trusted, safe place.
Despite the funky story — I won't give it away — I found this to be a rather unique perspective on the personal, perpetual aggregation of everything-I-do/see/hear mode of full-time life-logging, or whatever the kids call it these days. It revealed some of the no-shit implications and suggested some other concerns and issues — the things that can come back to bite you in the future, or even after you die, as well as the ability to relive something and see it with more mature eyes.
Okay, the other film I saw right before watching The Final Cut is called Eagle Eye, with Shia LaBeouf, Michael Chiklis and Billy Bob Thornton. It is mostly about having one exploding/falling/head-cracking special effect lead to another, but it is also about a computer that has slipped its moorings and is "activating" normal humans to do its evil, world-changing bidding. Its programming just gets carried away, basically. Sort of "The Manchurian Candidate" meets "Live Free or Die Hard" meets "Colossus: The Forbin Project". (And apropos of the recent revelation by the artificial intelligence intelligentsia that perhaps, you know — computers may become smarter than us, and maybe we should think this whole AI thing through a bit.)
"Eagle Eye" is your prototypical sci-fi thriller of this "computer-takes-over" idiom, with basically only enough strung-together bits of dialog to get you to the next 12-minute action chase/smash/blast/flash-bang-grenade sequence — the effects-driven film. There isn't much to say here except for two things apropos of small props, designed artifacts and some aspects of the production design that tell a larger story about possible near future worlds to be either created or avoided.
First, this peculiar visual coupling system. The AI computer in the film, called Aria and played by Julianne Moore (of all people), has to look at other screens in order to do its surveillance work. Basically, it's in an enormous silo-like chamber with zillions of displays, like the compound eyeball of a bug or a huge honeycomb. It is as if Aria's main viewing apparatus hunts around for possible events happening in the world — only one at a time. Of course, this is a suggestive prop, meant to indicate in a material way how a computer could put much of the world under surveillance at once — a tricky bit of design work to convey to a general audience. Despite the technical preposterousness of this apparatus, there's something compelling about it nonetheless, at least as a visual solution to something that could get tediously didactic if you told it rather than showed it.
Second — the computer voice. In both “The Final Cut” and “Eagle Eye”, there was this female computer voice that sounded identical to me — an instance of this character confusion phenomenon. Like — wait…which film am I watching that has the anxiety-generating computer robot voice?
Obviously they are not done by the same voice actor, but it got me thinking about the voices of AI computers. Curiously, the voice of the robot/computer/AI in the recent "Moon" was a man — played by Kevin Spacey. And, of course, the voice of HAL in 2001 is male as well. I couldn't think of any other men, except I think the computer in Lost in Space was sort of gender-neutral, if I remember — though probably closer to a eunuch who smokes too much.
But this got me thinking — is there an archetype of the artificial intelligence computer voice? Does the voice's perceived gender matter? What gives?
Why do I blog this? Consumed with the implications of two modes of surveillance as depicted in these design fictions — the implant that life-logs every moment of our lives, and the all-seeing/all-controlling, fully cloud-connected artificial intelligence computer.
Gordon Bell's MyLifeBits, and his "Total Recall" book about his experiences with the SenseCam and "the e-memory revolution."
(** Ack. The painful irony that technologists borrow — perhaps without knowing it? — their book title from a cautionary science fiction film about fake memory implants to fuel their own hyped technofantasy visions of the future. I met Bell at USC once. We had lunch in the faculty club. Nice enough guy, but chock full of the expected greying luminary's hubris. A colleague — also a heavy technogeek — simply couldn't fathom the possibility that continuous continuous self-monitoring could lead to any sort of problems. Just always be good, was about all that could be offered. Bleech. **)
Slife to tell you where all your time goes when you’re sitting in front of the computer screen. Clever self-analytics.
Nokia Everybit, a project at Nokia to use the mobile phone as a continuous continuous monitor of movement, calls, events, music listened to, WiFi and Bluetooth nodes encountered, etc., etc.