Why is it so much easier to use computers in films?

Because if the computers in films were like the ones in real life, people would be endlessly turning them off and on again and standing around staring at the ceiling while they waited for a progress bar to inch across the screen. In short, it would be boring as hell. After all, the only time you call people to come and watch things happening on your computer is when you've found a video of someone playing the national anthem with their armpit, isn't it?

But the filmmakers' dilemma is that computers have become essential plot devices: nobody would believe that you could have a giant corporation where the secret plans to overthrow the rule of law weren't kept in a password-protected folder marked MOST SECRET. (That's where we keep all our plans for world domination, even the out-of-date ones.) A film world without computers is almost unthinkable, since they're a sort of white-collar gun - able to exert immense power at vast distances through minimal effort by the protagonist. Yet the film reckoned (by geeks) to be the most realistic depiction of computer use, Antitrust, garners one review at imdb.com suggesting "No script, no plot, horrible acting ... Take your money and go home and avoid this dung heap at all costs."

Now the usability expert Jakob Nielsen has done an analysis of the top 10 usability "errors" in films (tinyurl.com/y7erul), and points out that they fall into a pretty small group: the hero can use (and crack) any computer, and any computer can talk to absolutely any other computer, though often only after negotiating a series of frustrating but usefully huge-fonted "ACCESS DENIED" notices; computers can talk and understand you, when required; time travellers from the past and future can use our computers (at which Nielsen fulminates: "taken back in time to the Napoleonic wars and made captain of a British frigate, you'd have no clue how to sail the ship: you couldn't use a sextant and you wouldn't know the names of the different sails, so you couldn't order the sailors to rig the masts"). Nielsen adds that "it's highly unlikely that anyone from 2207 would have ever seen Windows Vista screens."

(Note how he says unlikely. Does he think it'll take Microsoft 200 years to produce the next version of Windows? Even scarier, has some time traveller visited him and said, "Oh, Vista"?)

As for talking computers, which so many people think would be a great idea, those are an "audience interface" rather than a user interface. And imagine the chaos of an office full of computers babbling: "Alert. You have new mail." In films, though, your email is never, ever spam; it was never explained how in You've Got Mail Tom Hanks and Meg Ryan, ostensibly AOL users, didn't have to wade through messages offering to enhance their body parts to reach their billets-doux.

Nielsen is also dismissive of three-dimensional interfaces, shown to most impressive effect in the film Minority Report (based on the short story by Philip K Dick, which didn't itself contain any computers at all). There, Tom Cruise waves his hands about to shove data around and find the about-to-be-baddie. In reality, says Nielsen, "it's very tiring to keep your arms in the air while using a computer"; in short, he says, "3D is for demos. 2D is for work."

The most marvellous and jaw-dropping plot device, of course, remains that from Independence Day, in which Jeff Goldblum uploads a virus from a Mac to the aliens' starship, crashing it (and the aliens to the ground). Probably - we hope - this represents the nadir of computer-driven plotting, bottoming out the path traced by Jurassic Park (in which the young girl exclaims "It's a Unix system! I know this!"). Let's just turn that one off - and not turn it back on again.

· If you'd like to comment on any aspect of Technology Guardian, send your emails to tech@theguardian.com
