In 1982, Blade Runner saw four “replicant” androids hunting down and killing their creators. A year later, War Games predicted that a supercomputer would attempt to wipe out mankind with a game of thermonuclear war. And mere months after that, a naked Arnold Schwarzenegger materialised in Los Angeles and started shooting the place up under orders from artificial intelligence system Skynet. No wonder children of the 80s have a hard time trusting our computers.
But perhaps we’re not being so paranoid after all. Researchers at Cambridge University’s Centre for the Study of Existential Risk (CSER) are investigating the potential peril posed by advances in areas such as biotechnology, nanotechnology and artificial life, and say dismissing the possibility that machines could one day turn against us is “dangerous”.
“The seriousness of these risks is difficult to assess, but that in itself seems a cause for concern, given how much is at stake,” say the scientists on their website.
The CSER project is the brainchild of Cambridge philosophy professor Huw Price, cosmology and astrophysics expert Martin Rees and Skype co-founder Jaan Tallinn. Prof Price said the threat could come from “machines that are not malicious, but machines whose interests don’t include us,” adding that it was important for that message to reach the mainstream.
“It seems a reasonable prediction that some time in this or the next century intelligence will escape from the constraints of biology,” he said. “What we’re trying to do is to push it forward in the respectable scientific community.”
As a website, RadioTimes.com is, of course, a huge fan of technology and takes such claims with a pinch of salt. If (when), however, you do find yourself facing off against a marauding, self-aware machine in the midst of a (frankly, inevitable) robot apocalypse, we wish you the best of luck and remind you of the following: “It can’t be bargained with. It can’t be reasoned with. It doesn’t feel pity, or remorse, or fear. And it absolutely will not stop, ever, until you are dead.”
Here’s to the future…