Roko’s Basilisk and the ridiculous side of technofuturism

I am interested in technology -- my degrees are in Computer Science, after all, and I still mostly pay the bills by creating software and wrangling machines. And I am interested in the future, since as the cliche goes, that's where we'll spend the rest of our lives. But hardcore technofuturism is full of pathetic, confused people, like poor Ray Kurzweil, who wants to build a computer simulation of his dead father. It's mostly old silly religious ideas dressed up in new silly science fiction. They replaced the Rapture with the Singularity, and now they've replaced Pascal's Wager with Roko's Basilisk. It would merely be sad-to-amusing -- except that some of these people have a lot of money and influence.

The Most Terrifying Thought Experiment of All Time (Slate)

One day, LessWrong user Roko postulated a thought experiment: What if, in the future, a somewhat malevolent AI were to come about and punish those who did not do its bidding? What if there were a way (and I will explain how) for this AI to punish people today who are not helping it come into existence later? In that case, weren’t the readers of LessWrong right then being given the choice of either helping that evil AI come into existence or being condemned to suffer?

... It was a thought experiment so dangerous that merely thinking about it was hazardous not only to your mental health, but to your very fate.
