1. 24

  2. 18

    Frankly, investing 2 ETH in an attempt to prevent a hypothetical future AI from torturing you for all eternity makes a lot more sense than most other cryptocurrency pitches.

    1. 9

      “Readers interested in Lisp, compilers, or Ethereum will enjoy this article.”

    2. 4

      So if a bunch of people decide to fork their own versions of Roko’s Ransomware, which one should I pay the protection fee to so as not to be tortured for eternity?

      1. 3

        Addressed in the Charlie Stross blog post I referenced in a comment in this thread:

        why should we bet the welfare of our immortal souls on a single vision of a Basilisk, when an infinity of possible Basilisks are conceivable?

        1. 2

          Good stuff.

          There’s also the question “why should we care about hypothetical copies of ourselves in the future?” - after all, there should be hypothetical copies of ourselves in parallel universes, and if the present universe is infinite, there would be an infinity of copies of ourselves here, some portion in hell, some in heaven, some in bizarre purgatories.

          Moreover, even if you posit a god-like intelligence able to accomplish virtually anything in the future, that godlike intelligence seems unlikely to be able to sift through the quantum noise to create truly exact copies of ourselves (I could make reference to the “no-cloning” theorem of quantum mechanics, etc.). So the hypothetical punished copies wouldn’t even be faithful copies, any more than the copies suffering whatever other fates might await elsewhere or elsewhen.

          It seems like the construct illustrates the difficulty humans have in separating intelligent ideas from garbage thoughts when conceiving of AIs (who has noticed that humans follow stated goals in a highly nuanced fashion rather than a literalistic one? Not LessWrong, it seems - or at least they haven’t considered that this is a key part of our being “more intelligent” than computer programs, of the way we’re still better than programs).

          1. 3

            As for hypothetical copies — this version of the basilisk seems to be worded carefully enough to say that you cannot be sure whether you are currently the pre-Singularity original or a simulated copy.

      2. 3

        You might want to move the disclaimer to the top of the page, for your own legal protection, just in case the satire’s a bit too subtle for someone.

        1. 5

          I agree to refund anyone who purchases a Basilisk Protection Charm yet ends up tormented for all eternity by a rogue AGI.

          Hopefully that warranty avoids the need for any lawsuits.

          1. 2

            I don’t see why not; code is law, after all.

          2. 2

            Do you believe @MichaelBurge will be sued by someone who believes in the Basilisk?

            Or by angry cryptocoin investors?

          3. 2

            This feels like the future from Accelerando by Charles Stross.

            1. 4

              The first time I heard of the Basilisk was on Stross’ blog:

              www.antipope.org/charlie/blog-static/2013/02/rokos-basilisk-wants-you.html

            2. 1

              MichaelBurge, your little “Pascal’s Wager” table is missing an attractive option: “HACK THE PLANET!”

              …from http://www.imdb.com/title/tt0113243/quotes/qt0448615, of course, but may it take on new meaning for all you woke sims out there!