1. 5

I know this isn’t super technical, but I find it utterly fascinating.

  1. 13

    I find this type of AI journalism extremely frustrating. It gives a broad audience the suggestion that human consciousness is partially modellable in digital systems. Obviously this makes great reading for your online highbrow magazine, but with gross oversimplification and recourse to storytelling it obscures both the technical distance between what such systems do and the simulation of anything remotely resembling human intelligence or consciousness, and the philosophical questions about the quality of an evolved wetware consciousness running on a digital system.

    1. 4

      I’m with you. She didn’t ‘bring him back’; she created a program that approximated responses based on a corpus of her interactions with him.

      Very.. Different.. Things.

      1. 3

        gives a broad audience the suggestion that human consciousness is partially modellable in digital systems.

        What makes you think it isn’t?

        Personally, I go back and forth on this question. I haven’t found a completely satisfactory answer/argument for either side.

        1. 1

          What makes you think it isn’t?

          The fact that it hasn’t been done. This doesn’t say anything about what may be possible in the future, but articles like this make it sound like it’s possible now, and that is pure, unadulterated stupidity.

        2. 3

          that human consciousness is partially modellable in digital systems.

          You don’t need to model human ‘consciousness’ though. You simply have to model human behaviour. You don’t even know if another human is ‘conscious’ or not. What you know is that they behave like humans.

        3. 5

          It turned out that the primary purpose of the bot had not been to talk but to listen.

          That’s a very interesting point. This is very much like the “Be Right Back” episode mentioned in the article, but the problem is that the emulation/facsimile/whatever you call it lacks a certain spark when improvising. It lacks creativity. But not much creativity is expected from a good listener.

          Are the pull quotes supposed to be having seizures or is that just my browser?

          1. 4

            Are the pull quotes supposed to be having seizures or is that just my browser?

            It’s not just your machine. They certainly glitch and twitch in both Safari and Firefox on my machine.

            1. 6

              It’s not just your machine. They certainly glitch and twitch in both Safari and Firefox on my machine.

              And it’s intentional: it’s done with a CSS animation they have on all .glitch elements. Why, why, why…

              1. 2

                .glitch?

                I didn’t know about this, but I still fail to see the practical use.

                1. 3

                  It’s not in the W3C standard or anything; it’s just a CSS class The Verge defined that has some wonky animations on it to make the text look twitchy. ;)
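
                  I can only guess at their actual rule, but the effect is roughly a looping jitter on anything with that class. A rough recreation, using the Web Animations API instead of their CSS keyframes (only the .glitch selector is taken from the page):

                  ```ts
                  // A guess at a ".glitch"-style twitch, recreated with the Web Animations
                  // API; this is not The Verge's actual CSS, only the class name is theirs.
                  document.querySelectorAll<HTMLElement>(".glitch").forEach((el) => {
                    el.animate(
                      [
                        { transform: "translate(0, 0)", opacity: 1 },
                        { transform: "translate(-2px, 1px)", opacity: 0.85 },
                        { transform: "translate(2px, -1px)", opacity: 1 },
                      ],
                      { duration: 300, iterations: Infinity },
                    );
                  });
                  ```

                  The `iterations: Infinity` part is what makes it twitch constantly rather than just once on load.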

            2. 4

              It’s basically how ELIZA succeeded. It’s thought of as a conversation bot, but it was really just what we’d call active listening.
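
              For anyone who hasn’t seen how little machinery that takes, here’s a toy sketch of the pattern-and-reflect trick (not Weizenbaum’s actual DOCTOR script; the patterns and reflections below are made up for illustration):

              ```ts
              // Toy "active listening" responder in the spirit of ELIZA: match a
              // pattern, flip first/second person, and hand the statement back.
              const REFLECTIONS: Record<string, string> = { i: "you", my: "your", am: "are", me: "you" };

              const RULES: Array<[RegExp, string[]]> = [
                [/^i feel (.*)$/, ["Why do you feel {0}?", "How long have you felt {0}?"]],
                [/^i (?:miss|remember) (.*)$/, ["Tell me more about {0}."]],
                [/^(.*)$/, ["I see. Go on.", "How does that make you feel?"]],
              ];

              // Swap first- and second-person words so the reply mirrors the speaker.
              function reflect(text: string): string {
                return text.split(/\s+/).map((w) => REFLECTIONS[w] ?? w).join(" ");
              }

              function respond(utterance: string): string {
                const cleaned = utterance.toLowerCase().replace(/[.!?]+$/, "").trim();
                for (const [pattern, replies] of RULES) {
                  const m = cleaned.match(pattern);
                  if (m) {
                    const reply = replies[Math.floor(Math.random() * replies.length)];
                    return reply.replace("{0}", reflect(m[1] ?? ""));
                  }
                }
                return "Go on.";
              }

              console.log(respond("I feel like I never said goodbye"));
              // -> e.g. "Why do you feel like you never said goodbye?"
              ```

              The bot in the article is obviously doing something far more sophisticated with his message history, but the “good listener” effect doesn’t need much more than this.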

              1. 3

                Are the pull quotes supposed to be having seizures or is that just my browser?

                The chat messages quoted in bubbles? I think it’s just your browser; they look fine in Chrome for me.

                The way some of his friends are using the bot (to listen to their problems) isn’t unlike the original ELIZA.

                I can imagine a service which allows people to put as much of themselves as they choose (mail archive, chat logs, interactions with a query bot for this purpose) into an input corpus for a posthumous “grief counsellor” bot for their friends/family. I’d have to think long & hard before doing something like that myself, but if one existed for a friend of mine, I can see myself using it.

                1. 1

                  The pull quotes are the big text that’s like a five word snippet from the rest of the story, not the chat messages.

                  At least for me (and presumably for tedu) they’re animated to look like your screen is flickering. A lot.

                  1. 0

                    It’s an intentional effect The Verge added to all elements with the .glitch class, including pull quotes. Why, I have no idea.

                    1. 1

                      Of course they’re intentional! Note my specific choice of the word “animated”.

              2. 3

                In “Be Right Back,” a 2013 episode of the eerie, near-future drama Black Mirror,

                Black Mirror Season 3 was just announced.

                It’s a science-fiction/satire anthology series of standalone episodes (three episodes per season so far, and you can watch them in any order). If you’re interested in entirely non-scientific futurology and want to have some thoughts teased out of you about the psychological/social implications of technology, it’s a great show to watch.

                Not all episodes are PG-13; in fact, some have very adult themes. Either way, I’d suggest starting with “The Entire History of You” (Season 1, Episode 3).