1. 72
  1. 27

    If you have a broken leg, an automated doctor will tell you that it’s broken, and you will go to the hospital, and you will get an operation, and you will be told that you have a broken leg.

    Sounds about right.

    1. 22

      I currently have Covid. This automated drivel made me laugh so hard I couldn’t breathe for ages. I was slightly worried mine might be the first death caused by Copilot!

      1. 22

        It’s magic.

        1. 18

          It’s not magic if it doesn’t do something. It’s magic if it does something.

          1. 2

            (Given what it is, it’s close to magic that it can do anything at all.)

          2. 9

            < short circuiting noises intensify >

          3. 22

            It’s a business that requires people to do things they don’t want to do because they want to do them.

            Good/bad to see that the “GPT swerve” is still a thing.

            (GPT has a thing where it’s reasonably good at keeping track of what topic it’s on, but not at all good at keeping track of which side of that topic it’s currently arguing. I’m not sure it even really differentiates between them. Hence the GPT swerve: “I am against recycling, because it’s good for the environment, and that’s why I’m for recycling.”)

            1. 8

              I also love the part at the end where the section titled ‘Automation isn’t Magic’ is just ‘Automation is magic. It’s magic.’ repeated over and over.

            2. 17

              It’s about time universities started modernizing their curricula to cover the 21st century’s two latest innovations in programming methodology: stackoverflow-oriented programming and copilot-oriented programming.

              1. 11

                So, AI has now graduated to the point where it mercilessly parodies and mocks its creators - the entire software industry - and we’re still not worried yet?

                1. 7

                  Honestly, if you forced anyone to look at gigabytes of code, I’m sure they’d also start mocking the software industry!

                  1. 4

                    If it showed any signs of understanding the patterns that it’s synthesizing from a giant corpus, I would find it interesting.

                    What’s worrying is that people still mistake this output for having meaning, but I suppose pareidolia is a pervasive condition.

                    1. 3

                      Yeah, I was joking :-)

                  2. 6

                    Note that its code-generation quality isn’t any better than this, but seeing it in writing makes it more obvious.

                    1. 5

                      It’s different because it’s automated.

                      1. 4

                        I want to see a critique of GitHub Copilot written by GitHub Copilot. Or an essay on why hosting on GitHub is bad. ;)

                        1. 10

                          My input is in italics, the rest is Copilot:


                          GitHub Copilot is bad.

                          Problems with GitHub Copilot:

                          • It’s not a real app.
                          • It’s not a real app.
                          • It’s not a real app. … <seems to output “It’s not a real app.” indefinitely>

                          Not sure what it wants to tell me, but it’s good material for another conspiracy theory.

                          1. 4

                            Lol. I hoped for more but still, thank you :)

                          2. 2

                            Somebody posted one here on like the day Copilot came out, if I recall rightly.

                          3. 3

                            I absolutely loved the “A Single Example” section.

                            1. 2

                              I get an eerie sense that the computer understands the irony of what it’s doing, mocking humans back at themselves. I wonder if the computer gets the same feeling when it sees the most popular replies, mocking the computer back at itself.

                              1. 2

                                If you enjoy the silly meandering drivel GPT tends to generate when left to its own devices, you might enjoy a collection of generated game descriptions I put together, trained on a video game wiki: https://mgerow.com/silly-things/gbgpt2

                                1. 2

                                  I wish that Dijkstra quote were real.

                                  1. 2

                                    I am so sending the “Why This Is Hard” section to management so they can finally understand why giving accurate estimates is hard.

                                    1. 2

                                      Tbh, in the “Why is This Hard?” section I don’t see a single lie. It’s spot on.