1. 16

    To quote another HN comment:

    LIDAR aside, computer vision and a raw video feed is more than enough to have prevented this collision.

    Exactly! Engineers designing autonomous cars are required to account for low-visibility conditions even worse than what this video shows (think hail, rain, dust, etc.). This was an easy case! And yet the car showed no signs of slowing down.

    EDIT: twitter comments like this pain me. People need to be educated about the capabilities of autonomous cars:

    She is walking across a dark road. No lights even though she has a bike. She is not in a cross walk. Not the car’s fault.

    Yes it was the car’s fault. This is shocking, extraordinary behavior for an autonomous car.

      1. 9

        In reality, both the pedestrian and the car (and Uber) share some responsibility. You shouldn’t cross a four lane road at night wearing black outside of a crosswalk. A human driver is very unlikely to see you and stop. Not blaming the victim here, just saying it’s easier to stay safe if you don’t do that. However, the promise of autonomous cars with IR and LIDAR and fancy sensors is that they can see better than humans. In this case, they failed. Not to mention the human backup was very distracted, which is really bad.

        From the video I don’t think a human would have stopped in time either, but Uber’s car isn’t human. It should be better, it should see better, it should react better. Automatic collision avoidance is a solved problem already in mass-market cars today, and Uber failed it big time. Darkness is an excuse for humans, but not for autonomous cars, not in the slightest.

        She should still be alive right now. Shame on Uber.

        1. 18

          You can’t conclude from the video that someone would not have stopped in time. Not even a little. Cameras aren’t human eyes. They are much, much worse in low visibility, and in particular with large contrasts, like those of headlights in the dark. I can see just fine in dark rooms where my phone can’t produce anything but a black image. It will take an expert looking at the camera and its characteristics to understand how visible that person was, and from what distance.

          1. 9

            From the video I don’t think a human would have stopped in time either, but Uber’s car isn’t human.

            Certainly not when distracted by a cell phone. If anything, this just provides more evidence that driving while distracted by a cell phone, even in an autonomous vehicle, is a threat to life, and should be illegal everywhere.

            1. 9

              Just for everyone’s knowledge: you’re 8 times as likely to get in an accident while texting, which is double the rate for drinking and driving.

              1. 6

                He was not driving.

                He was carried around by a self driving car.

                I hope that engineers at Uber (and Google, and…) do not need me to note that the very definition of “self driving car” is a huge UI flaw in itself.

                That is obvious to anyone who understands UI, UX, or even just humans!

                1. 5

                  She was driving. The whole point of sitting in the driver’s seat of a TEST self-driving car is for the driver to take over and handle situations like this.

                  1. 6

                    No, she was not.

                    Without this incident, you would soon have seen a TV spot with precisely this: a (hot) businesswoman looking at the new photos her family uploaded to Facebook, with a voiceover saying something like: “We can bring you to those you Like.”

                    The fact that she was paid to drive a prototype does not mean she was an experienced software engineer trained not to trust the AI and to keep continuous control of the car.

                    And indeed the software chose the speed. At that speed, human intervention was impossible.

                    Also, the software did not swerve, despite the free lane beside it and despite the fact that the victim had to cross that lane first, so there was enough time for a computer to calculate several alternative trajectories, or even simply to alert the victim with light signals or sounds.

                    So the full responsibility must be traced back to people at Uber.

                    The driver was simply fooled into trusting the AI by a stupidly broken UI.

                    And indeed the driver’s/passenger’s reactions were part of Uber’s test.

                    1. 2

                      Looking at your phone while sitting in the driver’s seat is a crime for a reason. Uber’s AI failed horribly and all their cars should be recalled, but the driver failed too. If the driver had not been looking at their phone, some action could have been taken to avoid the accident. It’s the responsibility of that driver to stay alert, with attention on the road, not looking at a phone or reading a book or watching a film. Plane pilots do it every single day. Is their attention diminished? Yes, of course it is. Should we expect literally zero attention from the “driver”? Absolutely not.

                      1. 5

                        Do you realize that the driver/passenger reactions were part of the test?

                        This is the sort of self-driving car that Uber and friends want to build and sell worldwide.

                        And indeed I would guess that this “driver” behaviour was pretty frequent among the prototypes’ testers.

                        And I hope somebody will ask Uber to produce in court the recordings of all the tests done so far, to show whether they knew that drivers do not actually drive.

                        NO. The passenger must not be used as a scapegoat.

                        This is an engineering issue that was completely avoidable.

                        The driver’s behaviour was expected and desired by Uber.

                        1. 4

                          You’ve gotta stop this black-and-white nonsense. Firstly, stop yelling. I’m not using the passenger as a scapegoat, so I don’t know who you’re talking to. The way the law is written, it’s abundantly clear that this technology is to be treated as semi-autonomous. That does not mean that Uber is not negligent. If you are sitting in a driver’s seat watching Harry Potter while your car drives through a crowd of people, you should be found guilty of negligence, independent of any charges brought against the lead engineers and owners of Uber. You have a responsibility to at least take some action to prevent deaths that otherwise may be at no fault of your own. You can’t just lounge back while your car murders people; when riding in the driver’s seat, your eyes should not be on your phone, period.

                          Edit: That image is of a fully autonomous car, not a semi-autonomous one. There is actually a difference, despite your repeated protestations. Uber still failed miserably here, and I hope their cars get taken off the road. I know better than to hope their executives will receive any punishment, except maybe from shareholders.

                          1. -1

                            I guess you are not an engineer, nor a programmer.

                            This is simply an engineering view of UI and UX (which are actually part of my daily job).

                            There’s no way that a human used to seeing a car drive correctly for hours will keep continuous control of the car without driving it.

                            The human brain notoriously does not work that way.
                            If I drive, I keep continuous attention and control of the car. If somebody else drives, I do not.

                            Also, I’m stating that Uber was trying to see whether people can trust autonomous cars.
                            I’m stating that the incident was not the first time a tester was recorded looking at the phone during self-driving, and that Uber knew and expected that.

                            1. 3

                              I guess you are not an engineer, nor a programmer.

                              This isn’t the first time you’ve pulled statements out of a hat as if they were gospel truth, without any evidence, and I doubt it will be the last. I think your argument style is dishonest, and for me this is the nail in the coffin.

                              1. 0

                                I’m not sure I understand what you mean…

                                The UI problem is really evident, isn’t it?

                                The passenger was not perceiving herself as a driver.

                              2. 2

                                If there is “no way” a human can do this, then we’ve certainly never had astronauts pilot a tiny spacecraft to the moon without being able to physically change position, and we certainly don’t have military pilots in fighter jets continuously concentrating while refueling in air on missions lasting 12 hours or more… or truck drivers driving on roads with no one for miles… or…

                                Maybe Uber is at fault here for not adequately psychologically screening and training its operators for “scenarios of intense boredom.”

                                1. 0

                                  You are talking about professionals specifically trained to maintain that kind of concentration.
                                  And even a military pilot won’t keep her attention on the road if her husband is driving and she knows from experience that he is trustworthy.

                                  I’m talking about the actual Uber’s goal here, which is to build “self driving cars” for the masses.

                                  It’s just a stupid UI design error. A very obvious one to see and to fix.

                                  Do you really need some hints?

                                  1. Remove the car’s control from the AI and turn the AI into something that enhances the driver’s senses.
                                  2. Have it observe the driver’s state and refuse to start if he’s drunk or too tired to drive.
                                  3. Stop it from starting if any of its parts is not working properly.

                                  This way the responsibility for an incident would lie with the driver, not with Uber’s board of directors (barring factory defects, obviously).
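
                                  A minimal sketch of hints 2 and 3 as a start-interlock, under invented assumptions: the names (`DriverState`, `may_start`), the 0.05 BAC limit, and the fatigue cutoff are all hypothetical, not anything Uber actually ships.

```python
# Hypothetical start-interlock: check driver state and run a self-test
# before the car is allowed to move. All names and thresholds invented.
from dataclasses import dataclass

@dataclass
class DriverState:
    blood_alcohol: float  # e.g. from a breath sensor, in BAC percent
    fatigue_score: float  # 0.0 = fully alert, 1.0 = exhausted

def may_start(driver: DriverState, self_test: dict[str, bool]) -> bool:
    """Refuse to start if the driver is impaired or any part fails its self-test."""
    if driver.blood_alcohol >= 0.05:  # assumed legal limit
        return False
    if driver.fatigue_score > 0.7:    # assumed fatigue cutoff
        return False
    return all(self_test.values())    # every subsystem must report healthy
```

                                  With a gate like this, an impaired driver or a broken sensor keeps the car parked, which is the point of hints 2 and 3.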

                                  1. 4

                                    You’re being adversarial just to try to prove your point, which we all understand.

                                    You are talking about professionals specifically trained to maintain that kind of concentration. And even a military pilot won’t keep her attention on the road if her husband is driving and she knows from experience that he is trustworthy.

                                    A military pilot isn’t being asked (or trained) to operate an autonomous vehicle. You’re comparing apples and oranges!

                                    I’m talking about the actual Uber’s goal here, which is to build “self driving cars” for the masses.

                                    Yes, the goal of Uber is to build a self-driving car. We know. The goal of Uber is to build a car that is fully autonomous, one that allows all passengers to enjoy doing whatever it is they want to do: reading a book, watching a movie, etc. We get it. The problem is that those goals are just that: goals. They aren’t reality, yet. And there are laws which Uber and its operators must follow in order for any department of transportation to allow these tests to continue, laws meant to build up confidence that autonomous vehicles are as safe as, or (hopefully) safer than, already licensed motorists. (IANAL, nor do I have any understanding of said laws, so that’s all I’ll say there.)

                                    It’s just a stupid UI design error. A very obvious one to see and to fix.

                                    So your point is that the operator’s driving experience should be enhanced by the sensors, and that the car should never be fully autonomous? I can agree with that, and have advocated for it in the past. But that’s a different conversation. That’s not the goal of Uber, or of Waymo.

                                    The reason a pedestrian is dead is because of some combination of flaws in:

                                    • the autonomous vehicle itself
                                    • a distracted operator
                                    • (apparently) a stretch of road with crosswalks too far apart
                                    • a pedestrian jaywalking (perhaps because of the previous point)
                                    • a pedestrian not wearing proper safety gear for traveling at night
                                    • an extremely ambitious engineering goal of building a fully autonomous vehicle that can handle all of these things safely

                                    … in a world where engineering teams use phrases like “move fast and break things.” I’m not sure what development methodology is being used to develop these cars, but I would wager a guess that it’s not being developed with the same rigor and processes used to develop autopilot systems for aircraft, or things like air traffic control, spacecraft systems, and missile guidance systems…

                                    1. 2

                                      … in a world where engineering teams use phrases like “move fast and break things.” I’m not sure what development methodology is being used to develop these cars, but I would wager a guess that it’s not being developed with the same rigor and processes used to develop autopilot systems for aircraft, or things like air traffic control, spacecraft systems, and missile guidance systems…

                                      Upvoted for this.

                                      I’m not being adversarial to prove a point.

                                      I’m just arguing that Uber’s board of directors is responsible and must be held accountable for this death.

                                      1. 3

                                        Nobody here is arguing that the board of directors should not be held accountable. You’re being adversarial because you’re bored is my best guess.

                                      2. 2

                                        Very well said, all of it. If anyone is wondering, I’ll even add to your last point what kind of processes developers of things like autopilots follow: standards like DO-178B, with so many assurance activities and so much independent vetting put into them that those evaluated claim it can cost thousands of dollars per line of code. The methods to similarly certify techniques like deep learning are in the prototype phase, working on simpler instances of the tech. Uber would have had to apply rigorous processes at several times the pace and scale, at a fraction of the cost, of experienced companies… on cutting-edge techniques requiring new R&D just to know how to vet them.

                                        Or they cut a bunch of corners, hacking stuff together and misleading regulators to grab a market quickly, like they usually do. And that killed someone who, despite the human factors, should’ve lived if the tech (a) worked at all and (b) was evaluated against common road scenarios that could cause trouble. One or both of these is false.

                        2. 2

                          I don’t know if you can conclude that’s the point. Perhaps the driver is there in case the car says “I’m stuck” or triggers some other alert. They may not be an always-on hot failover.

                          1. 11

                            They may not be an always-on hot failover

                            IMO they should be, since they are testing a high-risk alpha technology that has the possibility to kill people.

                    2. 4

                      The car does not share any responsibility, simply because it’s just a thing.

                      Nor does Uber, which again is a thing, a human artifact like others.

                      Indeed we cannot put in jail the car. Nor Uber.

                      The responsibility must be traced back to people.

                      Who is ultimately accountable for the AI driving the car?

                      I’d say Uber’s CEO, the board of directors, and the stockholders.

                      If Uber were an Italian company, probably the CEO and the board of directors would be put in jail.

                      1. 3

                        Not blaming the victim here

                        People often say this when they’re partly blaming the victim, so as not to seem overly mean or unfair. We shouldn’t have to when they do deserve partial blame, based on one fact: people who put in a bit of effort to avoid common problems and risks are less likely to get hit with negative outcomes. Each time someone ignores a risk to their peril is a reminder of how important it is to address risks in a way that makes sense. A road with cars flying down it is always a risk. It gets worse at night. Some drivers will have limited senses, be on drugs, or be drunk. Assume the worst might happen, since it often does, and act accordingly.

                        In this case, it was not only a four-lane road at night that the person crossed: people on HN who live in the area said it’s a spot noticeably darker than the other dark spots, and one that stretches out longer. The implication is that there are other places on that road with more light. When I’m crossing at night, I do two to three things to avoid being hit by a car:

                        (a) cross somewhere where there’s light

                        (b) make sure I see or hear no car coming before I cross.

                        Optionally, (c) cross the first 1-2 lanes, get to the very middle, pause to double-check (b), and then cross the next two.

                        Even with blame mostly on the car and driver, the video shows the human driver would’ve had relatively little reaction time even if their vision reached further than the video shows. It’s just a bad situation to put a driver in. I think a person crossing at night doing (a)-(c) above might have prevented the accident. I think people should always be doing (a)-(c) above if they value their lives, since nobody can guarantee other people will drive correctly. Now we can add that you can’t guarantee their self-driving cars will drive correctly.

                        1. 2

                          Well put. People should always care about their own lives.
                          And they cannot safely assume that others will care as much.

                          However, note that Americans learned to blame “jaywalking” through strong marketing campaigns after 1920.

                          Before that, the roads were for people first.

                          1. 2

                            I just saw a video on that from “Adam Ruins Everything.” You should check that show out if you like that kind of stuff. As for that point, it’s true that it was originally done for one reason, but now we’re here in our current situation. Most people’s beliefs have been permanently shaped by that propaganda. The laws have been heavily reinforced. So our expectations of people’s actions, and of what’s lawful, must be compatible with those until they change.

                            That’s a great reason to consider eliminating or modifying the laws on jaywalking. You can bet the cops can still ticket you on it, though.

                        2. 3

                          In reality, both the pedestrian and the car (and Uber) share some responsibility.

                          I’ve also seen it argued (convincingly, IMO) that poor civil engineering is also partially responsible.

                        3. 3

                          And every single thing you listed is mitigated by just slowing down.

                          Camera feed getting fuzzy? Slow down. Now you can get more images of what’s around you, combine them for denoising, and re-run your ML classifiers to figure out what the situation is.

                          ML models don’t just classify what’s in your sensor feeds. They also give you numerical measures of how close your feed is to the data they were trained on. When those measures decline, it could be because the sensors are malfunctioning. It could be rain/dust/etc. It could be a novel, untrained situation. Every single one of those things can be mitigated by just slowing down. In the worst case, you come to a full stop and tell the rider they need to drive.
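
                          A minimal sketch of that “just slow down” policy, under invented assumptions: `confidence` stands in for the in-distribution measure described above, normalized to [0, 1], and the threshold and linear scaling are illustrative, not any real stack’s behavior.

```python
# Hypothetical confidence-gated speed control: scale the target speed
# down as perception confidence drops; below a threshold, stop entirely
# and hand control back to the rider. All names/thresholds invented.
def target_speed(max_speed_mps: float, confidence: float,
                 stop_threshold: float = 0.2) -> float:
    """Return a target speed (m/s) given perception confidence in [0, 1]."""
    confidence = max(0.0, min(1.0, confidence))  # clamp defensively
    if confidence < stop_threshold:
        return 0.0  # full stop: worst case, the rider must drive
    return max_speed_mps * confidence
```

                          At full confidence the car drives at its normal speed; a fuzzy feed, rain, or a novel situation all lower confidence and so lower speed, which is exactly the mitigation the comment describes.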

                              1. 5

                                Another happy Fastmail user here; besides all the things already mentioned, they have Yubico support for 2FA.

                                  1. 1

                                    Keep up the good work, I am eager to use this in the future, I will try to help out.

                                    1. 5

                                      Nice, but please comment this on the link. You know, the guys will be happy with your support/feedback, thank you ; )

                                      1. 2

                                        Haha, dumb assumption on my part they would all read lobste.rs :)

                                        1. 3

                                          We don’t use comic sans enough for that.

                                      1. 1

                                        Personally I prefer the light phone approach, seems much more effective I think. (Disclaimer: I don’t have one yet)

                                        1. 4

                                          Nice, thank you! On Computer Networking there is An Introduction to Computer Networks by Peter L Dordal.

                                          1. 1

                                            DELL, we are looking at you!!! Using OpenBSD software and not donating shit..

                                             What is Dell using OpenBSD in, again? Also, to those reading: nobody is obliged to donate, but if a large company uses this software, it should have the decency to do so.

                                            https://www.youtube.com/watch?v=MpBzHyCmYEs

                                            1. 1

                                              Better fit over at barnacl.es

                                              1. 2

                                                Oh nice! Sure thing, thank you!

                                              1. 8

                                                 Poor guys, but GitHub did it once too! I remember they lost a production database at some point.

                                                1. 2

                                                   Neocities (website hosting, open source, 2-3 people), Drone.io (CI, open source, a few guys), and CodePlane (Git hosting, one guy) qualify in these terms, I think. (Disclaimer: I’m only a free user of Neocities and Drone.)