1. 8

  2. 16

    The Uber/Snapchat generation of company doesn’t have the DNA to build self-driving cars, and shouldn’t be building anything that interacts with the physical world in a way that could kill someone.

    By contrast, while there’s plenty of sloppiness in Google around HR and product management, Google has enough of the Bell Labs/PARC heritage to have islands of excellence that PMs and ScrumLords can’t touch. That’s why Google is able to make some progress on this (very hard) problem. Mainstream Google isn’t a great place to work, but there are R&D teams that are trusted to work on real stuff and that can produce excellent results (see: AlphaGo). Even Google is finding this problem difficult and is spending a lot of time to get it right.

    If you’re doing this kind of stuff, you need not only smart people (I’m sure that Snapchat and Uber have smart people, but that’s not my point) but also the willingness to give them the autonomy to do things right, rather than expecting things to be shipped out on silly “sprint” deadlines.

    1. 2

      Uber’s long-term goal is to be the only logistics company that matters. I’m betting that they want to not only taxi people, but run freight and do last-mile parcel delivery too. An Uber car will drive to your house and a drone (perhaps autonomous, perhaps remotely piloted) will dispatch from the car carrying your package, drop it off on your doorstep, and ring your bell.

      You think I’m kidding.

      1. 5

        Not in the slightest. Amazon has been trumpeting its plans for this sort of mindshare for the longest time.

        Of course, wanting to own this space is the standard mindset of today’s monopoly-obsessed startups, and wanting it is a long way from actually being anywhere near it.

      2. 2

        Note that the title appears to be false: the body has “(…) cars (…) drove an average of close to 0.8 miles before the safety driver had to take over (…).”

        The body text leaves room for the cars to function properly somewhere. (Adaptive) cruise control and well-aligned wheels suffice to let a car drive most highway miles “autonomously”; it’d be shockingly bad if Uber managed to do worse than that! (Of course, the actual text still isn’t terribly encouraging about the state of Uber’s efforts.)

        1. 1

          Incidents per mile is, for exactly the reason you mentioned, a bad metric for driving safety. Highway driving is easy and mostly safe. (Highway accidents are terrible, because of the high speeds involved, but they’re also much rarer.) City driving is difficult and much more dangerous, and most of the gaps between the state of autonomous driving and where we’d want it to be are in urban areas, which tend to be a lot more dynamic and therefore require human judgment.

          If you’re a cab company, you care about your city driving record. If you’re a company like Amazon, you care about both, since your “last mile” is disproportionately likely to be a city. Either way, though, it’s not hard to improve your numerical driving record by spending a lot of time on the highway.
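          To make the point concrete, here’s a toy calculation (all numbers invented for illustration) showing how the same per-mile intervention rates produce wildly different aggregate figures depending on the highway/city mix:

          ```python
          def miles_per_intervention(segments):
              """Aggregate miles driven per human intervention across road segments.

              segments: list of (miles_driven, interventions) tuples.
              """
              total_miles = sum(miles for miles, _ in segments)
              total_interventions = sum(ivs for _, ivs in segments)
              return total_miles / total_interventions

          # Hypothetical fleets with roughly the same per-mile intervention rates
          # (highway driving is easy, city driving is hard), but different mixes.
          mostly_highway = [(900, 2), (100, 50)]   # pad the log with highway miles
          mostly_city    = [(100, 0), (900, 450)]  # city-heavy mileage mix

          print(miles_per_intervention(mostly_highway))  # ≈ 19.2 miles/intervention
          print(miles_per_intervention(mostly_city))     # ≈ 2.2 miles/intervention
          ```

          The aggregate number flatters the highway-heavy fleet by nearly 10x even though neither fleet got any better at the hard (city) part of the problem.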

          The other interesting (and hard-to-answer) question is how “incidents requiring human intervention” are counted. Are they reviewed? If I were a human in a self-driving vehicle, I would be inclined to err on the side of safety and therefore intervene more often than might be necessary. If self-driving cars became very good, it seems that these cases (human intervenes, but it wasn’t necessary) would dominate. So I wonder if, and how, they factor those out.