2015 September 01 Tuesday
Uber Class Action Suit Great News For Autonomous Vehicle Development
A pair of news articles tell the tale of how a US District Judge in SF is spurring Uber to speed up autonomous vehicle development: Uber drivers granted class action status in lawsuit over employment and Uber Loses Bid to Block Drivers From Suing as Group for Tips.
That's great news for those of us who want to hand the driving over to computers so we can spend our travel time on other activities. Plus it will make travel safer. While I'm usually opposed to federal meddling in business decision making, in this case I'm pretty excited about the potential benefit.
Since truck driver is the top occupation in most states, the impact of faster autonomous vehicle development will be even bigger in trucking. Uber could make some extra money licensing their self-driving tech to the truck makers. Or Google or Apple or another company involved in developing self-driving tech could license it to truck makers. One way or another autonomous vehicle tech will come to trucking, and the truck jobs will start getting phased out in the 2020s. What will the truck drivers do then? Libertarians think more people will get jobs as personal servants. But I expect home robotic chefs and robotic house cleaners will serve us, not humans.
I think Ford and other car makers ought to get a clue from this decision and decide to speed up their autonomous vehicle development as well. Think of it this way: people will travel more by taxi if the labor cost of the taxi driver is removed. That will mean more miles driven, and so vehicles replaced more often with new vehicles. So higher car sales. That's great for Ford and GM. They ought to get those autonomous vehicles to market as soon as they can.
Also, if people don't have to drive their own cars, they'll spend more time traveling in them. So miles traveled per passenger will go up. Autonomous vehicles are a great deal for any vehicle makers who can manage to make the transition ahead of most of their competitors.
What is interesting about this development: political forces are lining up to push automation of autonomous vehicles just as they are for automated fast food restaurants and other low-paid, low-skilled occupations.
By Randall Parker at 2015 September 01 08:49 PM
Re: "more people will get jobs as personal servants," that's a link to a good Mickey Kaus post. Yes, the libertarians have their happy vision of RIFfed clerks and fry-station workers turning into chefs preparing fine meals in the homes of clients, dramatists devising elaborate entertainments, and bespoke tailors.
Furthermore, this process is already well underway. One reason there were no riots in Baltimore this past spring is that the residents of its hardscrabble neighborhoods were "too busy to hate." They just didn't have the time to feel aggrieved or lash out at the broader society.
As a libertarian myself, I steadfastly believe that misunderstanding human nature and devising counterfactual interpretations of current events should be celebrated as the essential skills of citizenship. That the public-policy and academic elites lead such comfortable lives goes to show that this is the best of all possible worlds.
I DO think there will be a small market for highly skilled personal servants for rich/famous people. There is something to be said for a good human touch in certain areas. But there will only be so many of those positions. Additionally, the human personal servants will use robots/automation rather than managing a team the way butlers (who are like home CEOs) do today.
The big problem for 'self-driving' cars (and more so for tractor-trailers) is the legal system. Granted, in San Fran, the 'google cars' have had a very good track record - no accidents so far.
But I'm looking at it from the perspective of a longtime resident of FL. Have you ever been to Disney World? If so, you may remember what I-4 and US 192 (the two main arteries to Disney) are like: more often than not they can be downright scary. It can be and often is a nightmare of confused tourists, confused elderly, harried and aggressive locals, and crowded roads.
Traffic is not much different throughout the state. Now insert self-driving vehicles. They will get involved in wrecks for one simple reason: computers don't deal with random events very well. Think of the tourist suddenly crossing from the left lane to the right turn lane to go to a restaurant they like (seen that plenty of times), or a service employee trying to make time through thick traffic so they won't be late for their shift (seen that plenty of times too).
When the inevitable happens, the question becomes, ''whose fault is the crash?''
The idiot who cut off the self-driving vehicle? The car builder (or the company that retrofitted the self-driving equipment)? The software developers? The human behind the wheel of the self-driving car who failed to seize control of the car, or did? All of the above? Cops, courts, attorneys and insurance companies will demand answers.
The biggest problem for self-driving cars is not the legal system. The biggest problem is that the computer scientists have not correctly specified the driving problem, namely: 1) turn signals are not a right-of-way, and 2) lane changes are prioritized randomly.
The current self-driving cars assume that, once a turn signal is given, changing lanes is a matter of distance from and velocity of the nearest object around it. The Google car can detect an empty space in which to move, but it cannot account for the driver accelerating to close the space and thus not allowing the Google car to change lanes, causing an accident. Human drivers are aware that others may close this distance, so they rely on a wide variety of heuristics to deal with this problem. For example, changing lanes in front of a sports car or a luxury car with young drivers may be harder than doing so in front of old drivers and big, heavy commercial vehicles. Or, they may rely on maintaining comfortable speeds and distances based on the driver's own self-assessment of his comfort and abilities.
The point is that human traffic patterns are a living thing. There is a huge variety of driving styles that are more or less randomly distributed and these randomly distributed driving styles are what make up human traffic. Self-driving cars are, instead, guided by algorithms with, probably, a small set of parameters. Algorithmic driving will radically alter traffic patterns and probably lead to more congestion and longer commute times.
Consider this problem: you're driving in the left lane on I-94. You need to take an exit that is on your route. Your exit will be in the right lane. How close will you typically be to the exit before you attempt to pass into the right lane? Will you be 10 miles from the exit? 5 miles? 2 miles? No one can really answer this. People arrive at different solutions and the solutions are distributed randomly. Some will be 2 miles; some will be 10 miles.
Now, how is a self-driving car going to handle this problem? Let's say it's hard-coded to move into the right lane 10 miles before the exit, if traffic conditions allow. What's going to happen to traffic if you have a dozen or two dozen of these cars all attempting to make that lane change based on an algo? Traffic conditions will worsen. The scale of this problem gets worse the more autonomous cars you have on the roads. In fact, a city full of only autonomous cars resembles something we already have...trains. And what do trains lack? On-demand transportation.
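The bottleneck the commenter describes can be seen with a toy simulation. This is a minimal sketch with purely illustrative numbers (exit location, merge distances, and car counts are all assumptions, not real traffic data): if every identically programmed car merges at exactly the same hard-coded distance, every merge lands in the same stretch of road, while human drivers spread theirs out.

```python
import random

EXIT_MILE = 50.0

def human_merge_points(n_cars, seed=0):
    """Humans pick a merge distance somewhere between 2 and 10 miles out."""
    rng = random.Random(seed)
    return [EXIT_MILE - rng.uniform(2.0, 10.0) for _ in range(n_cars)]

def algo_merge_points(n_cars, hard_coded_distance=10.0):
    """Identically programmed cars all merge at the same mile marker."""
    return [EXIT_MILE - hard_coded_distance for _ in range(n_cars)]

def merges_in_busiest_mile(points):
    """Count how many merges land in the single most congested mile."""
    buckets = {}
    for p in points:
        buckets[int(p)] = buckets.get(int(p), 0) + 1
    return max(buckets.values())

print("busiest mile, human drivers:", merges_in_busiest_mile(human_merge_points(100)))
print("busiest mile, identical algorithm:", merges_in_busiest_mile(algo_merge_points(100)))
```

Under these assumptions, all 100 algorithmic merges concentrate in one mile, while the human merges are spread across roughly eight miles of road.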
No, self-driving cars will not work for more than a very limited set of applications if you don't want worsening traffic. It's like flight. It is too difficult to commodify.
None of the objections you make are actually problematic.
An autonomous vehicle can easily account for another driver accelerating to close space. Autonomous vehicles are equipped with sensors that are always monitoring the road and the positions of other vehicles. They are not like robots sitting in a driver seat that have to turn their heads around to monitor the state of traffic and then make a decision later on based on a one-time snapshot (like humans). Internally all of the vehicles in the vicinity will be tracked and their velocity will be mapped. When a vehicle accelerates to close the gap and prevents you from merging into a lane, it is not some magic event that happens instantaneously. It should be easy for a vehicle with good sensors to detect the change in relative velocities indicating a sudden acceleration and react accordingly to avoid a collision.
If an autonomous vehicle takes measurements of all the road and vehicles around it twice a second, it is already outperforming even the most aware human drivers. More realistically, an autonomous vehicle using optical, laser, and radar sensors is doing this a hundred or more times a second. As humans we develop mental models of what other drivers are thinking to try to understand their behavior, but a computer that is purely focused on processing and has sensors vastly more specialized than our own does not need to do this. The more difficult and fundamental problem is to ensure that the computer has an accurate perception of the world in the first place.
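The gap-tracking idea above can be sketched in a few lines. This is a hedged illustration, not how any real autonomous stack works: the 100 Hz sample rate follows the "hundred times a second" claim, while the closing-speed threshold and minimum gap are made-up parameters. The point is just that two successive distance samples are enough to notice a neighbor accelerating into the space.

```python
SAMPLE_DT = 0.01  # 100 Hz sensor updates, per the claim above (assumption)

def gap_is_closing(gap_t0, gap_t1, dt=SAMPLE_DT, threshold=-0.5):
    """Return True if the gap to the neighboring car is shrinking faster
    than |threshold| meters/second, i.e. the driver is closing the space."""
    closing_speed = (gap_t1 - gap_t0) / dt  # negative means gap shrinking
    return closing_speed < threshold

def safe_to_merge(gap_samples, min_gap=8.0):
    """Merge only if the latest gap is big enough and not being closed."""
    latest = gap_samples[-1]
    return latest >= min_gap and not gap_is_closing(gap_samples[-2], latest)

print(safe_to_merge([12.00, 12.00]))  # steady 12 m gap: merge
print(safe_to_merge([12.00, 11.97]))  # gap shrinking 3 m/s: abort the merge
```

A 3 cm change between 10 ms samples is a 3 m/s closing speed, which a human glancing over a shoulder would not register until much later.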
The example I used of the tourist going from the passing lane to the right-hand turn lane is NOT something that happens over miles, it's something that happens over feet. As in ''Hey there's a McDonald's!'' and zip there they go, as fast as you can speak the words between the quote marks. Like I said, computers don't handle random events very well; after that confused tourist the software might have a 'buffer overflow' and crash, and then the car will (physically) crash. Don't think software can't do that: http://www.wired.com/2015/07/hackers-remotely-kill-jeep-highway/
It gets worse: the software in a 'self driving' tractor-trailer BSODs... At 70 mph... And plows into a school bus full of kids returning from an away game. You don't think there's going to be hell to pay?
I can see the TV reporters now: ''THE SANDY HOOK OF THE HIGHWAY!!!''. Who will be watching that? State legislators, the people who can pass laws making self-driving vehicles/equipment illegal on public roads.
In California (the only state that has addressed self-driving so far) the law reads ''The (human) driver must be able to take control of the vehicle at any time''. But any failure of self-driving software or hardware could happen in a second, with no warning to the human driver. Or the human, assuming 'the computer is taking care of it', is totally absorbed in a conversation with fellow passengers, or the game on their phone, or maybe even reading this post, and is delayed in reacting, or never reacts at all.
Again, it's almost absurd to think that a human with reaction times well over 200 ms is going to observe and react to another vehicle making a sudden maneuver any better than a specialized autonomous program. The problem for the autonomous vehicle is correctly processing data to build an accurate picture of the world. With good sensory data your autonomous driver will detect the turn of the other vehicle's wheels and its acceleration along a new vector well before it even leaves its own lane. I don't think we are there yet with the sensory processing, but an autonomous driving program with enough data will handle this same scenario far more competently than a human driver will.
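The reaction-time point above is easy to check with back-of-envelope arithmetic. The human figure of ~1.5 s (perceive, decide, move a foot to the brake) is a commonly cited ballpark; the 0.02 s computer latency is purely an assumption for illustration.

```python
MPH_TO_MS = 0.44704          # exact miles-per-hour to meters-per-second factor
speed = 70 * MPH_TO_MS       # highway speed, ~31.3 m/s

human_reaction = 1.5         # seconds: perceive + decide + move foot (ballpark)
computer_reaction = 0.02     # seconds: assumed sensing + processing latency

# Distance covered before braking even begins.
human_dist = speed * human_reaction
computer_dist = speed * computer_reaction
print(f"human travels {human_dist:.1f} m before braking starts")
print(f"computer travels {computer_dist:.1f} m before braking starts")
```

Under these assumptions the human covers roughly 47 m of road before touching the brake; the computer covers well under a meter. The braking itself is the same physics for both.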
Anyway, the point was not that autonomous driving is somehow going to be perfect, but that the difficulties with it are surmountable. The objections you keep raising are mostly alarmism.
Don't bother with the current partial implementations, like letting people approve a lane change. Wait for full-blown systems to appear within the next five years.
Many people seem to think that autonomous cars need to be perfect to take over. Not true. As soon as insurance companies can compare a billion vehicle-miles driven by humans and a billion vehicle-miles driven by autonomous cars, the battle will be over.
Five years after that, every car that costs more than $20,000 will have autonomous driving enabled.
There is no doubt in my mind that autonomous cars will lead to an explosion in miles driven. Every retired person in America will load up a car with friends, set the directions for a resort or hotel within a day's drive, and off they go. Every day, a different golf course, motel, or friend's house or apartment. No drivers needed.
IF autonomous vehicles actually work, the ten million people who make their living driving something will be out of work at a rate of about ten percent per year, that being the rate at which human-driven vehicles can be replaced, starting in about five years.
Bill and tooobvious,
Again, this is the problem. The problem of driving is not an exercise in machine vision and data processing. That's why it will ultimately fail. Humans rely on their knowledge of other humans when negotiating with other vehicles on the road. It's like the difference between a robotic arm picking a peach and a human picking a peach. The robotic arm tests for softness, color, size, and many other parameters to pick a peach. The human picks a peach based on what he likes to eat. These are entirely different heuristics.
Furthermore, none of you have addressed the basic objection I brought up: millions of self-driving cars using more or less the same driving algorithm are going to cause massive slow-downs in traffic. There will be no improvement for the same reason that public transportation does not relieve traffic congestion: the traffic problem is shifted to something else. Trains merely shift the congestion problem to loading and unloading people quickly. Self-driving cars shift the traffic problem to a routing algorithm: what gets priority when objects can get in the way of each other? See, this is an optimal queuing problem that humans handle through a wide variety of randomly distributed driving styles. The machine will not improve on that.