Re self-driving cars, the car world and the commentary about it are unrealistic on two fronts (both personified by purist dreamer Elon Musk): (1) pure electric cars, and (2) self-driving technology. The fundamental problems with both have yet to be solved, and it's not clear that they will be solved in the near or medium term. Meanwhile, the goals of both are better served by existing technology that either is already widespread or should become more so.

The fundamental problem with pure electric cars is range and "fill-up" time. A pure electric car is inadequate for a long car trip or for the many people without a garage (e.g., apartment dwellers, all those people whose cars line every city street every night). Pure electric car take-up is slowing, not accelerating, contrary to the car companies' irrationally exuberant predictions, indicating that most people who are interested in them -- a relatively small percentage -- have already bought one.

We already have the solution: hybrids. Plug-ins use no gas on a daily commute and are perfect for those with a garage. Non-plug-ins, regular hybrids like the Prius, roughly double typical mileage and are perfect for those apartment dwellers. They sacrifice little to nothing in terms of performance and come free of range anxiety, using a reliable and mature technology. Policy-makers looking to internal combustion engine bans are making a big mistake. They are incentivizing keeping regular old gas cars on the road much longer -- very doable, insofar as they are super reliable nowadays and it costs ever more to buy a new car. The better policy is to tweak gas mileage mandates to essentially require the mass hybridization of new car fleets. Hybrids don't rely on ultra-large batteries requiring scarce natural resources. They won't massively disrupt an industry that employs many thousands. Companies could standardize battery size, shape, and other aspects to move to more uniform production of smaller hybrid-sized batteries, which would bring down the overall cost of hybrids and hasten a world where replacing your hybrid battery after 100K miles or whatever would cost hundreds, not thousands (a hang-up for cost-conscious shoppers of used hybrids).
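To put rough numbers on the "roughly double typical mileage" point, here's a back-of-the-envelope comparison; the mileage, annual miles, and gas-price figures are my own assumptions for illustration, not data from any source:

```python
# Back-of-the-envelope fuel comparison: conventional car vs. regular hybrid.
# All figures (mpg, annual miles, gas price) are assumed purely for illustration.

ANNUAL_MILES = 12_000
GAS_PRICE = 3.50           # dollars per gallon (assumed)
MPG_CONVENTIONAL = 25      # typical non-hybrid (assumed)
MPG_HYBRID = 50            # "roughly double" (assumed)

def annual_fuel_cost(mpg):
    return ANNUAL_MILES / mpg * GAS_PRICE

conventional = annual_fuel_cost(MPG_CONVENTIONAL)
hybrid = annual_fuel_cost(MPG_HYBRID)
print(f"Conventional: ${conventional:.0f}/yr, hybrid: ${hybrid:.0f}/yr, "
      f"savings: ${conventional - hybrid:.0f}/yr")
# Under these assumptions: $1680 vs. $840 -- halving fuel use halves the bill.
```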

The fundamental problem with self-driving technology is that it still makes regular mistakes and is routinely flummoxed. Think about your driving. How often do you encounter odd situations? All the time. As a driver, I, for one, will not tolerate auto-driving technology with anything like the glitchiness of the rest of my tech, which fails to do what I want on a regular basis. These things have to work perfectly nearly 100% of the time, in all conditions and all places where you might want to drive. My impression is that that remains a very tall order, which is why you haven't heard a whole lot about self-driving cars in a while, and why their investors are nervous. The idea that self-driving cars will, on average, be safer is cold comfort. It would be particularly galling to have a loved one, say, who is a very careful driver and has never had an accident, be killed by their robot car's mistake. I can say confidently that a self-driving car for *me* would not represent a safety improvement but a downgrade. I think many millions will probably have the same reasonable view, and not want to hand over the wheel.

Meanwhile, once again, amazing technology is at the ready to come close to solving the problem, but it gets little attention because it's not as flashy or pure. I refer to the suite of guardian-angel tech that now comes standard on most new cars -- automatic emergency braking, lane-departure warnings, back-up cameras, adaptive cruise, and the like -- as well as low-grade highway self-driving like GM's Super Cruise, and what I imagine will be the next step: cars able to avoid accidents by intervening at the last moment to command not just the brakes but steering and gas as well. Excessive-speed interventions are doable, especially for young drivers, as is an ignition/breathalyzer interlock, which, assuming reliability, I'd far sooner accept than cars that take over the whole thing.
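To be clear about what I mean by last-moment intervention, here's a minimal sketch of the kind of rule I have in mind; the thresholds and function names are entirely hypothetical, not any manufacturer's actual logic:

```python
# Hypothetical sketch of a last-moment collision-avoidance intervention:
# if a forward collision is imminent and the driver hasn't reacted, the car
# commands full braking, and adds a steering correction only when braking
# alone can no longer avoid the obstacle and an adjacent lane is clear.
# All thresholds and names are assumptions made up for illustration.

TTC_BRAKE_S = 1.5   # time-to-collision (seconds) below which auto-braking kicks in (assumed)
TTC_STEER_S = 0.7   # threshold below which braking alone is judged insufficient (assumed)

def intervene(ttc_s, driver_braking, adjacent_lane_clear):
    """Return override commands as a dict, or None to leave the driver in control."""
    if ttc_s > TTC_BRAKE_S or driver_braking:
        return None                                       # no imminent threat, or driver acting
    if ttc_s > TTC_STEER_S:
        return {"brake": 1.0}                             # braking alone should still avoid impact
    if adjacent_lane_clear:
        return {"brake": 1.0, "steer": "adjacent_lane"}   # brake and steer around the obstacle
    return {"brake": 1.0}                                 # nowhere to go: brake as hard as possible

# Example: 0.6 s to impact, driver not braking, adjacent lane clear
print(intervene(0.6, driver_braking=False, adjacent_lane_clear=True))
```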

It seems to have become a national pastime for people to say they see the future. At the moment it is self-driving cars and electrifying everything. IMO both of those things are at best a very distant future, but I am, after all, the Skeptic. I remember in 2015 many people were saying that self-driving cars would be in production by 2020 and that transit would radically change right away.

Elon Musk seems like he is seeing the future, but I think he just says things on social media that he thinks are funny. Example: building colonies on Mars.

While it is 2023 (not 2020), I visited Phoenix last weekend, and was surprised to see what appeared to me to be Google Maps camera/mapping cars all over the place. It was explained to me that downtown Phoenix and parts of Scottsdale, Tempe, Mesa, and Chandler, AZ are part of Waymo's large test area for self-driving cars. My friends have themselves used Waymo as a taxi on more than one occasion. (Although they admitted that they were not all that comfortable doing so.) I came away thinking that self-driving cars are not that far off.

Waymo and Cruise have been testing in San Francisco for even longer, and they have started in Santa Monica. I am not an engineer; I am going off of reports I have read. What I have read is that the lidar system gets confused by rain. Also, it cannot reliably see motorcycles or bicycles. Maybe those challenges have been overcome. This is part of why they have test cities. The rain problem makes Arizona a good testing ground.

Ethical/legal/policy challenges are not things that technology can solve.

Should the driverless car algorithms be designed to minimize loss of life or injury?

If the answer is "yes," then what happens when a driverless car with one passenger is on a multi-lane highway behind a truck which suddenly stops or has a large part of its load drop off the back? There is not enough time to stop before hitting it. The road is crowded and there are cars on both sides with multiple passengers. The minimize-injury algorithm would have the car choose to hit the road hazard, putting risk only on its sole occupant, rather than risk hitting other cars.
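To spell out the logic of that scenario, here is a toy sketch of a "minimize expected injuries" choice; the option names, occupant counts, and probabilities are invented purely for illustration:

```python
# Toy illustration of a "minimize expected injuries" rule: score each available
# maneuver by (people at risk) x (probability of injury) and pick the lowest.
# All numbers and option names are invented for illustration.

options = {
    "brake_and_hit_debris":  {"people_at_risk": 1, "injury_probability": 0.4},  # sole occupant
    "swerve_left_into_car":  {"people_at_risk": 3, "injury_probability": 0.5},
    "swerve_right_into_car": {"people_at_risk": 4, "injury_probability": 0.5},
}

def expected_injuries(option):
    return option["people_at_risk"] * option["injury_probability"]

choice = min(options, key=lambda name: expected_injuries(options[name]))
print(choice)  # -> brake_and_hit_debris: the car accepts the risk to its own occupant
```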

From a legal perspective, how is liability determined when there is a crash? Also, DUI laws would need to be updated. Typically, they state that it is illegal to operate a vehicle while impaired by alcohol or a controlled substance. If I summon a Waymo with an app on my phone when I am drunk and there is a crash, will the court determine that I "was operating the vehicle" because I summoned it with an app? Would it matter if the vehicle provided the option for me to take control?

Should driverless cars have the ability to be controlled remotely by police and fire departments? In one case I read about, a driverless car came to a stop because there was a fire truck nearby, but it stopped on top of a fire hose, which interfered with firefighting. Adding remote control raises multiple questions. A big one is how to ensure the system cannot be hacked by people with bad intent. We continually see very high-value targets, owned by entities that have all the resources they could want to secure their systems, fall victim to ransomware. There is no doubt that hackers would try to get into self-driving car remote-control systems. Are we ready to trust those systems?

If you google "self-driving cars further away than you think," you get articles saying that from 2013 all the way through 2020, and I think the very recent article below is a good statement of the state of the art today; while it cheerleads the efforts still ongoing, it is basically in the same vein:

https://www.theverge.com/2023/5/5/23711586/autonomous-vehicle-investment-toyota-nvidia

True, widespread self-driving seems decades away and may be, as one source in that article put it, "basically impossible." Investments in self-driving companies have plummeted, many have closed, etc. Companies are focusing on more limited applications and on enhancing automated driver assistance, along the lines I suggested, which will make the road far safer without having to close the gap between nifty and *usually* impressive performance in limited, mapped playgrounds today and a world where everyone could really rely on these things all the time, in any location and any conditions.

To be fair, that Google prompt is expected to give results saying that we are not close. The Atlantic published an article a few days ago stating that we are close. I don't think we are, and the article mainly discusses test environments. It also points out challenges.

https://www.theatlantic.com/technology/archive/2023/10/robotaxi-services-self-driving-cars-national-rollout/675659/
