I think it’s worth thinking about this in a technical sense, not just in a political or capitalist sense: yes, car companies want self-driving cars, but self-driving cars are immensely dangerous, and there’s no evidence that they will make roads safer. As such, legislation should be pushing very hard to stop self-driving cars.
Also, the same technology used for self-driving is used for AEB. This actually makes self-driving more likely: since car companies have to pay for all that equipment anyway, they may as well try to shoehorn in self-driving. On top of this, I have no confidence that the odds of an error in the system (e.g. a dirty sensor, software getting confused) aren't higher than the odds of the system correctly braking when it needs to.
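To put rough numbers on that worry (all of them made up, purely to show the shape of the argument, not measured rates):

```python
# Back-of-the-envelope sketch of the false-positive worry above.
# Every number here is an invented assumption for illustration only.

miles_per_year = 12_000               # assumed annual mileage for one driver
true_emergencies_per_year = 0.5       # assume one genuine "must brake now" event every two years
false_triggers_per_mile = 1 / 10_000  # assume one spurious detection per 10,000 miles

false_triggers_per_year = miles_per_year * false_triggers_per_mile

print(f"genuine emergencies per year:    {true_emergencies_per_year}")
print(f"phantom braking events per year: {false_triggers_per_year}")
# With these made-up numbers the system phantom-brakes ~1.2 times a year,
# more often than it gets a chance to help. The rarer real emergencies are,
# the lower the false-trigger rate has to be for the system to be a net win.
```

Whether real systems are anywhere near those numbers is exactly what I have no confidence about.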
This means someone can get into a situation where:
they're in a car, on a road, with nothing of interest in front of them
the software decides there is an imminent crash
the car brakes hard (even at 90 mph), perhaps losing traction depending on road conditions
they may be hit from behind, or may hit an object
the driver is liable even though they never actually pressed the brakes.
This is unacceptable on its face. Yes, cars are dangerous, and yes, we need to make them safer, but we should use better policies like slower speeds, safer roads, and smaller, lighter cars, not this AI automation bullshit.
Under what circumstances does being hit from behind result in liability for the lead vehicle? It’s the responsibility of the vehicle behind you to keep an appropriate distance. This sounds like you’re regurgitating their talking points like a bot.
but self-driving cars are immensely dangerous, and there’s no evidence that they will make roads safer.
This is a horrible take, and absolutely not true. Maybe for the current state of technology, but not as an always-true statement.
Humans are horrible at driving. It’s not hard to be better at driving than the average human. Perfect doesn’t exist, and computer-driven cars will always make some mistakes, but so do humans (and media will report on self-driving cars much more than on the thousands of vehicle deaths caused by human error). AEB and other technologies have already made cars much safer over the previous decades.
On top of this, I have no confidence that the odds of an error in the system (e.g. a dirty sensor, software getting confused) aren't higher than the odds of the system correctly braking when it needs to.
Tell me you’ve never used or tested AEB without telling me.
Dirty sensors trigger a “dirty sensor warning”, not a full emergency brake. There’s more than one sensor, and the system doesn’t emergency-brake on a single bad sensor reading. Again, perfect doesn’t exist, but it isn’t close to the 50/50 you’re trying to portray here.
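Roughly, the gating looks something like this; a minimal sketch with invented sensor names, thresholds, and cycle counts, not any manufacturer's actual logic:

```python
# Sketch of multi-sensor AEB gating: a faulty sensor warns, it doesn't brake,
# and braking requires several sensors to agree over several cycles.
from dataclasses import dataclass

@dataclass
class SensorReading:
    name: str
    healthy: bool                       # False -> e.g. a blocked/dirty sensor reporting a fault
    time_to_collision_s: float | None   # None when no object is tracked

def aeb_decision(readings: list[SensorReading],
                 consecutive_alarms: int,
                 ttc_threshold_s: float = 0.8,
                 required_sensors: int = 2,
                 required_cycles: int = 3) -> tuple[str, int]:
    """Return (action, updated consecutive_alarms)."""
    faulty = [r.name for r in readings if not r.healthy]
    if faulty:
        # A bad sensor degrades the system and warns the driver; it does not slam the brakes.
        return (f"warn: check sensors {faulty}", 0)

    agreeing = [r for r in readings
                if r.time_to_collision_s is not None
                and r.time_to_collision_s < ttc_threshold_s]
    if len(agreeing) >= required_sensors:
        consecutive_alarms += 1
        if consecutive_alarms >= required_cycles:
            return ("emergency_brake", consecutive_alarms)
        return ("prepare: pre-charge brakes / warn driver", consecutive_alarms)
    return ("normal", 0)

# One noisy radar frame with a clear camera does not trigger braking:
print(aeb_decision([SensorReading("radar", True, 0.5),
                    SensorReading("camera", True, None)], consecutive_alarms=0))
# A blocked sensor produces a warning, not a brake application:
print(aeb_decision([SensorReading("radar", False, None),
                    SensorReading("camera", True, 1.5)], consecutive_alarms=0))
```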
the car brakes hard (even at 90 mph), perhaps losing traction depending on road conditions
Any car with AEB will also have ABS and traction control, so losing traction is unlikely. Being rear-ended is never the front car’s liability.
Yes, cars are dangerous, and yes, we need to make them safer, but we should use better policies like slower speeds, safer roads, and smaller, lighter cars,
Absolutely agree on all of this. Slower speeds and safer roads make accidents less likely and less lethal, for human and computer drivers both.
As such, legislation should be pushing very hard to stop self-driving cars.
Legislation should push hard to set clear boundaries on when self-driving is good enough to be allowed on the road, and on where the legal responsibility lies when something goes wrong. Just completely stopping it would waste the potential for safer roads for everyone in the long run.
Under what circumstances does being hit from behind result in liability for the lead vehicle? It’s the responsibility of the vehicle behind you to keep an appropriate distance. This sounds like you’re regurgitating their talking points like a bot.
I conflated two points. Driver hits something due to sudden braking = they are liable. Driver hit from behind at high speed = dangerous for occupants. Either way, no one asked the driver.
Humans are horrible at driving. It’s not hard to be better at driving than the average human. Perfect doesn’t exist, and computer-driven cars will always make some mistakes, but so do humans (and media will report on self-driving cars much more than on the thousands of vehicle deaths caused by human error). AEB and other technologies have already made cars much safer over the previous decades.
There’s no evidence that self-driving can be better. It’s purely faith.
Drivers are not horrible; rather, horrible drivers can get a license. Treating cars as a right makes that worse. Self-driving makes that worse.