A fan of Tesla might think the automaker just can’t catch a break when it comes to its autonomous driving tech. It’s already subject to several federal investigations over its marketing and deployment of technologies like Autopilot and Full Self-Driving (FSD), and as of last week, we can add another to the list, this one covering roughly 2.4 million Tesla vehicles. This time, regulators are assessing the cars’ performance in low-visibility conditions after four documented accidents, one of which resulted in a fatality.
The National Highway Traffic Safety Administration (NHTSA) says the new probe is looking at incidents in which FSD was engaged in fog, in airborne dust, or when glare from the sun blinded the car’s cameras and contributed to a crash.
What the car can “see” is the big issue here. It’s also what Tesla bet its future on.
Musk is, of course, right that humans manage to drive with little more than their eyes. The “only” thing he forgot is that his vision-only model needs full human-level artificial intelligence behind it to work.
Very genius.
Musk is also forgetting that humans use other senses when driving, not just their sight.
I use echolocation by screaming at other drivers.
And even then, it would only be able to drive as well as a human, and humans kill tons of people on the highways.
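For anyone who wants the sensor argument made concrete, here’s a minimal Python sketch. It is purely illustrative, with made-up numbers, and is not how Tesla’s (or anyone’s) driving stack actually works: it just shows why a redundant sensor such as radar keeps a distance estimate usable when glare or fog degrades the camera, while a camera-only system has nothing to fall back on.

```python
# Toy illustration: confidence-weighted fusion of two distance estimates.
# All values are hypothetical; the point is the contrast, not the numbers.

from dataclasses import dataclass


@dataclass
class Reading:
    distance_m: float   # estimated distance to the obstacle ahead
    confidence: float   # 0.0 (useless) to 1.0 (fully trusted)


def fuse(camera: Reading, radar: Reading) -> float:
    """Average the two estimates, weighted by how much each sensor is trusted."""
    total = camera.confidence + radar.confidence
    if total == 0:
        raise RuntimeError("No usable sensor data")
    return (camera.distance_m * camera.confidence
            + radar.distance_m * radar.confidence) / total


# Clear day: the camera is trusted and dominates the estimate.
clear = fuse(Reading(48.0, 0.9), Reading(50.0, 0.8))

# Sun glare: the camera's confidence collapses, but radar still works,
# so the fused estimate stays sane. Camera-only keeps the bad reading.
glare_fused = fuse(Reading(12.0, 0.05), Reading(50.0, 0.8))
glare_camera_only = 12.0

print(f"clear conditions:   {clear:.1f} m")
print(f"glare, fused:       {glare_fused:.1f} m")
print(f"glare, camera only: {glare_camera_only:.1f} m")
```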