Autopilot is overconfident. Had this been a fully autonomous car, a best-guess approximation of the lane based on one marking would be the right decision. But it's not fully autonomous, and it needs to stop pretending that it is.
See, a semi-autonomous system needs to be quicker to call the driver back into direct control. Waiting until both lane markings disappear is probably too late, so when you lose a marking, the car needs to tell the driver that Autopilot is out of its depth. That's what I've experienced in Volvos, BMWs and Lexuses. But time and time again, the Tesla seemed more concerned with looking like it knew what it was doing than keeping me safe.
I continued to test the system on my ride back from Detroit. I went to try the auto-lane-change feature, and as the car moved into the next lane I noticed a black Tahoe coming up fast. I jerked the car back into the original lane of travel and avoided a collision, but the car never seemed to register that it was moving itself into the path of a speeding, monstrous SUV.
Obviously, the car wasn't getting a lot of good information from its rear-facing sensors. And it's not hard to see why — Tesla uses ultrasonic sensors to detect vehicles in its blind spot, rather than the radar more commonly used for this job. That means limited range and limited visibility.
If the car can't see a Tahoe careening down on its keester, is it really fair to say it has all the hardware necessary for self-driving? Maybe once Tesla turns on more of the auxiliary cameras, the car will get a better view of what's going on, but for now I'm skeptical.