
Another notch on Tesla's handgrips...

Discussion in 'Technology' started by bitzman, Dec 10, 2019.


  1. bitzman

    bitzman F1 Rookie
    BANNED

    Feb 15, 2008
    3,287
    Ontario, CA
    Full Name:
    wallace wyss
    from My Car Quest (with permission of the author)
    Another Tesla Driver’s Family Sues The Company

    Another Tesla has killed its driver (on March 1, 2019). This is the fourth known person to die while using Autopilot, and his family is the second to sue Tesla over a fatal crash involving the technology. It has already been determined that the Autopilot system was on.

    Again a tractor-trailer was cutting across its bow, to use sailor talk; the Tesla didn’t see it and tried to go under it. Virtually identical to the first such death, reported back in May 2016, when a truck cut across the Tesla’s path and the driver was killed.

    In his defense, Musk says that the system isn’t automatic: it requires that drivers remain attentive and ready to take control of the car. But it’s obvious owners are putting too much faith in it. After all, it is seemingly viceless, so why not relax and let the car do the steering?

    And yet Musk keeps the system’s name “Autopilot,” making it seem like the autopilot in an airplane, which truly can fly on its own. Why can’t Musk just admit his cars need another supplementary system to spot trucks? Those four drivers died because of Musk’s stubbornness: first, his refusal to change the name to something that is not such a blatant lie, and second, his insistence on keeping the name Autopilot as a sales attraction, encouraging potential buyers to think the perfect system has already been invented and is available on Tesla vehicles.

    Tesla might argue in court that in this latest crash their system did not detect the driver’s hands on the wheel for the eight seconds before the crash, but I say again: it’s the name of the feature, “AUTOpilot,” that implies the car is being automatically piloted. Not to worry, right?

    [Image from Tesla]

    It is true that in this latest accident the owner was speeding, going 68 mph where the speed limit was 55 mph, and failed to make any last-minute evasive maneuvers. What happened next is gruesome. As the car went under the truck, or to be more exact, the car body went under; as in the first Tesla death, the roof didn’t make it and was shorn off.

    Copywriters must be in short supply at Tesla’s ad agency if no one can think of a name for this dubious feature that doesn’t lead people astray as to its capabilities.

    It is difficult to re-create an accident, but it could be done. If the Tesla Autopilot had signaled a danger, I think it could be proven that the driver still had a few seconds to stop. But that driver was trusting the Autopilot to keep him out of trouble.

    I compare the Autopilot system on a Tesla to a guest you are reluctant to invite to a party. What if someone told you that you should invite Richard, who is well groomed, erudite, and indeed charming but the rumor is that, every few months at a party, he takes out a knife and cuts another guest’s head off? Gives you a moment of pause, doesn’t it?
     
  2. Jaguar36

    Jaguar36 Formula Junior

    Nov 8, 2010
    834
    Cherry Hill, NJ
    An end user used a product in a manner other than prescribed by the manufacturer, just like all the accidents caused by people talking on their cell phones while driving. Tesla is not to blame here; the end user is, just like in every other crash when someone is driving.

    This also ignores the countless lives that have been saved by Autopilot reacting faster, maintaining constant vigilance, or seeing what humans cannot.
     
    ghibliman and BOKE like this.
  3. mavila

    mavila Rookie

    Jun 13, 2019
    17
    Bay Area
    Full Name:
    mavila
    What Jaguar said . . .
     
  4. LVP488

    LVP488 F1 Rookie

    Jan 21, 2017
    4,839
    France
    An "autopilot" that requires you to be attentive all the time is completely useless - even if it's sold as such.
    If you are actually attentive, why on earth would you need an autopilot?
    While legally Tesla has a point, they are not really honest: in the disclaimers they say (in other words) that their system is useless, while in marketing they claim it’s worthwhile.
    I do not know about this latest accident, but IIRC in the 2016 accident the system actually detected the truck; it just failed to identify it as a truck (or a real danger).
    It also has to do with the willingness to make the system appear smarter than it actually is, which again is not really honest in my view.
    The thing is that systems attempting to self-drive typically have a percentage of unrecognised situations, of which it is known that only a small percentage represent actual hazards.
    When the system is designed by a serious manufacturer, it will react as if there is a danger every time, most often by applying the brakes, and thus expose its failure to identify correctly in nearly all of the unidentified circumstances. Tesla decided to go the other way round, making a risky bet.
    To illustrate this, let's consider arbitrary percentages:
    - the system correctly identifies obstacles 90% of the time
    - 10% of the time, it does not know whether it's a real obstacle (could be one, but also a light reflection, a flying leaf or whatever)
    - of these 10%, the designers know that statistically only 10% represent real dangers (so actually only 1% of the total).
    Now the options are:
    - be cautious (which will basically result in applying the brakes in all not-understood situations)
    - pretend to be smart (always ignore the not-understood situations)
    With the percentages above, the same system (which actually fails 10% of the time) will appear to react correctly in 91% of situations if it is cautious: it is actually correct 90% of the time and it reacts to the 1% of genuinely dangerous situations, though it apparently reacts for no reason in 9% of cases. If it pretends to be smart, it will appear to react correctly in 99% of situations, because the 9% of unidentified cases that are not dangerous will look as if they were interpreted correctly, although they were simply ignored, like the remaining 1% that will let the accident happen.
    Of course my percentages are made up, but I cannot help thinking that Tesla is pretending to be smart along these lines.
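    To make the arithmetic concrete, here is a minimal sketch in Python of the trade-off described above. Every number is one of the made-up percentages from this post, not a real Tesla figure:

        # Illustrative assumptions only (LVP488's made-up percentages).
        identified = 0.90        # situations the system classifies correctly
        unknown = 0.10           # situations it cannot classify
        danger_share = 0.10      # fraction of unknowns that are real hazards

        real_hazard = unknown * danger_share       # 1% of all situations
        harmless_unknown = unknown - real_hazard   # 9% of all situations

        # Cautious strategy: brake on every unknown situation.
        # Looks correct for identified cases plus the real hazards it
        # braked for, at the cost of phantom braking on harmless unknowns.
        cautious_correct = identified + real_hazard        # 0.91
        cautious_false_alarms = harmless_unknown           # 0.09

        # "Pretend to be smart" strategy: ignore every unknown situation.
        # Harmless unknowns look correctly handled; real hazards crash.
        smart_correct = identified + harmless_unknown      # 0.99
        smart_missed = real_hazard                         # 0.01

        print(f"cautious: looks right {cautious_correct:.0%}, "
              f"phantom brakes {cautious_false_alarms:.0%}")
        print(f"'smart':  looks right {smart_correct:.0%}, "
              f"missed hazards {smart_missed:.0%}")

    Run it and the cautious system looks right 91% of the time with 9% phantom braking, while the "smart" system looks right 99% of the time at the price of 1% missed hazards, exactly the risky bet described above.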
     
  5. dustman

    dustman F1 Veteran
    Rossa Subscribed

    Jun 12, 2007
    8,935
    On my second high-end Tesla.
    This one I didn’t purchase the auto driving for, as imo it is pre-beta and not safe enough for all driving conditions.
    The SW updated on its own the day before Christmas, as usual. Drove a few miles from home, as I have hundreds of times, same route, nothing special. The car suddenly screams warnings, red flashing dash lights, and the steering wheel jerks on its own to the left toward oncoming traffic. Had to manhandle the wheel back hard to the right. Absolute wtf moment.
    Go home, review the new SW notes on the big screen, find some hidden new toggle, and turn it the f*ck off. Love Teslas, but wtf, aholes.
    There’s a reason I didn’t buy your auto crap, and you force this new feature on me?
    Complaint sent.
     
    sf_hombre and JaguarXJ6 like this.
