Judge finds ‘reasonable evidence’ Tesla knew self-driving tech was defective

Ruling clears way for lawsuit brought against company over fatal crash in 2019 in which Stephen Banner was killed near Miami

  • PlasmaDistortion
    54 · 8 months ago

    It’s asinine that Tesla is trying to do full self driving without actually using some sort of LiDAR. Using video/photos to judge distance is just unreliable and stupid.

    • @xanu@lemmy.world
      33 · 8 months ago

but it’s MUCH cheaper, so, in keeping with every other shitty idea he’s ever had, Musk was REALLY banking on Tesla engineers making a crazy breakthrough so he could reap billions in reward.

It worked at SpaceX because of a perfect concoction: all the best rocket scientists and engineers wanted to work there, since it was one of the only space programs not owned by a government and could push the boundaries; the technology was both possible and wildly practical to implement; and there were massive government subsidies.

Tesla is in the car market, which is notoriously competitive. While they do have massive government subsidies, they don’t have the best engineers, and Musk’s insistence that they “figure out” how to shove autonomous driving into a medium that simply doesn’t provide enough information drives even the better engineers away.

      I really wish my government would stop funding his ego and let his fantasy projects die already.

      • PlasmaDistortion
        9 · 8 months ago

        The tech has gotten so cheap now that there is no reason to skimp out on it.

        • Gormadt
          10 · 8 months ago

          Oh there definitely is, marginally higher profits at the cost of public safety

          A tale as old as capitalism: short term profit first, who gives a shit about later

    • @Catoblepas@lemmy.blahaj.zone
      7 · 8 months ago

Sitting in a Tesla and watching it try to understand anything other than highway driving is so unnerving. It gets so much wrong about other cars’ direction of travel that it’s not too shocking that one occasionally is plowed into or plows into someone else.

    • fmstrat
      7 · edited · 8 months ago

With two offset cameras, depth is reliable, especially using a wide-angle and narrow-angle lens offset. This is what OpenPilot does with the Comma 3 (FOSS self-driving).

      Radar is better, but some automotive radar seems to only be great at short ranges (from my experience with my fork of OP in combination with radar built into a vehicle).
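The stereo setup the comment describes reduces to the standard pinhole relation Z = f·B/d. A minimal sketch, with hypothetical focal length and baseline values (not OpenPilot’s or Comma’s actual calibration), also shows why small-baseline rigs degrade at range:

```python
# Stereo depth sketch (illustrative numbers only): with two horizontally
# offset cameras, depth follows  Z = f * B / d,  where f is the focal
# length in pixels, B the baseline (camera separation), d the disparity.

def depth_from_disparity(disparity_px: float,
                         focal_px: float = 910.0,    # assumed focal length
                         baseline_m: float = 0.06) -> float:  # assumed 6 cm baseline
    """Return depth in metres for a matched feature's pixel disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A feature that shifts 5 px between the two images:
print(round(depth_from_disparity(5.0), 2))   # ~10.92 m

# At long range disparity shrinks toward zero, so a one-pixel matching
# error swings the estimate enormously -- the core limit of stereo depth.
print(round(depth_from_disparity(1.0), 2))   # ~54.6 m
```

The quadratic growth of depth error as disparity shrinks is why stereo alone is debated for highway-speed ranges, and why the comment below about radar at distance matters.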

      • @lloram239@feddit.de
        11 · edited · 7 months ago

        depth is reliable

What if one of them is dirty? What if you are driving with the sun right in front of you? What about a foggy winter day? The big problem here isn’t even what the cameras are or aren’t capable of, but that there is little to no information on all the situations Tesla’s autopilot will fail in. We are only really learning that one deadly accident at a time. The documentation of the autopilot’s capabilities is extremely lacking; it’s little more than “trust me bro” and “keep your hands on the wheel”.

The fact that it can’t even handle railroad crossings and thinks trains are a series of trucks and buses that blink in and out of existence and randomly change directions does not make me wanna blindly trust it.

        • @PipedLinkBotB
          1 · 8 months ago

          Here is an alternative Piped link(s):

          railroad crossing

          Piped is a privacy-respecting open-source alternative frontend to YouTube.

          I’m open-source; check me out at GitHub.

        • fmstrat
          -2 · 7 months ago

          Do you work in the field? Sun/fog/etc are all things that can be handled with exposure adjustments. It’s one place a camera is more versatile than our eyes.

All that being said, my experience is from indirect work on OpenPilot, not from Tesla: a system that’s not commonly used by the average person and makes no claims of commercial FSD.
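The exposure-adjustment point can be illustrated with a toy control loop (purely illustrative, not any vendor’s actual auto-exposure law): drive a gain so the frame’s mean luminance approaches a mid-grey setpoint, which is roughly how a camera can dial down direct sun or dial up a foggy scene faster than a human iris adapts.

```python
# Toy auto-exposure: proportional step toward a mid-grey luminance target.
# TARGET and the gain limits are assumed values for the sketch.

TARGET = 128.0   # mid-grey setpoint for 8-bit luminance

def update_exposure(gain: float, mean_luma: float, k: float = 0.4) -> float:
    """Nudge the gain toward the target; clamp to a plausible range."""
    error = (TARGET - mean_luma) / TARGET
    new_gain = gain * (1.0 + k * error)
    return max(0.01, min(16.0, new_gain))

# Simulate a washed-out frame (sun glare) converging toward the target:
gain, scene = 1.0, 240.0   # scene: luminance the sensor sees at gain 1.0
for _ in range(20):
    mean_luma = min(255.0, scene * gain)
    gain = update_exposure(gain, mean_luma)
print(round(min(255.0, scene * gain)))  # settles near 128
```

Real pipelines are far more involved (metering regions, HDR bracketing, flicker avoidance), but the principle is the same: the sensor’s dynamic range is a control variable, not a fixed limit like a human eye.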

      • MeanEYE
        6 · 7 months ago

        depth is reliable

No it’s not. The world is filled with optical illusions that even our powerful brains can’t process, and yet you expect two webcams to handle them. And depth is not the only thing that’s needed when it comes to autonomous driving; distance is an absolute factor. Case in point: at least two bikers were killed because their motorcycles had two tail lights instead of one, and Tesla thought each was a car far away instead of a motorcycle close by. It ran them over as if they were not there. We as humans would see the rider and realize it’s a motorcycle, first because of sound, second because our brain is better at reasoning, and we’d avoid the situation. This is why cars MUST have more sensors: the processing is lacking so much.
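The tail-light failure mode is easy to see in the arithmetic (illustrative numbers, not Tesla’s actual pipeline): a monocular distance estimate from apparent size needs an *assumed* real-world width, so misclassifying a motorcycle as a car silently scales the distance.

```python
# Pinhole size-to-distance estimate:  Z = f * W / w,  where W is the
# ASSUMED real-world width and w the apparent width in pixels.
# FOCAL_PX is a hypothetical focal length for the sketch.

FOCAL_PX = 1000.0

def distance_m(assumed_width_m: float, apparent_width_px: float) -> float:
    """Estimated distance in metres, given an assumed object width."""
    return FOCAL_PX * assumed_width_m / apparent_width_px

# The same 60-pixel-wide pair of lights, under two interpretations:
print(distance_m(0.9, 60.0))   # motorcycle ~0.9 m wide -> 15.0 m away
print(distance_m(1.8, 60.0))   # car ~1.8 m wide        -> 30.0 m away
```

Calling the close motorcycle a distant car doubles the estimated gap with no visual cue that anything is wrong, which is exactly the class of error an independent range sensor (radar or lidar) would catch.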


    • @topinambour_rex@lemmy.world
      0 · 8 months ago

      Using video/photos to judge distance is just unreliable and stupid.

It all depends on how powerful the computer managing the data is. A human brain does the job, for example.

      • @lloram239@feddit.de
        16 · 8 months ago

        Humans don’t have the best track record when it comes to safe driving, might not be the best idea to imitate them when there is better tech around.

        • @topinambour_rex@lemmy.world
          0 · edited · 7 months ago

          But how many of those bad records are due to their eyes ?

Most of those bad records come down to bad decisions (checking the phone, speeding, cutting lanes, etc.). It is rarely due to bad sight.

The issue with lidar is bad weather. If it rains, or it’s foggy, it doesn’t work, or gives weird results.

Apparently there is some radar which can see through bad weather.

      • Chaotic Entropy
        15 · edited · 7 months ago

        I thought the whole point was to overcome human shortcomings, not just make a worse version of a human driver. Humans don’t even rely purely on visual cues.

  • @50MYT@aussie.zone
    41 · edited · 8 months ago

    Tesla FSD… Coming 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020, 2021, 2022, 2023, 2024

    At this rate Ferrari will win another championship before FSD comes out

    • @MotoAsh@lemmy.world
      19 · 8 months ago

      The sun will explode before Tesla succeeds in making full self-driving work with only basic cameras.

        • MeanEYE
          3 · 7 months ago

          Which means pretty much nothing. Perception is just perception, not reliable absolute data.

          • MeanEYE
            1 · 7 months ago

Tesla doesn’t have radar. Just cameras; they removed the radar. So it’s blind right now.

  • AutoTL;DRB
    9 · 8 months ago

    This is the best summary I could come up with:


A judge has found “reasonable evidence” that Elon Musk and other executives at Tesla knew that the company’s self-driving technology was defective but still allowed the cars to be driven in an unsafe manner, according to a recent ruling issued in Florida.

    The lawsuit, brought by Banner’s wife, accuses the company of intentional misconduct and gross negligence, which could expose Tesla to punitive damages.

    The ruling comes after Tesla won two product liability lawsuits in California earlier this year focused on alleged defects in its Autopilot system.

    “It would be reasonable to conclude that the Defendant Tesla through its CEO and engineers was acutely aware of the problem with the ‘Autopilot’ failing to detect cross traffic,” the judge wrote.

    Bryant Walker Smith, a University of South Carolina law professor, told Reuters that the judge’s summary of the evidence was significant because it suggests “alarming inconsistencies” between what Tesla knew internally, and what it was saying in its marketing.

    “This opinion opens the door for a public trial in which the judge seems inclined to admit a lot of testimony and other evidence that could be pretty awkward for Tesla and its CEO,” Smith said.


    The original article contains 462 words, the summary contains 195 words. Saved 58%. I’m a bot and I’m open source!