Judge finds ‘reasonable evidence’ Tesla knew self-driving tech was defective
Ruling clears way for lawsuit brought against company over fatal crash in 2019 in which Stephen Banner was killed near Miami

  • PlasmaDistortion · 54 points · 8 months ago

    It’s asinine that Tesla is trying to do full self-driving without actually using some sort of LiDAR. Using video/photos to judge distance is just unreliable and stupid.

    • @xanu@lemmy.world · 33 points · 8 months ago

      but it’s MUCH cheaper, so, in keeping with every other shitty idea he’s ever had, Musk was REALLY banking on Tesla engineers making a crazy breakthrough so he could reap billions in reward.

      It worked at SpaceX because of a perfect concoction: all the best rocket scientists and engineers wanted to work at SpaceX, since it was one of the only space programs not owned by a government and could push the boundaries; the technology was possible and wildly practical to implement; and there were massive government subsidies.

      Tesla is in the car market, which is notoriously competitive. While they do have massive government subsidies, they don’t have the best engineers, and Musk’s insistence that they “figure out” how to shove autonomous driving into a medium that simply doesn’t provide enough information drives even the better engineers away.

      I really wish my government would stop funding his ego and let his fantasy projects die already.

      • PlasmaDistortion · 9 points · 8 months ago

        The tech has gotten so cheap now that there is no reason to skimp out on it.

        • Gormadt · 10 points · 8 months ago

          Oh there definitely is, marginally higher profits at the cost of public safety

          A tale as old as capitalism: short term profit first, who gives a shit about later

    • @Catoblepas@lemmy.blahaj.zone · 7 points · 8 months ago

      Sitting in a Tesla and watching it try to understand anything other than highway driving is so unnerving. It gets so much wrong about other cars’ direction of travel that it’s not too shocking that one occasionally gets plowed into or plows into someone else.

    • fmstrat · 7 points · edited · 8 months ago

      With two offset cameras, depth is reliable, especially when using a wide-angle and a narrow-angle lens offset from each other. This is what OpenPilot does with the Comma 3 (FOSS self-driving); rough sketch of the idea below.

      Radar is better, but some automotive radar seems to only be great at short ranges (from my experience with my fork of OP in combination with radar built into a vehicle).
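
      For anyone curious what getting depth from two offset cameras actually looks like, here is a rough sketch of the textbook disparity approach in OpenCV. The focal length and baseline below are made-up placeholders, not numbers from OpenPilot or the Comma 3:

      ```python
      import cv2
      import numpy as np

      # Placeholder calibration values -- a real rig measures these per camera pair.
      FOCAL_LENGTH_PX = 910.0  # focal length in pixels (assumed)
      BASELINE_M = 0.12        # distance between the two cameras in metres (assumed)

      def depth_from_stereo(left_gray: np.ndarray, right_gray: np.ndarray) -> np.ndarray:
          """Estimate per-pixel depth (metres) from a rectified grayscale stereo pair."""
          # Block matching finds how far each small patch shifted between the two images.
          matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
          disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0

          depth = np.full(disparity.shape, np.inf, dtype=np.float32)
          valid = disparity > 0
          # Pinhole relation: depth = focal_length * baseline / disparity.
          depth[valid] = FOCAL_LENGTH_PX * BASELINE_M / disparity[valid]
          return depth
      ```

      The catch is that this is only as good as the matching: low texture, glare, or a dirty lens gives you garbage disparity and therefore garbage depth.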

      • @lloram239@feddit.de · 11 points · edited · 8 months ago

        > depth is reliable

        What if one of them is dirty? What if you are driving with the sun right in front of you? What about a foggy winter day? The big problem here isn’t even what the cameras are or aren’t capable of, but that there is little to no information on all the situations Tesla’s autopilot will fail in. We are only really learning that one deadly accident at a time. The documentation of the autopilot’s capabilities is extremely lacking; it’s little more than “trust me bro” and “keep your hands on the wheel”.

        The fact that it can’t even handle railroad crossings and thinks trains are a series of trucks and buses that blink in and out of existence and randomly change directions does not make me wanna blindly trust it.

        • fmstrat · -2 points · 8 months ago

          Do you work in the field? Sun/fog/etc are all things that can be handled with exposure adjustments. It’s one place a camera is more versatile than our eyes.
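
          To make “exposure adjustments” concrete, here is a toy version of a mean-brightness auto-exposure loop. It is just the general idea, not anything lifted from OpenPilot or Tesla, and the constants are assumptions:

          ```python
          import numpy as np

          # Illustrative limits -- real sensors expose their own ranges via the driver.
          TARGET_MEAN_LUMA = 0.45  # desired average brightness, 0..1 (assumed)
          MIN_EXPOSURE_MS = 0.05   # shortest exposure the sensor allows (assumed)
          MAX_EXPOSURE_MS = 30.0   # longest exposure before motion blur dominates (assumed)

          def next_exposure(frame_gray: np.ndarray, current_exposure_ms: float) -> float:
              """One step of a simple auto-exposure loop: nudge exposure toward a target brightness."""
              mean_luma = float(frame_gray.mean()) / 255.0
              # Driving into the sun -> frame too bright -> shorten exposure; fog or dusk -> lengthen it.
              correction = TARGET_MEAN_LUMA / max(mean_luma, 1e-3)
              # Square-root damping keeps the loop from oscillating frame to frame.
              new_exposure_ms = current_exposure_ms * (correction ** 0.5)
              return float(np.clip(new_exposure_ms, MIN_EXPOSURE_MS, MAX_EXPOSURE_MS))
          ```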

          All that being said, my experience is from indirect work on OpenPilot, not Tesla: a system that’s not commonly used by the average person and that doesn’t claim to offer commercial FSD.

      • MeanEYE · 6 points · 8 months ago

        > depth is reliable

        No it’s not. The world is filled with optical illusions that even our powerful brains can’t process, and yet you expect two webcams to do it. And depth is not the only thing that’s needed when it comes to autonomous driving; distance is an absolute factor. Case in point: at least two bikers were killed because their bikes had two tail lights instead of one, and Tesla thought it was a car far away instead of a motorcycle close by. Ran them over as if they were not there. We as humans would see that rider and realize it’s a motorcycle, first because of the sound and second because our brains are better at reasoning, and we’d avoid the situation. This is why cars MUST have more sensors, because the processing is lacking so much.
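
        To put rough numbers on that ambiguity (the spacings below are my guesses, not measurements from the actual crashes):

        ```python
        # Back-of-the-envelope geometry for the taillight confusion described above.
        CAR_TAILLIGHT_SPACING_M = 1.5    # typical car taillight spacing (assumed)
        BIKE_TAILLIGHT_SPACING_M = 0.25  # motorcycle with two closely spaced lights (assumed)

        def apparent_angle_rad(spacing_m: float, distance_m: float) -> float:
            """Angle the two lights subtend at a single camera (small-angle approximation)."""
            return spacing_m / distance_m

        # A motorcycle 10 m ahead subtends the same angle as a car 60 m ahead,
        # so that one monocular cue alone cannot tell them apart.
        print(apparent_angle_rad(BIKE_TAILLIGHT_SPACING_M, 10.0))  # 0.025 rad
        print(apparent_angle_rad(CAR_TAILLIGHT_SPACING_M, 60.0))   # 0.025 rad
        ```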

    • @topinambour_rex@lemmy.world · 0 points · 8 months ago

      > Using video/photos to judge distance is just unreliable and stupid.

      It all depends on how powerful the computer managing the data is. A human brain does the job, for example.

      • @lloram239@feddit.de · 16 points · 8 months ago

        Humans don’t have the best track record when it comes to safe driving, might not be the best idea to imitate them when there is better tech around.

        • @topinambour_rex@lemmy.world · 0 points · edited · 8 months ago

          But how many of those bad records are due to their eyes?

          Most of those bad records are caused by bad decisions (checking the phone, speeding, cutting across lanes, etc.). It is rarely due to bad eyesight.

          The issue with lidar is bad weather. If it rains, or it’s foggy, it doesn’t work or gives weird results.

          Apparently there is some radar which can see through bad weather.

      • Chaotic Entropy · 15 points · edited · 8 months ago

        I thought the whole point was to overcome human shortcomings, not just make a worse version of a human driver. Humans don’t even rely purely on visual cues.