• SkyezOpen@lemmy.world · 2 hours ago

        It started out promising, then was condemned to be shit when Elon swore off lidar. If he’d kept his shitty little hands off management and let the engineers do their thing, it could’ve been great.

  • atlien51@lemm.ee · 13 hours ago

    Elongated Musketon: UM THAT WAS JUST 1 FAULTY MODEL STOP CHERRY PICKING GUYS JUST BUY IT!!!1

      • hydroptic@sopuli.xyz · 4 hours ago

        I mean, he did specifically come up with his idiotic “Hyperloop” concept to kill California’s high-speed rail project.

    • lsibilla@lemm.ee · 11 hours ago

      For as much as I’d like to see Tesla stock crash these days, and without passing judgment on the whole autonomous-car topic, this IS cherry-picking.

      Human drivers aren’t exactly flawless either, but we don’t ban human-driven cars because some people act recklessly or because someone had a seizure while driving.

      If self-driving cars are statistically safer, I’d rather have them and reduce the risk of coming across another reckless driver.

      • leftytighty@slrpnk.net · 1 hour ago

        Yes, we should be doing more to reduce driving. It’s relatively unsafe, and I’m sick of our lived environments being designed for cars instead of people.

    • jj4211@lemmy.world · 4 hours ago

      The thing that strikes me about both this story and the thing you posted is that the people in the Tesla seem to be like “this is fine” as the car does some pretty terrible stuff.

      In that one, the Tesla fails to honor a forced left turn, instead opting to go straight into the oncoming lanes and waggle about, causing other cars to honk at it, while the human just sits there without trying to intervene. Meanwhile they describe it as a “navigation issue/hesitation,” which really understates what happened.

      The train one didn’t come with video, but I can’t imagine just letting my car turn itself onto tracks and travel 40 feet without reacting.

      If my Ford so much as thinks about drifting toward another lane, I’m intervening, even if it was really going to be no big deal. I can’t imagine this level of “oh well.”

      Tesla drivers/riders are really nuts…

  • snooggums@lemmy.world · 1 day ago

    Paraphrasing:

    “We only have the driver’s word they were in self driving mode…”

    “This isn’t the first time a Tesla has driven onto train tracks…”

    Since it isn’t the first time, I’m gonna go ahead and believe the driver, thanks.

    • Pika@sh.itjust.works · 1 day ago

      Furthermore, with the amount of telemetry those cars have, the company knows whether it was in self-driving mode when it went onto the tracks. So the fact that they didn’t go public saying it wasn’t means that it was, and they want to save face on the PR and limit liability.

      • IphtashuFitz@lemmy.world · 1 day ago

        I have a nephew who worked at Tesla as a software engineer for a couple of years (he left about a year ago). I gave him the VIN of my Tesla, and the amount of data he shared with me was crazy. He warned me that one of my brake lights was regularly logging errors. If their telemetry includes that sort of information, then clearly they are logging a LOT of data.

          • Pika@sh.itjust.works · 23 hours ago

            Dude, in today’s world we’re lucky if it stops at the manufacturer. I know of a few insurers that have contracts with major dealers and automatically get the data logged by the car’s systems, so they can make better decisions about people’s car insurance.

            Nowadays it’s a red flag if you sign up with a car insurer and they don’t offer you a discount for installing something like Drive Pass, which logs your driving, because it probably means your car is already sending them that data.

      • catloaf@lemm.ee · 1 day ago

        I’ve heard they also like to disengage self-driving mode right before a collision.

        • sylver_dragon@lemmy.world · 1 day ago

          That actually sounds like a reasonable response. Driver assist means that a human is supposed to be attentive and ready to take control. If the system detects a situation where it’s unable to make a good decision, dumping that decision on the human in control seems like the closest thing they have to a “fail safe” option. Of course, there should probably also be an understanding that people are stupid and will almost certainly have stopped paying attention a long time ago. So, maybe a “human take the wheel” followed by a “slam the brakes” if no input is detected in 2-3 seconds. While an emergency stop isn’t always the right choice, it probably beats leaving a several-ton metal object hurtling along uncontrolled in nearly every circumstance.
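
          A toy sketch of that fallback (all names and timings here are invented for illustration; this is not anything Tesla actually runs):

          ```python
          import time

          TAKEOVER_GRACE_S = 2.5  # assumed window for the human to respond

          def on_low_confidence(vehicle):
              """Hand off to the human; emergency-stop if nobody takes over."""
              vehicle.alert_driver("TAKE THE WHEEL")   # hypothetical API
              deadline = time.monotonic() + TAKEOVER_GRACE_S
              while time.monotonic() < deadline:
                  if vehicle.driver_input_detected():  # steering/pedal activity
                      return                           # human has control
                  time.sleep(0.05)
              # Nobody responded: stopping isn't always right, but it beats an
              # uncontrolled several-ton object in nearly every circumstance.
              vehicle.emergency_brake()
          ```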

          • zaphod@sopuli.xyz · 1 day ago

            That actually sounds like a reasonable response.

            If you give the driver enough time to act, that is, which Tesla doesn’t. They turn it off a second before impact and then claim it wasn’t in self-driving mode.

            • whotookkarl@lemmy.world · 22 hours ago

              Not even a second; it’s sometimes less than 250-300 ms. If I hadn’t already been anticipating it failing and disengaging as it went through the two-lane-wide turn, I would have gone straight into oncoming traffic.

          • nthavoc@lemmy.today · 23 hours ago

            So, maybe a “human take the wheel” followed by a “slam the brakes” if no input is detected in 2-3 seconds.

            I have seen reports where the Tesla logic appears to be “Human, take the wheel, since the airbag is about to deploy in the next 2 microseconds after solely relying on camera object detection, and this is totally YOUR fault, kthxbai!” If there were an option for the bot to physically bail out of the car as it rolls you onto the tracks while you’re still sitting in the passenger seat, that’s how I’d envision this autopilot safety function working.

          • elucubra@sopuli.xyz · 1 day ago

            I don’t know if that is still the case, but a lot of electronics in the US used to come with warnings, with pictures, like “don’t put it in the bath,” and the like.

            People are dumb, and you should take that into account.

        • GreenBottles@lemmy.world · 1 day ago

          That sounds a lot more like a rumor to me… it would be extremely suspicious and would leave them open to GIGANTIC liability issues.

          • catloaf@lemm.ee · 1 day ago

            In the report, the NHTSA spotlights 16 separate crashes, each involving a Tesla vehicle plowing into stopped first responders and highway maintenance vehicles. In the crashes, it claims, records show that the self-driving feature had “aborted vehicle control less than one second prior to the first impact”

            https://futurism.com/tesla-nhtsa-autopilot-report

          • sem@lemmy.blahaj.zone · 1 day ago

            It’s been well documented. It lets them say in their statistics that the owner was in control of the car during the crash.

          • ayyy@sh.itjust.works · 1 day ago

            How so? The human in the car is always ultimately responsible when using level 3 driver assists. Tesla does not have level 4/5 self-driving and therefore doesn’t have to assume any liability.

            • Pika@sh.itjust.works · 23 hours ago

              This right here is another fault in regulation that will eventually catch up with us. Especially with Level 3, where it’s primarily the vehicle driving and the driver just gives periodic input, it’s not the driver that’s in control most of the time; it’s the vehicle, so it should not be the driver at fault.

              Honestly, I think everything up to Level 2 should be the driver’s fault, because those levels require constant driver input. However, Level 3 (conditional driving) and higher should be considered the company’s liability, unless the company can prove the autonomous system handed control back to the driver in a human-capable manner (i.e., not within the last second, like Tesla currently does).
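
              As a toy version of that rule (the threshold value is made up; the point is the shape of the rule, not any actual regulation):

              ```python
              def liable_party(sae_level: int, handover_lead_s: float | None) -> str:
                  """Toy liability rule: the driver owns Levels 0-2; at Level 3+
                  the maker is liable unless control came back with enough lead time."""
                  HUMAN_CAPABLE_HANDOVER_S = 10.0  # assumed adequate warning window
                  if sae_level <= 2:
                      return "driver"        # these levels require constant input
                  if handover_lead_s is not None and handover_lead_s >= HUMAN_CAPABLE_HANDOVER_S:
                      return "driver"        # control was handed back in time
                  return "manufacturer"      # Level 3+, last-second or no handoff

              print(liable_party(2, None))   # driver
              print(liable_party(3, 0.3))    # manufacturer
              ```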

    • Mouselemming@sh.itjust.works · 1 day ago

      Since the story has 3 separate incidents where “the driver let their Tesla turn left onto some railroad tracks” I’m going to posit:

      Teslas on self-drive mode will turn left onto railroad tracks unless forcibly restrained.

      Prove me wrong, Tesla

      • Tarquinn2049@lemmy.world · 23 hours ago

        Map data obtained and converted from other formats often ends up accidentally condensing labeling categories. One result is train tracks being categorized as generic roads instead of retaining their specific sub-heading. Another, unrelated to this but common for people who play geo games, is forests and water areas ending up tagged as the wrong specific types.
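
        Roughly this failure mode (the tags are OSM-style, but the conversion table is made up to show the effect, not taken from any real product):

        ```python
        # Coarse target schema: every source feature must land in one of a few
        # generic buckets, so subcategories without a match get shoehorned in.
        GENERIC_SCHEMA = {
            "highway=residential": "road",
            "highway=service":     "road",
            "railway=rail":        "road",    # no "rail" bucket in the target
            "natural=wood":        "forest",
            "landuse=forest":      "forest",
            "natural=water":       "water",   # lake/reservoir/etc. all flatten
        }

        def convert(tags):
            """Collapse rich source tags into the target's generic types."""
            return [GENERIC_SCHEMA.get(t, "unknown") for t in tags]

        print(convert(["railway=rail", "highway=service"]))  # ['road', 'road']
        ```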

        • Mouselemming@sh.itjust.works · 23 hours ago

          Aha. But that sounds correctable… So not having anyone assigned to checking on railroads and making sure the system recognizes them as railroads would be down to miserliness on Tesla’s part, then… And it might also say something about why some Teslas have been known to drive into bodies of water (or children, but that’s probably a different instance of miserliness).

      • AA5B@lemmy.world · 1 day ago

        I mean… Tesla self-driving allegedly did this three times in three years, but we don’t yet have public data to verify that’s what happened, nor do we have any comparison to what human drivers do.

        Although one of the many ways I think I’m an above-average driver (just like everyone else) is that people do a lot of stupid things at railroad crossings, and I never would.

        • Mouselemming@sh.itjust.works · 1 day ago

          I’m pretty sure Tesla self-drive does a lot of stupid things you never would, too. That’s why they want you at the wheel, paying attention and ready to correct it in an instant! (Which defeats the whole benefit of self-drive mode, imho, but whatever.)

          The fact that they can avoid all responsibility and blame you for their errors is, of course, the other reason.

    • XeroxCool@lemmy.world · 1 day ago

      The ~2010 runaway-Toyota hysteria was ultimately blamed on mechanical problems in less than half the cases. Floor mats jamming the pedal, drivers mixing up the gas and brake pedals in a panic, outright lying to evade a speeding ticket, etc. accounted for many of them.

      Should a manufacturer be held accountable for legitimate flaws? Absolutely. Should drivers be absolved without the facts just because we don’t like a company? I don’t think so. But if Tesla has proof FSD was off, we’ll know in a minute, when they invade the driver’s privacy and release the driving events.

      • snooggums@lemmy.world · 1 day ago

        Tesla has constantly lied about their FSD for a decade. We don’t trust them because they are untrustworthy, not because we don’t like them.

        • BlueLineBae@midwest.social · 1 day ago

          I have no sources for this, so take it with a grain of salt… but I’ve heard that Tesla turns off self-driving just before an accident so they can say it was the driver’s fault. In this case, if it was on while the car drove onto the tracks, I would think that proves both Tesla’s faulty self-driving and human error for not correcting it. Either way, it would be partly Tesla’s fault if it was on at the time.

          • meco03211@lemmy.world · 1 day ago

            Pretty sure they can tell the method used when disengaging FSD/AP, so they would know whether it was manually turned off or the system lost enough info and shut itself down. They should be able to reconstruct the order of events to within a few seconds of accuracy. I can’t imagine a scenario that wouldn’t be blatantly obvious where the Tesla determined an accident was imminent and shut off FSD/AP with enough time to “blame it on the driver.” What might be possible is that the logs show FSD shut off a millisecond before the impact/event and someone merely reported that FSD was not engaged at the time of the accident. Technically true, and Tesla lawyers might fight like hell to maintain that theory, but if an independent source is able to review the logs, I don’t see that holding up.
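
            Something like this is all an independent reviewer would need to run (field names invented for illustration; I’m not claiming this is Tesla’s actual log schema):

            ```python
            from dataclasses import dataclass

            @dataclass
            class LogEvent:
                t: float         # seconds on the vehicle clock
                kind: str        # "fsd_disengaged", "impact", ...
                cause: str = ""  # "driver_override" vs "system_abort"

            def disengagement_report(events):
                """How, and how long before impact, did FSD disengage?"""
                impact = next(e for e in events if e.kind == "impact")
                offs = [e for e in events
                        if e.kind == "fsd_disengaged" and e.t <= impact.t]
                last = max(offs, key=lambda e: e.t)
                return (f"FSD off {impact.t - last.t:.3f}s before impact "
                        f"(cause: {last.cause})")

            log = [LogEvent(100.0, "fsd_disengaged", "system_abort"),
                   LogEvent(100.7, "impact")]
            print(disengagement_report(log))
            # FSD off 0.700s before impact (cause: system_abort)
            ```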

            • pixeltree@lemmy.blahaj.zone · 1 day ago

              Of course they know; they’re using it to hide the truth. Stop giving a corporation the benefit of the doubt where public safety is concerned, especially when they’ve been shown to abuse it in the past.

          • AA5B@lemmy.world · 1 day ago

            They supposedly also have a threshold, like ten seconds: if FSD cuts out less than that long before the accident, it’s still counted as FSD’s fault.

          • SoleInvictus@lemmy.blahaj.zone · 1 day ago

            That would require their self-driving algorithm to actually detect an imminent accident. I doubt it’s capable of doing so consistently.

          • snooggums@lemmy.world · 1 day ago

            On a related note, getting unstuck from something like train tracks is a pretty significant hurdle. The only real way out is to back up, IF turning onto the tracks didn’t involve a drop of the same depth as the rails. Someone caught off guard isn’t going to be able to steer a passenger car off the tracks, because the rails are tall and there’s no angle that gets the wheels up over them.

            So while in a perfect world the driver would have slammed on the brakes before the car got onto the tracks, if they weren’t fast enough and even the front wheels made it on, that may have been impossible to recover from, and going forward might have been their best bet. It depends on how the crossing is built.

        • AA5B@lemmy.world · 1 day ago

          They promote it in ways that lead people to trust it too much… but specifically on releasing telemetry, I don’t remember that ever being an accusation.

          • ayyy@sh.itjust.works · 1 day ago

            It’s more about when they don’t release it, or only selectively say things that make them look good while staying silent when they look bad.

      • aramis87@fedia.io · 1 day ago

        How is a manufacturer going to be held responsible for their flaws when Musk DOGE’d every single agency investigating his companies?

      • Lka1988@lemmy.dbzer0.com · 1 day ago

        The ~2010 runaway-Toyota hysteria was ultimately blamed on mechanical problems in less than half the cases.

        I owned an FJ80 Land Cruiser when that happened. I printed up a couple of stickers, for myself and for a buddy who owned a Tacoma, that said “I’m not speeding, my pedal’s stuck!” (Yes, I’m aware the FJ80 was slow as dogshit; that didn’t stop me from speeding.)

    • TheKingBee@lemmy.world · 20 hours ago

      Maybe I’m missing something, but isn’t it trivial to take it out of their bullshit dangerous “FSD” mode and take control? How does a car go approximately 40-50 feet down the tracks without the driver noticing and stopping it?

      • snooggums@lemmy.world · 20 hours ago

        At some railroad crossings, you might only need to go just past the crossing to get stuck on the tracks and unable to back out. Trying to get out from there is another 30-40 feet.

        Being caught off guard when the car isn’t supposed to do that is how you get stuck in the first place. Yeah: a terrible driver trusting shit technology.

    • NuXCOM_90Percent@lemmy.zip · 1 day ago

      I mean… I have seen some REALLY, REALLY stupid drivers, so I could totally see multiple people thinking they’d found a shortcut, or not realizing the road they’re supposed to be on is 20 feet to the left and that there’s a reason their phone is losing its shit, all while their suspension is getting destroyed.

      But yeah, it’s the standard Tesla corp MO. They detect a dangerous situation and disable all the “self driving.” Obviously because it’s up to the driver to handle it, and not because they want the legal protection of saying it wasn’t their fault.

      • AA5B@lemmy.world · 1 day ago

        At my local commuter rail station the entrance to the parking lot is immediately next to the track. It’s easily within the margin of error for GPS, and if you’re only focusing immediately in front of you, the pavement at the entrance probably looks similar.

        There are plenty of cues, so a driver shouldn’t be fooled, but perhaps FSD wouldn’t pay attention to them, since this is a bit of an outlier.

        That being said, I almost got my Subaru stuck once because an ATV trail looked like the dirt road to a campsite on the GPS, and I missed any cues there may have been.

        • XeroxCool@lemmy.world · 14 hours ago

          Mixing up dirt roads at a campsite sounds reasonable. Idk why the other commenter had to be so uptight. I get the mix-up in the lot if it’s all paved and smooth, especially if, say, you make a left into the lot and the rail has a pedestrian crossing first. It shouldn’t happen, but there’s significant overlap in the appearance of the ground. The average driver is amazingly inept, inattentive, and remorseless.

          I’d be amused if your lot is the one I know of, where the train pulls out of the station, makes a stop for the crosswalk, then proceeds to just one other station.

          But the stretch of rail that isn’t paved in between? That should always be identifiable as train track. I can’t understand how people just send it down the tracks. And yet it still happens, even at the station mentioned above, where they pulled onto the 100 mph section. Unreal.

  • NotMyOldRedditName@lemmy.world · 24 hours ago

    How the fuck do you let any Level 2 system go 40 to 50 fucking feet down the railroad tracks?

    Were they asleep?

    • MBech@feddit.dk · 23 hours ago

      I’m not sure I’d be able to sleep through driving on railroad tracks. I’m going to guess this person was simply incredibly fucking stupid and thought the car would figure it out, instead of doing the bare fucking minimum of driving their goddamn two-ton death machine themself.

      • 6nk06@sh.itjust.works · 12 hours ago

        I’m going to guess this person was simply incredibly fucking stupid

        Well, the guy owned a Tesla; it was pretty obvious.

    • Darleys_Brew@lemmy.ml · 22 hours ago

      I was gonna say it’s not so much the fact that the car was hit by a train as that it turned onto the tracks… but 40 or 50 feet?

      • NotMyOldRedditName@lemmy.world · 22 hours ago

        Cop: WTF happened here?

        Driver: It drove itself onto the tracks

        Cop: Okay, but what about the other 49 feet of the 50 feet it’s on the tracks?

        Driver: …

  • J52@lemmy.nz · 17 hours ago
    17 hours ago

    Hope no one was hurt, regardless of whether they were stupid, distracted, or whatever! If we can’t build fail-safes into cars, what are our chances for real AI?

    • danhab99@programming.dev · 14 hours ago

      Okay, I don’t want to directly disagree with you; I just want to add a thought experiment:

      If it is a fundamental truth of the universe that a human literally cannot program a computer to be smarter than a human (because of some Neil deGrasse Tyson-esque interpretation of entropy), then no matter what, AIs will crash cars as often as real people do.

      And the question of who is responsible for an AI’s actions will always come back to a person, because people can take responsibility and AIs are just machine tools. This basically means there is a ceiling to how autonomous self-driving cars will ever be (because someone will have to sit at the controls, ready to take over), and I think that is a good thing.

      Honestly, I’m in the camp that computers can never truly be “smarter” than a person in all respects. Maybe you can max out an AI’s self-driving stats, but then you’ll have no points left over for morality; or you can balance the two out, and it might just get into less morally challenging accidents more often ¯\_(ツ)_/¯. There are lots of ways to look at this.

      • mojofrododojo@lemmy.world · 11 hours ago

        a human literally cannot program a computer to be smarter than a human

        I’d add that a computer vision system can’t integrate new information as quickly as a human, especially when limited to vision-only sensing, which Tesla is strangely obsessed with even as the cost of other sensors drops and their utility has been proven by Waymo’s excellent record.

        All in all, I see no reason to try to replace humans when we have billions of them. This is doubly so for “artistic” AI purposes: we have billions of people; let artists create the art.

        Show me an AI system that can clean my kitchen or do my laundry. That would be WORTH it.