this post was submitted on 28 Aug 2023
301 points (96.9% liked)

Technology


Tesla braces for its first trial involving Autopilot fatality::Tesla Inc is set to defend itself for the first time at trial against allegations that failure of its Autopilot driver assistant feature led to death, in what will likely be a major test of Chief Executive Elon Musk's assertions about the technology.

top 50 comments
[–] luthis@lemmy.nz 61 points 1 year ago (5 children)

The headline makes it sound like Tesla is trialing a new 'fatality' feature for its autopilot.

[–] CmdrShepard@lemmy.one 17 points 1 year ago (14 children)

Well, someone has to invent the suicide booths featured in Futurama. Might as well be him.

[–] HiddenLayer5@lemmy.ml 6 points 1 year ago* (last edited 1 year ago)

With how Elon has been acting this is a distinct possibility.

It would probably scream "Xterminate!" before running you over.

[–] FlyingSquid@lemmy.world 5 points 1 year ago (1 children)

Mortal Combat: Vehicle Edition

[–] space@lemmy.dbzer0.com 4 points 1 year ago

Carmaggedon

[–] Lucidlethargy@sh.itjust.works 5 points 1 year ago

The reality is that they didn't trial it at all, they just sent it straight to production. In this case, it successfully achieved a fatality.

[–] torpak@discuss.tchncs.de 2 points 1 year ago

I'm literally waiting for the moment when a disproportionate amount of Musk critics die in car crashes.

[–] whataboutshutup@discuss.online 13 points 1 year ago (1 children)

It seems like an obvious flaw that's pretty simple to explain. The car is trained to interpret collision information at a set height, so the opening between the wheels of a truck's trailer can end up treated as free space. It's a rare situation, but if it's confirmed and reproducible, it at least raises the concern of how many other glitches drivers will only learn about by surprise.
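A toy sketch of the failure mode described above: a perception filter that only flags returns inside a tuned height band can classify the gap under a raised trailer bed as drivable. This is purely illustrative; it is not Tesla's actual code, and every threshold and number here is invented.

```python
# Toy illustration of a height-band obstacle filter misclassifying
# the open space under a trailer as free. All numbers are invented.

def is_obstacle(detection_height_m, sensor_band=(0.3, 1.2)):
    """Naive filter: only flags sensor returns that fall inside the
    height band the detector was tuned for."""
    low, high = sensor_band
    return low <= detection_height_m <= high

# A sedan's body sits inside the band -> correctly flagged.
print(is_obstacle(0.8))  # True

# A raised trailer bed sits above the band; the space beneath it
# produces no return inside the band -> treated as free space.
print(is_obstacle(1.5))  # False: the gap looks drivable
```

The point is not the specific numbers but the shape of the bug: any hard cutoff in the perception pipeline creates a class of real obstacles that are invisible by construction.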

[–] sugartits@lemmy.world 10 points 1 year ago (2 children)

The second trial, set for early October in a Florida state court, arose out of a 2019 crash north of Miami where owner Stephen Banner’s Model 3 drove under the trailer of an 18-wheeler big rig truck that had pulled into the road, shearing off the Tesla's roof and killing Banner. Autopilot failed to brake, steer or do anything to avoid the collision, according to the lawsuit filed by Banner's wife.

Is this the guy who was literally paying no attention to the road at all and was watching a movie whilst the car was in motion?

I legit can't find information on it now as every result I can find online is word for word identical to that small snippet. Such is modern journalism.

I know people like to get a hard on with the word "autopilot", but even real pilots with real autopilot still need to "keep an eye on things" when the system is engaged. This is why we have two humans in the cockpit on those big commercial jets.

[–] ephemeral_gibbon@aussie.zone 28 points 1 year ago (11 children)

The way Musk marketed it was as a "self driving" feature, not a driving assist. Yes, with all current smart assists you need to be carefully watching what it's doing, but that's not what it was made out to be. Because of that I'd still say Tesla is responsible.

[–] Auli@lemmy.ca 3 points 1 year ago* (last edited 1 year ago)

There are also two pilots, because they know people are people. And then don't brand it as "self driving" and "full self driving".

[–] autotldr@lemmings.world 3 points 1 year ago

This is the best summary I could come up with:


SAN FRANCISCO, Aug 28 (Reuters) - Tesla Inc (TSLA.O) is set to defend itself for the first time at trial against allegations that failure of its Autopilot driver assistant feature led to death, in what will likely be a major test of Chief Executive Elon Musk's assertions about the technology.

Self-driving capability is central to Tesla’s financial future, according to Musk, whose own reputation as an engineering leader is being challenged with allegations by plaintiffs in one of two lawsuits that he personally leads the group behind technology that failed.

The first, scheduled for mid-September in a California state court, is a civil lawsuit containing allegations that the Autopilot system caused owner Micah Lee’s Model 3 to suddenly veer off a highway east of Los Angeles at 65 miles per hour, strike a palm tree and burst into flames, all in the span of seconds.

Banner’s attorneys, for instance, argue in a pretrial court filing that internal emails show Musk is the Autopilot team's "de facto leader".

Tesla won a bellwether trial in Los Angeles in April with a strategy of saying that it tells drivers that its technology requires human monitoring, despite the "Autopilot" and "Full Self-Driving" names.

In one deposition, former executive Christopher Moore testified there are limitations to Autopilot, saying it "is not designed to detect every possible hazard or every possible obstacle or vehicle that could be on the road," according to a transcript reviewed by Reuters.


The original article contains 986 words, the summary contains 241 words. Saved 76%. I'm a bot and I'm open source!

[–] tmRgwnM9b87eJUPq@lemmy.world 3 points 1 year ago* (last edited 1 year ago) (3 children)

~~Although it’s far from perfect, autopilot gets into a lot less accidents per mile than drivers without autopilot.~~

~~They have some statistics here:~~ https://www.tesla.com/VehicleSafetyReport

EDIT: As pointed out by commenters in this thread, autopilot is mainly used on highways, whereas the crash average is for all roads. Also, Tesla only counts a crash if the airbag was deployed, but the numbers they compared against count every crash, including the ones without deployed airbags.
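The selection effect mentioned in the edit is easy to show with made-up numbers: if Autopilot miles are mostly highway miles, comparing its crash rate against an all-roads fleet average flatters it even when it is no safer than a human on the same roads. None of these figures come from Tesla or any regulator; they are chosen only to show the arithmetic.

```python
# Made-up crash rates per million miles to illustrate the
# highway-vs-all-roads selection effect.
crashes_per_million_miles = {"highway": 0.5, "city": 2.0}

# Fleet average across all roads (assume a 50/50 mile split):
fleet_avg = 0.5 * crashes_per_million_miles["highway"] + \
            0.5 * crashes_per_million_miles["city"]

# Autopilot used only on highways, performing exactly at the
# human highway rate (i.e. no safety advantage at all):
autopilot_rate = crashes_per_million_miles["highway"]

# The naive comparison still makes Autopilot look 2.5x safer:
print(fleet_avg / autopilot_rate)  # 2.5
```

Any honest comparison would need to match road type (and crash-counting criteria) on both sides of the ratio.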

[–] silvercove@lemdro.id 12 points 1 year ago

Why should we trust any numbers that comes from Tesla?

[–] MoonlitSanguine@lemmy.one 5 points 1 year ago (6 children)

Do you have statistics not by Tesla?

[–] RecallMadness@lemmy.nz 2 points 1 year ago* (last edited 1 year ago) (4 children)

And when autopilot is at fault for an accident or fatality, who should be held responsible?

Just because it's better shouldn't absolve them of responsibility when it fails.

[–] tslnox@reddthat.com 3 points 1 year ago (7 children)

I can't understand how anyone is even able to let the car do something on its own. I drive an old Dacia Logan and a Renault Scénic, but at work we have a Škoda Karoq, and I can't even fully trust its beeping backing sensors or automatic handbrake. I can't imagine the car steering, accelerating, or braking without me telling it to.

[–] Cethin@lemmy.zip 3 points 1 year ago* (last edited 1 year ago) (2 children)

I think it's fine at the level where you are there and ready to take control, but you need to be paying attention still. Humans aren't flawless and we shouldn't expect our automated systems to be either. This doesn't excuse Tesla, because they've been marketing it as something it's not for a long time now. They're driver assist features, not self driving features. It can keep you in a lane and maintain speed well, but you shouldn't fully trust it. If it's better than humans at some tasks, it should be used for those regardless of if it will fail at it sometimes. People shouldn't be lied to and convinced it's more than it is though.

[–] limelight79@lemm.ee 2 points 1 year ago

I actually think that the less a driver has to do, the worse they'll be at reacting when a situation does come up.

If I'm actually driving and someone, say, runs out in front of me, I'll slam on the brakes. I've had this happen, actually - it was scary as hell because my brain froze up, but...fortunately for us and the guy, my foot still knew what to do, and we stopped in time.

But if I'm sitting in the seat, just monitoring, not actively doing something, my attention is much more likely to wander, and when that incident happens, my reaction time is likely going to be a LOT slower, because I have to "mode shift" back into operating a car, whereas I was already in that mode in the incident above. I don't think the manufacturers are adequately considering this factor.

(I recognize this might not be a perfect example with automatic brakes, but I think the point is clear.)
