this post was submitted on 21 Aug 2023
448 points (94.3% liked)

Tesla knew Autopilot caused death, but didn't fix it::Software's alleged inability to handle cross traffic central to court battle after two road deaths

top 50 comments
[–] dub@lemmy.world 26 points 1 year ago (2 children)

A times B times C equals X… I am Jack's something something something

[–] tool@lemmy.world 23 points 1 year ago (1 children)

A times B times C equals X… I am Jack's something something something

Narrator: A new car built by my company leaves somewhere traveling at 60 mph. The rear differential locks up. The car crashes and burns with everyone trapped inside. Now, should we initiate a recall? Take the number of vehicles in the field, A, multiply by the probable rate of failure, B, multiply by the average out-of-court settlement, C. A times B times C equals X. If X is less than the cost of a recall, we don't do one.

Woman on Plane: Are there a lot of these kinds of accidents?

Narrator: You wouldn't believe.

Woman on Plane: Which car company do you work for?

Narrator: A major one.
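The recall math in the quote is just an expected-cost comparison. A minimal sketch of that logic, with all numbers hypothetical:

```python
# The Fight Club recall formula: recall only if the expected settlement
# payout (A * B * C = X) exceeds the cost of doing a recall.
def should_recall(vehicles_in_field, failure_rate, avg_settlement, recall_cost):
    expected_payout = vehicles_in_field * failure_rate * avg_settlement  # X
    return expected_payout >= recall_cost

# Hypothetical numbers: 1,000,000 cars in the field, 0.01% failure rate,
# $3M average settlement -> $300M expected payout vs a $200M recall.
print(should_recall(1_000_000, 0.0001, 3_000_000, 200_000_000))  # True
```

Same formula, different failure rate, and X drops below the recall cost, so "we don't do one", which is exactly the incentive problem the commenters below are pointing at.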

[–] droans@lemmy.world 6 points 1 year ago

When you're selling a million cars, it's guaranteed that some of them will have a missed defect, no matter how good your QC is.

That's why you have agencies like the NHTSA. You need someone who can decide at what point the issue is a major defect that constitutes a recall.

[–] WhyYesZoidberg@lemmy.world 6 points 1 year ago (1 children)

”One of the major ones”

[–] Lucidlethargy@sh.itjust.works 22 points 1 year ago (1 children)

Didn't, or couldn't? Tesla uses a vastly inferior technology to run their "automated" driving protocols. It's a hardware problem first and foremost.

It's like trying to drive a car with a 720p resolution camera mounted on the license plate holder versus a 4k monitor on the top of the car. That's not a perfect analogy, but it's close enough for those not aware of how cheap these cars and their tech really are.

[–] CmdrShepard@lemmy.one 16 points 1 year ago (1 children)

It remains to be seen what hardware is required for autonomous driving, as no company has a fully functioning system, so there is no baseline to compare against. Cruise (the "4k monitor" in your analogy) just had to cut its fleet of geofenced vehicles after back-to-back crashes involving emergency vehicles, along with blocking traffic and attempting to run over things like fire hoses.

[–] SouthEndSunset@lemm.ee 19 points 1 year ago (1 children)

Of course not. Fixing it would cost money.

https://en.wikipedia.org/wiki/Ford_Pinto

The other day I said on here that self-driving cars aren't ready and got downvoted to fuck. This is why I said it.

[–] CmdrShepard@lemmy.one 7 points 1 year ago* (last edited 1 year ago)

https://www.wardsauto.com/blog/my-somewhat-begrudging-apology-ford-pinto

https://www.latimes.com/archives/la-xpm-1993-02-10-mn-1335-story.html

Two examples of the media creating a frenzy that wound up being proven completely false later.

In OP's case, both of these drivers failed to see a semi crossing the road right in front of them even though they were sitting in the driver's seat with their hands on the wheel. This technology certainly needs improvement, but this is like blaming every auto manufacturer when someone crashes their car while texting on their phone.

[–] stackcheese@lemmy.world 17 points 1 year ago* (last edited 1 year ago)

but I'm saving the planet and making sure Elon gets a cut of my money

[–] skymtf@pricefield.org 10 points 1 year ago (2 children)

I feel like some people are such Tesla fanboys that they will argue when I say Tesla FSD is not real and never has been.

[–] _stranger_@lemmy.world 7 points 1 year ago (2 children)

Probably because calling something "not real" is infuriatingly vague.

Feel free to expand on your position, I actually do want to know what "not real" means in this context.

If you mean, from a semantics perspective, that FULL means it should be a completely independent and autonomous system, bravo, you've made and won the most uninteresting form of that argument.

[–] Liz@midwest.social 8 points 1 year ago (2 children)

I mean, don't call your service something it's not? Words should have meaning? Tesla's Autopilot is very impressive, but it's not fully independent, and that's okay. Honestly if it had an accurate name people wouldn't attack it so much. Other manufacturers are gaining similar capabilities but no one is complaining that their cars aren't perfect either.

[–] CmdrShepard@lemmy.one 2 points 1 year ago (1 children)

Autopilot is an accurate name as it takes over the mundane portions of the task. Airline pilots don't just hit a green button on the dash that says "fly" and the autopilot takes over until they hit a red "land" button. You can argue that people have a misconception about the word but the word itself is correct.

[–] Liz@midwest.social 3 points 1 year ago

It's my understanding that they actually could do that at this point; commercial flying is a controlled and predictable environment compared to driving on the road. Ten years ago I was hearing anecdotes from pilots saying the only things they do are takeoff and landing, and even then the computer could handle it just fine if they let it. Maybe the autopilot in a Cessna sucks, but it's pretty much fully automated in an Airbus.

[–] renohren@partizle.com 1 points 1 year ago* (last edited 1 year ago) (2 children)

Yeah, they should have called it Level 2 autonomous driving, like most other mass-market carmakers do (except Mercedes, which has Level 3 on the roads).

[–] Ocelot@lemmies.world 2 points 1 year ago (4 children)

I have nearly 20k miles on Tesla's FSD platform, and it works amazingly well for something that's "not real". There are countless YouTube channels out there where people will mount a GoPro in their car and go for a drive. Some of them, like AIDRIVR and Whole Mars Catalog, pretty much never take over control of the car, without any drama. Especially in the past ~6 months or so of development, it has been amazing.

[–] gamer@lemm.ee 9 points 1 year ago (3 children)

I remember reading about the ethical question about the hypothetical self-driving car that loses control and can choose to either turn left and kill a child, turn right and kill a crowd of old people, or do nothing and hit a wall, killing the driver. It's a question that doesn't have a right answer, but it must be answered by anybody implementing a self-driving car.

I non-sarcastically feel like Tesla would implement this system by trying to see which option kills the fewest paying Xitter subscribers.
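Mechanically, the dilemma described above is just a minimization over bad outcomes; the controversy is entirely in how the outcomes are scored. A toy sketch (the option names and harm scores are made up for illustration):

```python
# Toy harm-minimizing chooser for the hypothetical no-win scenario.
# The hard part isn't this code -- it's who decides the harm scores.
def least_harm(options):
    """options: mapping of action -> estimated harm score (lower is better)."""
    return min(options, key=options.get)

scenario = {
    "swerve_left": 1,   # hypothetical: one pedestrian
    "swerve_right": 5,  # hypothetical: a crowd
    "stay_course": 2,   # hypothetical: the driver
}
print(least_harm(scenario))  # swerve_left
```

Any implementation, Tesla's or anyone else's, implicitly encodes some such scoring, which is the commenter's point about whose lives get weighted how.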

[–] Liz@midwest.social 7 points 1 year ago

At the very least, they would prioritize the driver, because the driver is likely to buy another Tesla in the future if they do.

[–] Ocelot@lemmies.world 5 points 1 year ago (1 children)

Meanwhile hundreds of people are killed in auto accidents every single day in the US. Even if a self driving car is 1000x safer than a human driver there will still be accidents as long as other humans are also sharing the same road.

[–] Oderus@lemmy.world 5 points 1 year ago (2 children)

When a human is found to be at fault, you can punish them.

With automated driving, who do you punish? The company? Great. They pay a small fine and keep making millions, while your loved one is gone and you get no justice.

[–] CmdrShepard@lemmy.one 7 points 1 year ago

People generally aren't punished for an accident unless they did it intentionally or negligently. The better and more prevalent these systems get, the fewer the families with lost loved ones. Are you really arguing that this is a bad thing because it isn't absolutely perfect and you can't take vengeance on it?

[–] CmdrShepard@lemmy.one 3 points 1 year ago

I think the whole premise is flawed, because the car would have had to suffer numerous failures before ever reaching a point where it would need to make this decision. The question applies to humans because we have free will; a computer does not.

[–] fne8w2ah@lemmy.world 4 points 1 year ago (7 children)

Yet Phoney Stark keeps whinging about the risks of AI while slagging off the humans who actually know their stuff, especially regarding safety.

[–] Nogami@lemmy.world 3 points 1 year ago* (last edited 1 year ago) (1 children)

Calling it Autopilot was always a marketing decision. It's a driver-assistance feature, nothing more. When used as intended, it works great. I drove for 14 hours during a road trip using AP and arrived still alert rather than dead tired. That's awesome, and it would never have happened in a conventional car.

I have the "FSD" beta right now. It has potential, but I still always keep a hand on the wheel and am in control of my car.

At the end of the day, if the car makes a poor choice because of the automation, I'm still responsible as the driver, and I don't want an accident, injury, or death on my conscience.

[–] silvercove@lemdro.id 2 points 1 year ago (1 children)

Tesla has been producing advertisement videos since 2016 saying "our cars drive themselves, you don't need a driver". What you said is worthless when Tesla itself markets their cars like that.
