this post was submitted on 22 Nov 2023
352 points (97.8% liked)


Judge finds ‘reasonable evidence’ Tesla knew self-driving tech was defective: Ruling clears way for lawsuit brought against company over fatal crash in 2019 in which Stephen Banner was killed near Miami

[–] fmstrat@lemmy.nowsci.com 7 points 10 months ago* (last edited 10 months ago) (2 children)

With two offset cameras, depth is reliable, especially when using an offset wide-angle and narrow-angle lens pair. This is what OpenPilot does with the Comma 3 (FOSS self-driving); there's a rough sketch of the idea below.

Radar is better, but some automotive radar only seems to perform well at short range (from my experience with my fork of OP in combination with the radar built into a vehicle).
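
For anyone curious, the core idea is plain triangulation between the two views. Here's a rough Python/OpenCV depth-from-disparity sketch; the focal length, baseline, and file names are made-up placeholders, and this is not OpenPilot's actual pipeline:

```python
# Rough depth-from-disparity sketch, NOT OpenPilot's real pipeline.
# focal_px, baseline_m and the image paths are placeholder assumptions.
import cv2
import numpy as np

focal_px = 910.0     # assumed focal length in pixels
baseline_m = 0.12    # assumed spacing between the two cameras, in metres

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Block matching finds, per pixel, how far a feature shifts between the two views.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0  # fixed-point -> pixels

# Pinhole relation: depth = focal_length * baseline / disparity.
valid = disparity > 0
depth_m = np.zeros_like(disparity)
depth_m[valid] = focal_px * baseline_m / disparity[valid]
```

A wide/narrow lens pair needs more rectification work than a matched pair like this, but the triangulation principle is the same.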

[–] lloram239@feddit.de 7 points 10 months ago* (last edited 10 months ago) (2 children)

depth is reliable

What if one of them is dirty? What if you are driving with the sun right in front of you? What about a foggy winter day? The big problem here isn't even what the cameras are or aren't capable of, but that there is little to no information on all the situations Tesla's autopilot will fail in. We are only really learning that one deadly accident at a time. The documentation of the autopilot's capabilities is extremely lacking; it's little more than "trust me bro" and "keep your hands on the wheel".

The fact that it can't even handle railroad crossings, and thinks trains are a series of trucks and buses that blink in and out of existence and randomly change direction, does not make me want to blindly trust it.


[–] fmstrat@lemmy.nowsci.com 0 points 10 months ago

Do you work in the field? Sun, fog, etc. are all things that can be handled with exposure adjustments (rough sketch below). It's one area where a camera is more versatile than our eyes.

All that being said, my experience is from indirect work on OpenPilot, not Tesla. So it's a system that isn't commonly used by the average person and doesn't make claims of commercial FSD.
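
To give a feel for what I mean by exposure adjustment, here's a toy auto-exposure loop. It's purely illustrative, with made-up thresholds and step size; it is not OpenPilot's or Tesla's actual camera control:

```python
# Toy auto-exposure loop: illustrative only, target and step values are made up.
import numpy as np

def adjust_exposure(frame: np.ndarray, exposure: float,
                    target_mean: float = 110.0, step: float = 0.9) -> float:
    """Nudge the exposure time so the frame's mean brightness tracks a target."""
    mean = frame.mean()            # frame: 8-bit grayscale image, values 0..255
    if mean > target_mean + 15:    # e.g. sun straight ahead: scene blown out
        exposure *= step           # shorten the exposure
    elif mean < target_mean - 15:  # e.g. fog or dusk: scene too dark
        exposure /= step           # lengthen the exposure
    return exposure
```

Real pipelines are fancier (gain control, HDR, multiple exposures per frame), but that's the general knob being turned.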

[–] MeanEYE@lemmy.world 3 points 10 months ago (1 children)

depth is reliable

No, it's not. The world is full of optical illusions that even our powerful brains can't process, yet you expect two webcams to manage it. And depth is not the only thing needed for autonomous driving; distance is an absolute factor. Case in point: at least two motorcyclists were killed because their bikes had two tail lights instead of one, and Tesla's system thought it was a car far away instead of a motorcycle close by. It ran them over as if they were not there. We humans would see the rider and realize it's a motorcycle, first because of the sound and second because our brains are better at reasoning, and we'd avoid the situation. This is why cars MUST have more sensors, because the processing is lacking so much.
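
To put rough numbers on that ambiguity (made-up focal length, not Tesla's actual perception model): going by size and spacing cues alone, a motorcycle's tail lights close by and a car's tail lights far away can project to the same pixel separation.

```python
# Back-of-the-envelope pinhole projection; the focal length is an assumed value.
focal_px = 1000.0  # assumed focal length in pixels

def pixel_separation(light_spacing_m: float, distance_m: float) -> float:
    """Pixel gap between two tail lights as seen by the camera."""
    return focal_px * light_spacing_m / distance_m

print(pixel_separation(0.3, 20))    # motorcycle: lights ~0.3 m apart at 20 m -> 15 px
print(pixel_separation(1.5, 100))   # car: lights ~1.5 m apart at 100 m -> also 15 px
```

At night, with only the lights visible, that's exactly the kind of case where an extra sensor like radar or lidar resolves the distance immediately.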
