this post was submitted on 03 Sep 2023
140 points (86.5% liked)

Technology

you are viewing a single comment's thread
[–] skymtf@lemmy.blahaj.zone 61 points 1 year ago (7 children)

I feel like the NTSB needs to draft a minimum spec for self-driving cars, plus a testing course that covers some of the worst circumstances, before they get approved. I feel like all self-driving cars should have to have lidar and other sensors. Computer vision really isn't working out.

[–] echo64@lemmy.world 42 points 1 year ago (4 children)

You build a benchmark and Tesla will train on that benchmark. It says nothing about real-world use, but it gets them signed off.

But yes western society is currently in a hellscape of refusing to do even basic regulation of any new technology so it'll probably be a good 20 years of murder robots on the streets before anything gets written down.

[–] FoxBJK@midwest.social 23 points 1 year ago

By “western society” do you mean the US? Because the EU doesn’t seem to have any qualms about regulating new technologies. That seems to be a uniquely American thing.

[–] ayaya@lemdro.id 14 points 1 year ago* (last edited 1 year ago)

To be fair, we already have giant metal murder boxes zooming around on the streets. If AI kills even a single person, everyone flips out, even though over 40,000 people die every year in the US from car accidents. And that is just the deaths, not counting injuries. Yet I don't really see anyone calling for more rigorous driving tests for humans.

People want AI to somehow be perfect, when in reality, as long as AI is even 1% better than humans, that's over 400 lives saved per year. AI doesn't get sleepy, distracted, drunk, etc., so it probably already is at least 1% better in most situations. Humans are horrible drivers.
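The arithmetic behind that claim, as a quick sanity check (using the approximate US fatality figure cited above; the 1% improvement is the commenter's hypothetical, not a measured number):

```python
us_annual_road_deaths = 40_000  # approximate US figure cited above
improvement = 0.01              # hypothetical: AI is 1% safer than human drivers

# Lives saved per year under that assumption
lives_saved = us_annual_road_deaths * improvement
print(int(lives_saved))  # 400
```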

But yes western society is currently in a hellscape of refusing to do even basic regulation

US regulations are only written in blood or money. The United States was built on the backs of slaves, and then wage-slaves; there are literal graveyards filled with workers.

I'm not disagreeing with you, I just found this comically at odds with history... i.e., it's always been a regulation hellscape.

[–] NeoNachtwaechter@lemmy.world 3 points 1 year ago

But yes western society is currently in a hellscape of refusing to do even basic regulation

Only the US.

We Europeans have been scratching our heads for a long time now: why are they letting these guys do whatever they want?

[–] SuperSleuth@lemm.ee 7 points 1 year ago (5 children)

Should a self-driving car face more rigorous tests than actual human drivers? Honest question.

[–] optissima@lemmy.world 16 points 1 year ago (1 children)
[–] stopthatgirl7@kbin.social 7 points 1 year ago

Yes, because when a person is driving, you usually know exactly who is legally to blame in an accident. With self-driving, if the car hits and kills someone, who do you charge? There's no single person you can hold responsible when something goes wrong, the way you can with a human driver.

[–] FoxBJK@midwest.social 10 points 1 year ago

Human drivers should be facing more rigorous testing regardless. It’s horrifically easy to get a license… and then they never test you again for the rest of your life. That’s just insane when you think about it. My test was in 2002. Feels like I should have to retake it at some point.

[–] IphtashuFitz@lemmy.world 9 points 1 year ago (1 children)

Yes. A human brain can handle edge cases it’s never encountered before. Can a self driving car?

  • Ever stop at a red light only to have a police officer wave you through?

  • Ever encounter a car driving the wrong way down a one way street?

  • Ever come across a flooded-out stretch of road? (If the road has no lane markings and the water is still, it can look very deceptive.)

These are a tiny number of things I’ve encountered over the past few years. I’m sure plenty of other drivers can provide other good examples. I’d want to know how a self driving car would handle itself in situations like these.

[–] TopShelfVanilla@sh.itjust.works 1 points 1 year ago (1 children)

How will the bot car handle itself out in the country? Dirt roads? Deer? Roadblock checkpoints full of bored, mean-spirited cops?

[–] NeoNachtwaechter@lemmy.world 1 points 1 year ago* (last edited 1 year ago)

How will the bot car handle itself out in the country? Dirt roads?

They don't go there. They have their limits. Simple as that.

But when the police order them there (for example, when the good road has to be cleared because of an emergency), the trouble starts... now imagine not just one or two of them, but hundreds.

[–] snooggums@kbin.social 5 points 1 year ago

Yes, because each person must learn on their own and has limited experience relative to the general public as a whole.

Self-driving cars can 'learn' from all self-driving cars and don't get tired, forget, or anything like that. While they shouldn't be held to perfection, they should absolutely be held to a higher standard than a human.

[–] nxfsi@lemmy.world 2 points 1 year ago (2 children)

Only Tesla self-driving cars need more rigorous tests. Other brands are fine as they are because they have lidar.

[–] skymtf@lemmy.blahaj.zone -1 points 1 year ago

I feel like they all do. Have you seen Waymo nearly getting Black people killed because it didn't stop for a cop? And it can't recognize construction zones.

[–] sky@codesink.io -1 points 1 year ago

Five LiDAR sensors haven't stopped Cruise from running into a bus, multiple cars, and a fire truck. Maybe self-driving is a myth?

Maybe we should just build buses and trains and pay people good salaries to operate them??

[–] Cheers@sh.itjust.works 3 points 1 year ago

Throw in some potholes, a child pedestrian crossing the street, etc., and they'd even come out with a powerful marketing ad.

[–] markr@lemmy.world 3 points 1 year ago

Everyone would build to pass the test track. This does get at the problem though: the number of scenario permutations an L5 system has to correctly handle is huge. Building a system that can do that appears to be beyond anyone's AV stack right now. This is why the most advanced deployments are all geofenced: that way, at least the traffic signs and signals, lane markings, etc. are all understood and tested. Even then, 'shit happens' and untested scenarios still occur. Also, the maps are always out of date.

The problem really requires AGI, and nobody has one of those, or if they do it’s a secret.

[–] Chickenstalker@lemmy.world 2 points 1 year ago

AND have triplicate backup systems that run in parallel.

[–] soEZ@lemmy.world 2 points 1 year ago

It's not really a sensor issue so much as a software issue: you need software that can interpret the sensor data and act on it. Cameras and lidar effectively provide the same thing, distances to objects in 2D/3D. But you need software to process that data and identify where the road is, where little Johnny is, and what to do. Arguably, the distance-measuring problem has been solved for a while, with lidar or with cameras; it's object identification, and reacting to that information, that isn't solved. You can't really solve it with traditional if/else programming, and AI only gives you a probability of what something is or what action to take... so the problem is hard.

But the NTSB/DMV or whoever needs to come up with a way to test and classify autonomous-driving software, probably by doing real-world tests and identifying the edge cases where it fails.
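To illustrate the if/else-versus-probability point above, here's a toy sketch. Everything in it is hypothetical (the function, labels, and thresholds are invented for illustration and don't correspond to any real AV stack): a perception model returns only a class label with a confidence, so the planning layer has to act on thresholds and conservative defaults rather than on a certain answer.

```python
# Toy sketch: a detector returns (label, confidence); the planner
# must choose an action from probabilistic, possibly wrong, input.
def plan_action(detection):
    label, confidence = detection
    if label == "pedestrian" and confidence >= 0.3:
        return "brake"          # low bar on purpose: err toward caution
    if label == "road" and confidence >= 0.9:
        return "proceed"        # high bar before committing to motion
    return "slow_and_reassess"  # uncertain input: conservative default

print(plan_action(("pedestrian", 0.45)))  # brake
print(plan_action(("road", 0.50)))        # slow_and_reassess
```

The hard part this glosses over is exactly what the comment says: picking thresholds that are safe across every edge case is where real systems struggle.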

[–] DarthBueller@lemmy.world -1 points 1 year ago* (last edited 1 year ago)

bUT thAt WOuLD StiFle proDucT iNnoVatIoN!!!