this post was submitted on 03 Sep 2023
155 points (100.0% liked)

Technology

[–] HappyMeatbag@beehaw.org 55 points 1 year ago (2 children)

Those damn things are not ready to be used on public roads. Allowing them is one of the more prominent examples of corruption that we’ve seen recently.

[–] Thorny_Thicket@sopuli.xyz 33 points 1 year ago* (last edited 1 year ago) (14 children)

Statistically they're still less prone to accidents than human drivers.

I never quite understood why so many people seem to be against autonomous vehicles, especially on Lemmy. It's unreasonable to demand perfection before any of these are used on public roads. In my view the bar to reach is human-level driving, and after that it seems quite obvious that from a safety point of view it's the better choice.

[–] evilviper@beehaw.org 46 points 1 year ago* (last edited 1 year ago) (3 children)

This is just such a bad take, and it's so disappointing to see it parroted all over the web. So many things are just completely inaccurate about these "statistics", and it's probably why it "seems" so many are against autonomous vehicles.

  1. These are self-reported statistics coming from the very companies that have extremely vested interests in making themselves look good.
  2. These statistics are for vehicles that are currently being used in extremely small (and geo-fenced) locations, picked for being the easiest to navigate while still letting the companies say "hey, we totally work in a big city with lots of people".
     • These cars don't even go onto highways or areas where accidents are more likely.
     • These cars drive so defensively they literally shut down so as to avoid causing any accidents (hey, who cares if we block traffic and cause jams, because we get to juice our numbers).
  3. They always use total human-driven miles, which is a complete apples-to-oranges comparison. Their miles aren't being driven:
     • In bad weather
     • On dangerous, winding, old, unpaved, or otherwise poor roads
     • In rural areas where deer and other animals wander into the road and cause accidents
  4. They don't adjust for or use any median numbers: I'm not interested in them driving better than the "average" driver when that average includes DUIs, crashes caused by neglect or improper maintenance, reckless drivers, elderly drivers, and the fast-and-furious types crashing their vehicles on some hill-climb course.
  5. And that's all just off the top of my head.

So no, I would absolutely not say they are "less prone to accidents than human drivers". And that's just the statistics, to say nothing about the legality that will come up. Especially given just how averse companies seem to be to admitting fault for anything.
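
The mean-vs-median point is easy to illustrate with a toy calculation (made-up numbers, purely to show the skew):

```python
# Toy illustration (invented numbers): a handful of high-risk drivers
# can drag the *average* human crash rate far above the *median*,
# making "better than the average driver" a surprisingly low bar.
crashes_per_100k_miles = [0.2, 0.2, 0.3, 0.3, 0.4, 0.4, 0.5, 9.0, 12.0]

mean = sum(crashes_per_100k_miles) / len(crashes_per_100k_miles)
ordered = sorted(crashes_per_100k_miles)
median = ordered[len(ordered) // 2]  # middle element of an odd-length list

print(f"mean:   {mean:.2f}")    # pulled up by the two outlier drivers
print(f"median: {median:.2f}")  # what a typical driver looks like
```

Beating the mean here mostly means beating the two outliers; a typical sober, attentive driver is several times safer than the "average" suggests.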

[–] Kleinbonum@feddit.de 13 points 1 year ago (6 children)

These cars don't even go onto highways or areas where accidents are more likely.

Accidents are less likely on highways. Most accidents occur in urban settings. Most deadly accidents occur outside of cities, off-highway.

[–] abhibeckert@beehaw.org 6 points 1 year ago* (last edited 1 year ago)

Avoiding dangerous scenarios is the definition of driving safely.

This technology is still under active development, and nobody (not even Elon!) is claiming this stuff is ready to replace a human in every possible scenario. Are you actually suggesting they should be testing the cars in scenarios they know wouldn't be safe with the current technology? Why the fuck would they do that?

So no, I would absolutely not say they are “less prone to accidents than human drivers”.

OK... if you won't accept the company's reported data - whose data will you accept? Do you have a more reliable source that contradicts what the companies themselves have published?

to say nothing about the legality that will come up

No, that's a non-issue. When a human driver runs over a pedestrian/etc and causes a serious injury, if it's a civilised country and a sensible driver, then an insurance company will pay the bill. This happens about a million times a week worldwide, and insurance is a well-established system that people are, for the most part, happy with.

Autonomous vehicles are also covered by insurance. In fact it's another area where they're better than humans - because humans frequently fail to pay their insurance bill or even deliberately drive after they have been ordered by a judge not to drive (which obviously voids their insurance policy).

There have been debates over who will pay the insurance premium, but that seems pretty silly to me. Obviously the human who ordered the car to drive them somewhere will have to pay for all costs involved in the drive. And part of that will be insurance.

[–] Thorny_Thicket@sopuli.xyz 5 points 1 year ago* (last edited 1 year ago) (2 children)

Well hey - at least I provided some statistics to back me up. That's not the case with the people refuting those stats.

[–] lloram239@feddit.de 24 points 1 year ago* (last edited 1 year ago) (2 children)

I never quite undestood why so many people seem to be against autonomous vehicles.

People aren't against autonomous vehicles, but against them being let loose on public roads with zero checks or transparency. We basically learn what they are and aren't capable of one crash at a time, when all of that should have been figured out years ago in the lab.

The fact that they can put a safety driver in them to absorb any blame is another scandal.

Statistically they’re still less prone to accidents than human drivers.

That's only due to them not driving in the same condition as humans. Let them drive in fog and suddenly they can't even see clearly visible emergency vehicles.

None of this would be a problem if those companies would be transparent about what those vehicles are capable of and how they react in unusual situations. All of which they should have tested a million times over in simulation already.

[–] Thorny_Thicket@sopuli.xyz 9 points 1 year ago (12 children)

With Tesla the complaint is that the statistics are almost all highway miles, so they don't represent the most challenging conditions, which is driving in a city. Cruise then drives exclusively in a city, and yet that isn't good enough either. The AV-sceptics are really hard to please.

You'll always be able to find individual incidents where these systems fail. They're never going to be foolproof and the more of them that are out there the more news like this you're going to see. If we reported about human-caused crashes with the same enthusiasm that would be all the news you're hearing from then on and letting humans drive would seem like the most scandalous thing imaginable.

[–] abhibeckert@beehaw.org 6 points 1 year ago* (last edited 1 year ago) (1 children)

Let them drive in fog and suddenly they can’t even see clearly visible emergency vehicles.

That article you linked isn't about a self-driving car. It's about Tesla "autopilot", which constantly checks that a human is actively holding the steering wheel and depends on the human watching the road ahead for hazards so they can take over instantly. If the human sees flashing lights, they are supposed to take over.

The fully autonomous cars that don't need a human behind the wheel have much better sensors which can see through fog.

[–] lloram239@feddit.de 4 points 1 year ago (1 children)

That article you linked isn’t about self driving car.

Just because Tesla is worse than others doesn't make it not self-driving. The "wiggle the steering wheel" feature is little more than a way to shift blame to the driver instead of the crappy self-driving software.

so they can take over instantly.

Humans fundamentally can't do that. If you sit a human in a self-driving car doing nothing for hours, they won't be able to react in a split second when it is needed. Sharing driving in that way does not work.

The fully autonomous cars that don’t need a human behind the wheel have much better sensors which can see through fog.

Is anybody actively testing them in bad weather conditions? Or are we just blindly trusting claims from the manufacturers yet again?

[–] abhibeckert@beehaw.org 5 points 1 year ago* (last edited 1 year ago)

Just because Tesla is worse than others doesn’t make it not self-driving.

The fact that Tesla requires a human driver to take over constantly makes it not self-driving.

so they can take over instantly.

Humans fundamentally can’t do that. If you sit a human in a self-driving car doing nothing for hours, they won’t be able to react in a split second when it is needed.

The human isn't supposed to be "doing nothing". The human is supposed to be driving the car. Autopilot simply keeps the car in the correct lane for you and adjusts the speed to match the car ahead.

Tesla's system won't even stop at an intersection if you need to give way (for example, at a stop sign or a red traffic light). There's plenty the human needs to be doing other than turning the steering wheel. If there is a vehicle stopped in the middle of the road, Tesla's system will drive straight into it at full speed without even touching the brakes. That's not something that "might happen"; it's something that will happen, and has happened, any time a stationary vehicle is parked on the road. It can detect the car ahead of you slowing down. It cannot detect a stopped vehicle.

They've promised to ship a more capable system "soon" for over a decade. I don't see any evidence that it's actually close to shipping though. The autonomous systems by other manufacturers are significantly more advanced. They shouldn't be compared to Tesla at all.

Is anybody actively testing them in bad weather conditions?

Yes. Tens of millions of miles of testing, and they pay especially close attention to any situations where the sensors could potentially fail. Waymo says their biggest challenge is mud (splashed up from other cars) covering the sensors. But the cars are able to detect this, and the mud can be wiped off. It's a solvable problem.

Unlike Tesla, most of the other manufacturers consider this a research project and are focusing all of their efforts on making the technology better/safer/etc. They're not making empty promises and they're being cautious.

On top of the millions of miles of actual testing, they also record all the sensor data for those miles and use it to run updated versions of the algorithm in exactly the same scenario. So the millions of miles have, in fact, been driven thousands and thousands of times over for each iteration of their software.
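
A rough sketch of that replay idea (hypothetical names; real pipelines are vastly more involved, this only shows the shape of the technique):

```python
# Minimal sketch of log-replay regression testing: recorded sensor
# frames are fed to a new build of the driving policy, and its
# decisions are compared against what was logged on the real drive.
from dataclasses import dataclass

@dataclass
class Frame:
    sensors: dict       # camera/lidar/radar readings for one tick
    logged_action: str  # what the deployed software did at this tick

def replay(log: list, policy) -> float:
    """Fraction of frames where the new policy agrees with the
    behaviour that was recorded on the road."""
    matches = sum(policy(f.sensors) == f.logged_action for f in log)
    return matches / len(log)

# A trivial stand-in policy: brake whenever an obstacle is reported.
def cautious_policy(sensors: dict) -> str:
    return "brake" if sensors.get("obstacle") else "cruise"

log = [
    Frame({"obstacle": False}, "cruise"),
    Frame({"obstacle": True}, "brake"),
    Frame({"obstacle": False}, "cruise"),
]
print(f"agreement: {replay(log, cautious_policy):.0%}")
```

Because the logged frames are fixed, every software iteration can be scored against the exact same drives, which is what lets a few million real miles stand in for many more simulated ones.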

[–] dsemy@lemm.ee 19 points 1 year ago (7 children)

You don’t understand why people on Lemmy, an alternative platform not controlled by corporations, might not want to get in a car literally controlled by a corporation?

I can easily see a future where your car locks you in and drives you to a police station if you do something “bad”.

As to their safety, I don't think there are enough AVs to really judge this yet; of course Cruise's website will claim Cruise AVs cause fewer accidents.

[–] IWantToFuckSpez@kbin.social 6 points 1 year ago (2 children)

I can imagine a future with gridlock in front of the police station: AV cars full of black people, because the cops sent out an APB with the description of a black suspect.

We’ve seen plenty of racist AI programs in the past because the programmers, intentionally or not, added their own bias into the training data.

[–] lol3droflxp@kbin.social 4 points 1 year ago

Any dataset sourced from human activity (e.g. internet text, as in ChatGPT) will always contain current societal biases.

[–] Thorny_Thicket@sopuli.xyz 5 points 1 year ago (4 children)

You're putting words in my mouth. I wasn't talking about people on Lemmy not wanting to get into one of these vehicles.

The people here don't seem to want anyone getting into these vehicles. Many are advocating an all-out ban on self-driving cars and demand that they be polished to near perfection on closed roads before being allowed for public use, even though the little statistics we already have mostly indicate they are at worst as good as human drivers.

If it's about Teslas, the complaint is often the lack of LiDAR and radar; when it's about Cruise, which has both, it's then apparently about corruption. In both cases the reaction tends to be mostly emotional, and that's why every time one provides statistics to back up the claims about safety it just gets called marketing bullshit.

[–] rikudou@lemmings.world 9 points 1 year ago (16 children)

Fine by me, as long as the companies making the cars take all responsibility for accidents. Which, you know, the human drivers do.

But the car companies want to sell you their shitty autonomous driving software and make you be responsible.

If they don't trust it enough, why should I?

[–] Turun@feddit.de 7 points 1 year ago (2 children)

I'm not gonna join in the discussion, but if you cite numbers, please don't link to the advertising website of the company itself. They have a strong interest in cherry picking the data to make positive claims.

[–] baggins@beehaw.org 7 points 1 year ago (3 children)

They can't come quick enough for me. I could go to work after a night out without fear I might still be over the limit. I wouldn't have to drive my wife everywhere. Old people would not be prisoners in their own homes. No more nobheads driving about with exhausts that sound like a shoot-out with the cops. No more arseholes speeding about and cutting you up. No more hit-and-runs. Traffic accident numbers falling through the floor. In fact it could even get to a point where the only accidents are the fault of pedestrians/cyclists not looking where they are going.

[–] nous@programming.dev 9 points 1 year ago

All of these are solved by better public transport, safe bike routes, and more walkable city design. All of which we can do now, without relying on some shiny new tech just to keep car companies' profits up.

[–] Thorny_Thicket@sopuli.xyz 4 points 1 year ago (2 children)

The possibilities really are endless.

When the light turns green, the entire row of cars can start moving at the same time, like in motorsports. Perhaps you don't even need traffic lights, because they can all just drive into the intersection at the same time and keep barely missing each other but never crash, thanks to the superior reaction times and processing speeds of computers. You could also let your car taxi other people around when you don't need it.

[–] baggins@beehaw.org 4 points 1 year ago (4 children)

I think you might need lights for pedestrians at crossings.

I did wonder if ambulances would need sirens but again, pedestrians!


What if we tied that entire row of cars together as one unit so we could save the cost of putting high-end computers in each car? Give them their own dedicated lane, because we will never have 100% fully autonomous cars on the road unless we make human drivers illegal.

I'll call my invention a train.

[–] upstream@beehaw.org 6 points 1 year ago (7 children)

I saw a video years ago discussing this topic.

How good is “good enough” for self-driving cars?

The bar is much higher than it is for human drivers, because we downplay our own shortcomings and assume we face less risk than the average driver does.

Humans can be good drivers, sure. But we have serious attention deficits. This means it doesn’t take a big distraction before we blow a red light or fail to observe a pedestrian.

Hell, a lot of humans fail to observe and yield to emergency vehicles as well.

None of that is newsworthy, but an autonomous vehicle failing to yield is.

My personal opinion is that the Cruise vehicles are about as ready for operational use as Tesla's FSD, i.e. they should not be allowed.

Obviously corporations will push to be allowed to operate so they can start making money, but that push is probably also the biggest threat to a self-driving future: a backlash could leave the technology regulated so strongly that humans end up in the driver's seat for another few decades, with the cost in human lives that involves.

[–] gelberhut@lemdro.id 9 points 1 year ago* (last edited 1 year ago) (3 children)

Are you talking about AVs, or about human drivers, who drive drunk, overtired, after a bad night, emotionally, while texting, etc.?

[–] sub_@beehaw.org 28 points 1 year ago* (last edited 1 year ago) (1 children)

From what I've read, some of these driverless-car companies in the US are releasing their fleets, flooding the streets 24/7. Some of the cars take up parking spaces, cause traffic jams, or just stall in the middle of the road.

Maybe it's different in Europe, where there's stricter regulation; from the comments here, many who are okay with driverless cars seem to be from European countries. Unless you own stock in those companies, in which case there's an incentive-caused bias.

Just like drugs need to go through multiple clinical trials before reaching the mass market, I believe that if you want driverless vehicles, a lot of testing is needed.

But this is not a testing / data-gathering phase: Cruise has 300 cars at night and 100 during the day in SF, while Waymo has around 250 cars. Again, this is not a testing phase; there's no driver to safeguard in case things go wrong. These are actual driverless taxis that charge people.

The main rationale of these companies is not to bring a safer environment with driverless cars; the main rationale is to get rid of the gig workers who cause problems for Uber or Lyft, problems such as demanding a living wage, proper employment status, unions, etc.

If you want to look at a better approach, maybe look at how Singapore is doing it:

  • it's operated by SMRT and SBS, bus operators regulated and owned by the government
  • it's a self-driving bus
  • "drivers will remain essential to the operation of autonomous vehicles even when these do take off, although their job scope will change"

So if you wanna support something, maybe don't support what Cruise is doing, but more of what Singapore is doing:

  • it's still highly regulated
  • it's a bus, it's public transportation, so it still helps in tackling climate issues
  • it's not being used to fire workers
  • there's still a failsafe: the drivers are on standby in case the bus goes haywire
[–] jon@lemmy.tf 20 points 1 year ago (1 children)

Maybe don't allow autonomous cars on public streets then? The tech is nowhere near ready for prime time.

[–] abhibeckert@beehaw.org 18 points 1 year ago* (last edited 1 year ago) (1 children)

We should ban police cars too - because allegedly an empty police car was also blocking the ambulance.

The AV spokesperson said they reviewed the footage and found there was room to pass their vehicle safely, and that another ambulance and other cars did so.

[–] ryan@the.coolest.zone 17 points 1 year ago (1 children)

When these things were originally being tested, at least the Waymo ones I'm familiar with, there was a driver who could manually override in case of issues. Honestly, if these things still have issues with emergency situations (and other unexpected situations), they absolutely still need a driver with the ability to manually override the car. That way, they can still test the self-driving function while being able to actually maneuver the car out of the way of things like this.

[–] Sightline@lemmy.ml 8 points 1 year ago

Don't worry, they'll continue to fail upwards.

[–] kitonthenet@kbin.social 17 points 1 year ago

These people never should’ve been allowed to beta test with our lives when no one approved it

[–] EquipLordBritish@beehaw.org 9 points 1 year ago (1 children)

Two autonomous Cruise vehicles and an empty San Francisco police vehicle were blocking the only exits from the scene, according to one of the reports, forcing the ambulance to wait while first responders attempted to manually move the Cruise vehicles or **locate an officer who could move the police car**.

So, in conjunction with a cop car, the road was blocked. I'd love to see an actual picture or diagram of the blockage.

[–] jarfil@beehaw.org 5 points 1 year ago* (last edited 1 year ago)

These AVs are programmed to give high priority to police cars, ambulances, road works, and whatnot. They're also happy to interpret what they see in the strictest way possible.

IIRC, there was a YouTube video of one of them going crazy because of a traffic cone... then running away from the operator when they tried to override and correct what it was doing.

It could be as little as cops leaving the car "somewhat" blocking the normal flow of traffic, then the Cruise cars strictly obeying "pull over and wait", while someone with more common sense might've reversed, gone onto the curb, or whatever.

Then again:

Cruise spokesperson Tiffany Testo countered that one of the cars cleared the scene and that traffic to the right of it remained unblocked. “The ambulance behind the AV had a clear path to pass the AV as other vehicles, including another ambulance, proceeded to do,”

...it could've been the "blocked" ambulance's driver who was on autopilot?

Seems like not enough data to draw a conclusion.

[–] aeternum@kbin.social 9 points 1 year ago (1 children)

I thought this meant Tom Cruise lol.

[–] cicapocok@lemm.ee 8 points 1 year ago

Obviously it is a sad story for the deceased and their family, but according to the Cruise spokesperson there was supposed to be enough space for the emergency vehicle to pass. And later the article mentions 55 more situations where these cars caused problems. Well, there are car accidents everywhere in the world every day because of careless drivers, so this is kinda common. So I really don't think banning these cars is the answer; the answer is to keep improving them.

[–] CanadaPlus@lemmy.sdf.org 7 points 1 year ago (1 children)

I don't get it, why isn't there an option for a Cruise employee or a first responder to just take control of the thing when it gets stuck?

[–] abhibeckert@beehaw.org 11 points 1 year ago* (last edited 1 year ago) (2 children)

Drive to the right edge of the road and stop until the emergency vehicle(s) have passed

That is a direct quote from the California DMV and from the sounds of it that's exactly what the autonomous car did.

The right answer, in my opinion, is to allow the first responders to take control of the car. This wasn't just a lone ambulance that happened upon a stationary car. It was a major crash (where a human driven car ran over a pedestrian) with a road that was blocked by emergency vehicles. A whole bunch of cars, not just autonomous ones, were stopped in the middle of the road waiting for the emergency to be over so they could continue on their way. Not sure why only this one car is getting all the blame.

[–] CanadaPlus@lemmy.sdf.org 10 points 1 year ago* (last edited 1 year ago)

I just actually bothered to read the article, and it sounds like it was an empty police car blocking the way between two Cruise cars that had pulled over leaving a space, and there was in fact a way to manually move them, but it took critical time.

These cars get stuck all the time and are a major local controversy, so I'm guessing this was the click-baitiest headline they could go with. "Police officer carelessly gets in the way of paramedics" just doesn't have the same ring.

[–] amju_wolf@pawb.social 9 points 1 year ago

Not sure why only this one car is getting all the blame.

Because it generates clicks.
