this post was submitted on 30 Aug 2023
159 points (94.9% liked)

Technology


Elon Musk’s FSD v12 demo includes a near miss at a red light and doxxing Mark Zuckerberg — 45-minute video was meant to demonstrate v12 of Tesla’s Full Self-Driving but ended up being a list of thi...

Elon Musk posted a 45-minute live demonstration of v12 of Tesla’s Full Self-Driving feature. During the video, Musk has to take control of the vehicle after it nearly runs a red light. He also doxxes Mark Zuckerberg.

all 49 comments
[–] Thorny_Thicket@sopuli.xyz 73 points 1 year ago (4 children)

AI DRIVR posted an interesting analysis of v12 on YouTube. Apparently it's completely different from the previous versions: instead of being programmed to understand traffic rules, it learns from videos of people driving, which means it does things like not coming to a full stop at stop signs and driving over the speed limit, like people do too.
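For anyone curious what "learning from videos" means in practice, here's a minimal behavioral-cloning sketch in PyTorch. Everything in it is hypothetical: the toy network, the stand-in data, and the two control outputs just illustrate the end-to-end idea, not Tesla's actual stack.

```python
import torch
import torch.nn as nn

# End-to-end policy: camera frames in, controls out. No hand-coded traffic
# rules anywhere in the loop; behavior comes entirely from the training data.
class DrivingPolicy(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(           # toy CNN standing in for the vision stack
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(32, 2)            # outputs: steering, acceleration

    def forward(self, frames):
        return self.head(self.encoder(frames))

# Stand-in dataset: batches of (video frame, recorded human controls).
dataset = [(torch.randn(8, 3, 64, 64), torch.randn(8, 2)) for _ in range(10)]

policy = DrivingPolicy()
opt = torch.optim.Adam(policy.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

for frames, human_controls in dataset:
    pred = policy(frames)
    loss = loss_fn(pred, human_controls)  # penalize deviation from what humans did
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The catch is right there in the loss: the model is rewarded for matching human drivers, so if the humans in the clips roll stop signs and speed a little, so will the model.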

It's interesting because by strictly following the traffic rules you might in fact be a danger to others, but by driving like everyone else you're breaking the law. It's a good example of a situation where the "right" thing to do might not be the most intuitive one, though in this case it's still up for debate.

[–] ultratiem@lemmy.ca 77 points 1 year ago (2 children)

That’s what we were all clamoring for: a self-driving machine that operates like a mouth breather late for work.

Elon is a masterclass in stupid.

[–] EpsilonVonVehron@lemmy.world 6 points 1 year ago (1 children)

Musk doesn’t care about laws. As mentioned in another article, he appears to be operating his phone by hand from the driver’s seat, which is both a driving violation and against Tesla’s own driver manual.

[–] ultratiem@lemmy.ca 4 points 1 year ago* (last edited 1 year ago)

Same guy who parades around in his private jet calling everyone who doesn’t return to the office amoral and selfish.

So yeah. All that tracks. The entire “it’s different because it’s me” stench wafting in.

[–] PipedLinkBot@feddit.rocks 8 points 1 year ago

Here is an alternative Piped link(s): https://piped.video/watch?v=ZI7-Swmuo4A

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I'm open-source, check me out at GitHub.

[–] AnUnusualRelic@lemmy.world 6 points 1 year ago

Autonomous cars will only work properly in areas where humans aren't allowed to drive.

[–] variaatio@sopuli.xyz 5 points 1 year ago* (last edited 1 year ago) (1 children)

> It’s interesting because by strictly following traffic rules you might in fact be a danger to others but by driving like humans you’re also breaking the law.

Well, the others should also stop breaking the law; then things are safe again. One doesn't solve the murder problem by making murder legal. If someone is a danger to someone else while driving legally, the source of the problem is the other person's behaviour, since the traffic rules don't include anything like "be obnoxious and a hindrance to others".

Other drivers must drive expecting that the others involved might drive by the rules: leave enough room in case the car in front actually does stop at the stop sign, since it might have to make an emergency stop anyway. If you aren't far enough back to allow for a stop-sign stop, you certainly don't have the safe distance to handle the car in front braking hard in a suddenly developing situation, as you should anticipate. One must always leave avoidance distance.

Drive at the speed limit and not a little over? If an overtaker gets "annoyed" by someone driving at the speed limit, attempts a dangerous overtake they shouldn't, and causes a crash, that is the overtaker's fault.

There are very, very few cases where driving by the rules is itself the source of danger. Other drivers being foolhardy, emotional idiots is the source of danger, and the fault will and should land with the foolhardy idiot.

As NHTSA said when making Tesla remove the "California stop" (rolling through a stop sign without fully stopping): others breaking the law doesn't make it legal for you. In fact, that arbitrary cultural behaviour, which some follow and some don't, is itself a source of danger because of the uncertainty it creates.

edit: So in the long term the car is safer by following the rules, since that induces others to drive legally and predictably. Especially since machines don't pick up on human non-verbal hints and so on, the only sensible route for a driving machine, as opposed to a driving human, is to follow the traffic rules strictly, because that makes it a predictable player. Humans have no cultural way to gauge how a "driving machine" would behave unless it behaves by the one publicly known precedent it could be expected to follow: driving by the rules to the letter. And the rules do include the simple rule that every driver has an obligation to try to avoid a collision, or to minimize it when it cannot be avoided; no amount of "but the rules say" overrides that. So there will be no cyborg car bowling down a pedestrian or another car because "technically the other person was breaking the law; the car had right of way".

[–] Thorny_Thicket@sopuli.xyz 1 points 1 year ago (1 children)

I obviously don't know for sure, but it's at least conceivable that erratic behavior by other drivers, triggered by someone driving slower than them, leads to a significant number of accidents every year that would not have happened had that person been driving at the same speed as everyone else.

In that case, forcing the self-driving vehicle to never go over the speed limit literally means knowingly choosing the option that leads to more people dying instead of fewer.

I think there's a pretty clear moral dilemma here. I'm not claiming to know the right way forward, but I just want to point out that strictly following the rules without exception is not always what leads to the best results. Of course, allowing self-driving cars to break the rules comes with its own issues, but that just further points to the complexity of this issue.

[–] variaatio@sopuli.xyz 1 points 1 year ago (1 children)

Then again, if that "follow others" behavior means driving faster, that also leads to accidents. Not so much with the other, frustrated drivers, but with, say, wildlife. People are unable to stop in time more often due to the increased speed and thus the increased braking distance.

That is why narrow, bendy roads have lower speed limits: the limit is a function of the predicted reaction time and the sight distance you have.

You can't cheat physics: the more speeding there is, the longer the braking distances (they grow with the square of the speed), and the more often it isn't a near miss thanks to braking in time but a full-on collision.

So sure, everyone is more in sync, but everyone is in sync with less reaction time available when the unavoidable chaos factor raises its head. Chaos factors like wildlife (which is neither obligated nor able to follow traffic rules) or, say, someone bursting a tire, leading to a sudden change in speed and control.
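To put rough numbers on that, here's a back-of-the-envelope sketch. The 1.5 s reaction time and 7 m/s² deceleration are assumptions (roughly an alert driver braking hard on dry asphalt), not measurements:

```python
# Stopping distance = reaction distance + braking distance.
# Assumed values, for illustration only:
REACTION_TIME_S = 1.5     # alert driver
DECELERATION_MS2 = 7.0    # hard braking on dry asphalt

def stopping_distance_m(speed_kmh: float) -> float:
    v = speed_kmh / 3.6                          # km/h -> m/s
    reaction = v * REACTION_TIME_S               # distance covered before braking starts
    braking = v ** 2 / (2 * DECELERATION_MS2)    # from v^2 = 2 * a * d
    return reaction + braking

for kmh in (50, 60, 80, 100):
    print(f"{kmh} km/h -> {stopping_distance_m(kmh):.0f} m")
# 50 km/h -> 35 m, 60 km/h -> 45 m, 80 km/h -> 69 m, 100 km/h -> 97 m
```

Because the braking term is quadratic, going from 80 to 100 km/h adds almost three times as much stopping distance as going from 50 to 60.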

[–] Thorny_Thicket@sopuli.xyz 2 points 1 year ago

When a self-driving car drives at or below the speed limit on a fast-moving highway, it can disrupt the natural flow of traffic. This can lead to a higher chance of accidents when other human drivers resort to aggressive maneuvers like tailgating, risky overtaking, or sudden lane changes. I'm not claiming that it does so for a fact, but it is conceivable, and that's the point of my argument.

Now, contrast this with a self-driving car that adjusts its speed to match the prevailing traffic conditions, even if it means slightly exceeding the speed limit. By doing so, it can blend with the surrounding traffic and reduce the chances of accidents. It's not about encouraging speeding but rather adapting to the behavior of other human drivers.

Of course, we should prioritize safety and adhere to traffic rules whenever possible. However, sometimes the safest thing to do might be temporarily going with the flow, even if it means bending the speed limit rules slightly. The paradox lies in the fact that by mimicking human behavior to a certain extent, self-driving cars can contribute to overall road safety. It's a nuanced issue, but it underscores the complexity of integrating autonomous vehicles into a world where human drivers are far from perfect. This would not be an issue if every car were driven by a competent AI and there were no human drivers.
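If you wanted to encode that compromise explicitly, it might look like the sketch below. The 10% tolerance is a made-up number purely for illustration, not anyone's actual policy:

```python
def target_speed_kmh(speed_limit: float, traffic_flow: float,
                     tolerance: float = 0.10) -> float:
    """Match prevailing traffic, but never exceed the posted limit by more
    than `tolerance` (10% here, a hypothetical cap)."""
    hard_cap = speed_limit * (1 + tolerance)
    return min(max(traffic_flow, 0.0), hard_cap)

# Traffic flowing at 112 km/h in a 100 zone: drive ~110, neither a rolling
# roadblock at 100 nor a full participant in the speeding at 112.
print(target_speed_kmh(100, 112))  # -> ~110 km/h
```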

[–] DampSquid@feddit.uk 20 points 1 year ago* (last edited 1 year ago) (1 children)

I assume this is all just bullshit and lies like last time?

[–] Honytawk@lemmy.zip 7 points 1 year ago

When was it ever different from old musky?

How long before the Cybertruck is released again?

[–] atfergs@lemmy.world 7 points 1 year ago

I wonder who got fired after that.

[–] EdibleFriend@lemmy.world 6 points 1 year ago (2 children)

Eh, I hate the dipshit, but he has a point. It's not really doxxing when he literally just googled it live.

[–] silvercove@lemdro.id 36 points 1 year ago (1 children)

Is that why he has been trying to ban ElonJet?

[–] EdibleFriend@lemmy.world 19 points 1 year ago (1 children)

lol, already talked about that elsewhere. Yep, it's exactly the same thing. I'm not saying he isn't a hypocrite.

[–] Aurenkin@sh.itjust.works 4 points 1 year ago (1 children)

Excuse me, this is the internet. You have to form a view on someone and then either agree or disagree with everything they do consistently otherwise it's illegal.

[–] EdibleFriend@lemmy.world 3 points 1 year ago

Lol, yep. People are talking to me like I'm some kind of fanboy when all I really said was that maybe, this time, he didn't actually strangle a puppy.

[–] lazyvar@programming.dev 2 points 1 year ago (2 children)

Isn’t that a little bit of circular reasoning?

If I doxx someone online and it gets indexed by Google, does it stop being doxxing once someone else googles the information?

I’d assume most doxxing isn’t done by someone who has unique firsthand knowledge (e.g. “Oh I know John, he lives on so and so road”) and instead is done by finding the information online whether via Google or a different public source.

At least in the US, where a ridiculous amount of private information is deemed “public”.

[–] timkenhan@sopuli.xyz 6 points 1 year ago (1 children)

Releasing the information and acquiring the released information are two different things.

[–] lazyvar@programming.dev 2 points 1 year ago (1 children)

Most doxxers don't technically release the information; rather, they've acquired it and point others to where they acquired it, or simply disseminate it further.

[–] indepndnt@lemmy.world 1 points 1 year ago

> disseminate it further.

AKA release it.

[–] EdibleFriend@lemmy.world 5 points 1 year ago (1 children)

Not really? Because in your scenario, Musk would have to be the person who originally posted the info. He didn't even have to go drop a few bucks on Spokeo or something.

[–] lazyvar@programming.dev 1 points 1 year ago

That's what I'm saying. In most cases the doxxer isn't the one who originally provided the info, but rather someone who found the information online via a Google search or something similar.

[–] skymtf@lemmy.blahaj.zone 5 points 1 year ago

near miss red light lmao

[–] Vub@lemmy.world 4 points 1 year ago (1 children)

I don’t understand much of this stuff but does this mean they (he…) threw a decade of research out the window and instead fed an AI loads of video data to start over from scratch?

[–] ours@lemmy.film 3 points 1 year ago

I'm guessing it's just more complicated than that. Training an AI model with loads of video data is how they got this far, but they seem to be hitting the limits of the current process/sensors.

[–] autotldr@lemmings.world 4 points 1 year ago

This is the best summary I could come up with:


“That’s why we’ve not released this to the public yet.” (FSD is technically beta software, though Musk has said that v12 will be the first time Tesla removes that label.)

But the moment when Musk was forced to intervene at the traffic light has already been seized upon by critics who say Tesla’s approach to autonomous driving is insufficient and reckless.

Musk has said that FSD is being tested as beta software to emphasize the need for drivers to pay attention to the road while using the driver-assist feature.

(Remember, Musk has banned the @ElonJet account that tracks his private jet from X/Twitter, claiming it was a “direct personal safety risk” to him.)

The broader context here is that the federal government’s two-year investigation into Tesla’s highway driver-assist feature, Autopilot, is nearing its end, which may have prompted Musk to post the video as provocation.

The government could force a recall of Autopilot and, by extension, FSD, which could affect Tesla’s valuation, much of which hinges on the company’s promise that it will offer full autonomy to its customers in the near future.


The original article contains 656 words, the summary contains 184 words. Saved 72%. I'm a bot and I'm open source!