this post was submitted on 16 Mar 2025
1288 points (99.4% liked)
Technology
you are viewing a single comment's thread
The actual wall is way more convincing though.
Still, this is something the car ought to take into account. What if there's glass in the way?
Glass would be very interesting, might actually confuse lidar also.
That might have been an even "simpler" test.
Yes, but Styrofoam probably damages the car less than shards of glass.
Glass would be far more likely to injure the driver or the people around the set, simply because it's a heavier material than Styrofoam.
Yes, I think a human driver who isn't half asleep would notice that something is weird, and would at least slow down.
A camera will show it as being more convincing than it is. It would be way more obvious in real life when seen with two eyes. These kinds of murals are only convincing from one specific point.
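The two-eyes point can be made concrete: a trompe-l'oeil mural only lines up from one vantage. With two viewpoints (two eyes, or a stereo camera pair), a flat surface reports one roughly constant depth, while a real road reports depths that increase with distance. A purely illustrative sketch, with all numbers and names invented for the example:

```python
# Illustrative only: distinguishing a flat mural from a real scene
# using per-point depth estimates (as a stereo pair would produce).

def looks_flat(depths_m, tolerance_m=0.5):
    """True if all measured depths agree within a tolerance,
    i.e. the scene is a flat plane facing the viewer."""
    return max(depths_m) - min(depths_m) < tolerance_m

# A mural: every sampled point sits at roughly the same distance.
print(looks_flat([20.0, 20.1, 19.9]))   # True

# A real road receding into the distance: depths vary widely.
print(looks_flat([10.0, 30.0, 80.0]))   # False
```

A single camera has no such depth signal, which is exactly why the mural only needs to be convincing from one point.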
…and clearly this photo wasn't taken from that point. In fact, it looks like a straight road from one of the camera angles he chooses later, but not, as far as I can tell, from the POV of the car.
That's true, but it's still far more understandable that a car without lidar would be fooled by it. And there's no way you would ever end up in such a situation, whereas the scenario in the thumbnail could actually happen. That's why it's so misleading; can people not see that?
I absolutely hate Elon Musk and support boycott of Tesla and Starlink, but this is a bit too misleading even with that in mind.
So, your comment got me thinking... surely, in a big country like the US of A, this mural must actually exist already, right?
Of course it does. It's an art piece in Columbia, S.C.: https://img.atlasobscura.com/90srIbBi-XX-H9u6i_RykKIinRXlpclCHtk-QPSHixk/rt:fit/w:1200/q:80/sm:1/scp:1/ar:1/aHR0cHM6Ly9hdGxh/cy1kZXYuczMuYW1h/em9uYXdzLmNvbS91/cGxvYWRzL3BsYWNl/X2ltYWdlcy85ZTUw/M2ZkZDAxZjVhN2Rm/NmVfOTIyNjQ4NjQ0/OF80YWVhNzFkZjY0/X3ouanBn.webp
A full article about it: https://www.atlasobscura.com/places/tunnelvision
How would Tesla FSD react to Tunnelvision, I wonder? How would Tesla FSD react to an overturned semi truck with a realistic depiction of a highway on it? JK, Tesla FSD crashes directly into overturned semis even without the image depiction issue.
I don't think the test is misleading. It's puffed up for entertainment purposes, but in being puffed up, it draws attention to an important drawback of optical-only self-driving cars, which is otherwise a difficult and arcane topic to draw everyday people's attention to.
Good find! I must say I'm surprised that's legal, but it's probably more obvious in reality, and it includes the sun, which is probably also a pretty obvious giveaway to a human.
But it might fool the Tesla?
Regarding the semi video: WTF?
But I've said for years that Tesla cars aren't safe for roads. And that's not just FSD; they are inherently unsafe in many really, really stupid ways.
Blinker buttons on the steering wheel, hidden emergency door handles, emergency braking for no reason, a distracting screen interface. In Denmark, 30% of Tesla Model 3s fail their first 4-year safety check.
There have been publicized stats claiming they aren't worse than other cars, when in fact the "other cars" were on average 10 years older. Newer cars obviously ought to be safer, because they should be in better condition.
As much as I want to hate on Tesla, seeing this, it hardly seems like a fair test.
From the perspective of the car, it's almost perfectly lined up with the background. It's a very realistic painting, and any AI trained on image data would obviously struggle with this. AI doesn't have the human component that lets us infer information from context. We can see the borders and know that they don't fit; they shouldn't be there. So even if the painting is perfectly lined up and looks photorealistic, we can tell something is up, because it has edges and a frame holding it up.
This test, in the context of the title of this article, relies on a fairly dumb pretense:
This doesn't just affect Teslas. This affects any car that uses AI assistance for driving.
Having said all that… fuck Elon Musk and fuck his stupid cars.
I am fairly dumb. Like, I am both dumb and I am fair-handed.
But, I am not pretentious!
So, let's talk about your points and the title. You said I had fairly dumb pretenses; let's talk through those.
This does just impact Teslas, because they don't use LiDAR. To my knowledge, theirs is the only popular ADAS on the American market that would be fooled by a test like this.
Near as I can tell, you're basically wrong point by point here.
Excuse me.
Did you write the article? I genuinely wasn't aiming my comment at you. It was merely commentary on the context implied by the title. I just watched a clip of the car hitting the board. I didn't read the article, so I specified that I was referring to the article's title. Not the author, not the article itself. Because it's the title I was commenting on.
That wasn't an 18-wheeler; it was a ground-level board with a photorealistic picture that matched the background it was set up against. It wasn't a mural on a wall, or some other illusion with completely different properties. So no, I think this extremely specific setup for this test is unrealistic and is not comparable to actual scientific research, which I don't dispute. I don't dispute that the lack of LiDAR is why Teslas have this issue, or that an autonomous driving system with only one type of sensor is a bad one. Again: I said I hate Elon and Tesla. Always have.
All I was saying is that this test, which is designed in a very specific way and produces a very specific result, is pointless. It's like me getting a bucket with a hole in it, hypothesizing that if I pour in water it will leak out of the hole, then proving that and saying: look! A bucket with a hole in it leaks water…
Except for, you know… cars that don't solely rely on optical input and have LiDAR, for example.
Fair point. But it doesn't address the other things I said, really.
But I suppose, based on already getting downvoted, that I've got a bad take. Either that, or the people downvoting me don't understand that I can hate Tesla and Elon, think their cars are shit, and still see that tests like this can be nuanced. The attitude that paints with a broad brush is the type of attitude that got Trump elected…
No, it's just a bad take. Every other manufacturer of self-driving vehicles (even partially self-driving ones, with features like automatic braking) uses LiDAR, because it solves a whole host of problems like this. Only Tesla doesn't, because Elon thinks he's a big-brain genius. There have been plenty of real-world accidents with less cartoonish circumstances involving Teslas that also would have been avoided if they just had LiDAR sensors. Mark just chose an especially flashy way to illustrate the problem. Sometimes flashy is the best way to get a point across.
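To sketch why a range sensor defeats the painted wall: a camera can be fooled by what is painted on a surface, but a laser return reports the actual distance to that surface. This is a hedged illustration, not any vendor's real API; the function, ranges, and threshold are all invented for the example.

```python
# Illustrative only: emergency-stop logic driven by LiDAR range returns.

def obstacle_ahead(lidar_ranges_m, braking_distance_m=30.0):
    """Return True if any forward LiDAR return is closer than the
    (hypothetical) distance needed to brake safely."""
    return any(r < braking_distance_m for r in lidar_ranges_m)

# Open road: nearest returns are far away, so no emergency stop.
print(obstacle_ahead([80.0, 95.0, 120.0]))   # False

# Painted wall 20 m ahead: the laser measures the board itself,
# no matter how convincing the mural on it looks.
print(obstacle_ahead([20.1, 19.8, 20.3]))    # True
```

The point being that the mural attacks the perception layer, and a second sensing modality sidesteps that layer entirely.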
I agree the wall is convincing and that it’s not surprising that the Tesla didn’t detect it, but I think where your comment rubs the wrong way is that you seem to be letting Tesla off the hook for making a choice to use the wrong technology.
I think you and the article/video agree on the point that any car based only on images will struggle with this but the conclusion you drew is that it’s an unfair test while the conclusion should be that NO car should rely only on images.
Is this situation likely to happen in the real world? No. But that doesn't make the test unfair to Tesla. This was an intentional choice they made, and it's absolutely fair to call them out on the dangers of that choice.
That's fair.
I didn't intend to give Tesla a pass. I hoped that qualifying what I said with a "fuck Tesla and fuck Elon" would show that.
But I hadn't thought about it that way.
In my defense, my point was more "what did you expect the car to do" in a test designed to show that a system not built to perform a specific function can't perform that function.
We know that self-driving is bullshit, especially the Tesla brand of it. So what are Mark's test and video really accomplishing?
But on reflection, I guess there are still a lot of people out there who don't know this stuff, so at the very least, a popular channel like his will go a long way toward raising awareness of this sort of flaw.
In this case, yes, but in general, downvotes just mean your take is unpopular. The downvotes could be from people who don't like Tesla and see any defense of Tesla as worthy of downvotes.
So good on you for making the point that you believe in. It's good to try to understand why something you wrote was downvoted instead of just knee-jerk assuming that it's because it's a "bad take."
I agree that this just isn't a realistic problem, and that there are far more realistic problems with Teslas.
Tell that to the guy who lost his head when his Tesla thought a reflective semi truck was the sky
I'm so glad I wasn't the only person who immediately thought "This is some Wile E. Coyote shit."
I mean, it is also referenced in the article and even in the summary from OP.
And extensively in the video too.