What the fuck is wrong with you guys. This is absolutely dystopian shit right there.
This is not "nice" or "neat"?!
It's straight up awful. It's war.
A system that can maneuver autonomously is dystopian? Lol, what?
One can find interest in an object's technological design while still acknowledging its horror when put to practical use. They aren't mutually exclusive options.
The tank-like robot has the ability to transport itself to a preset destination. It can also spot and avoid obstacles by utilizing dozens of sensors and an advanced driving system. Moreover, the platform can self-destruct if it falls into enemy hands.
It is not an autonomous weapons system. It is a platform that can maneuver autonomously.
Targeting for the weapon itself is done by a human remotely at least right?
...
...right?
*eta: yeah, it looks like it has a remote driver who can take over the steering and control the gun with a little PS4 controller thingy
How long until there is a version that would let the operator upload a photo of the target and the gun bot would seek and shoot that target automatically, with 99.9% face detection accuracy?
Turkey says it has flying drones that have already done that, so ...
One more step towards the inevitable weapons-free platform that will eventually come.
Braindead militarists never cease to believe that "only we have it, and therefore..."
The truth is that there will be life after the war, and the war makes it miserable.
Yes, for both sides.
"Semi-autonomous" doesn't really mean anything, and the headline is deliberately sensationalist.
The key technological discussion is when it's not a human pulling the trigger.
Even guns are semi-autonomous by this definition.
I'd have to assume it has some sort of finding/tracking tech as well to stay on target. Trying to compare this to a handgun is just silly.
The thing is, will it select a target and fire without manual intervention? I'm less worried about it moving autonomously than about it killing autonomously. Not that in this case I think it will make any difference if a civilian encounters it.
The Phalanx CIWS has been doing this since the '70s. The main point is that the headline is implying something new about the platform's semi-autonomous nature that hasn't previously been used and is thus noteworthy. A handgun automates much of the labour involved in applying kinetic force to another human being, reducing it down to a button press that anyone can do.
Trying to suggest this is somehow newsworthy, and that they aren't just fishing for clickbait headlines, is silly.
Australia already has area-denial sentries that autonomously shoot at any motion (with some parameters regarding size and speed).
These, or a similar technology, were used for a while along the Korean DMZ until we started talking about building autonomous drones.
One of the shot-down airliner incidents (Iran Air Flight 655) involved a misdesignation of a sensor contact by a US Aegis missile cruiser, the USS Vincennes. The crew believed the contact was an Iranian F-14 fighter (which ruled out an ordinary airliner). The Aegis required a human to authorize an attack, but it reported the contact as a bogey (unknown, presumed to be hostile)...
— Apparently, I posted this without finishing it. —
So that instance might be considered the first historical case of an autonomous weapon system accidentally killing a civilian (at least partially civilian) target, given the human doing the authorizing had inadequate data to make an informed decision.
(A lot of cruelty of our systems comes from authorizations based on partial data. Law enforcement in the US is renowned for massaging their warrants to make them easy on the signing magistrate, resulting in kids and dogs slain during SWAT raids in poor neighborhoods. I'm ranting.)
"and prevents risks to human life"... no implications there, I'm sure.
This story is from 2021.
Yes, but more relevant now than then, no?
Almost 20 years ago (through some bad connections) I was shown a military video from the development of this horror.
The Israelis were very proud in the video to show this thing driving up in front of a poor house and firing from outside, through the wall, at anything (eventually) living inside. They didn't give a fuck whether it would be women, children, or anyone else. (Of course the house was empty for the research phase; well, I hope so.)
This is the kind of monstrosity Palestinians are facing now.
I was hoping to participate in this conversation with some long, interesting back and forths with different people about this inevitable, emerging technology. Then I scrolled the comments section...
Let's hope that the "AI" doing the aiming was programmed by Microsoft. That way, it would at least not hit anybody....
Guess I'm done talking shit about Clippy.
It looks like you are trying to violate the Geneva Convention. Would you like help?
☐ Do not show this message again
Looks like you're trying to oppress a population. Would you like Microsoft Bing AI to draft a press release to establish a narrative justifying automated anti-personnel weapons deployed against civilians?
Introducing our new Stormtrooper™ AI!
Scary, but neat.
Here we go. I can't wait for the Boston Dynamics one to be outfitted with a .50 and drone support.
That's going to be totally cool and amazing watching it launch its 300 lb body off stuff all Parkour like 360 no scoping dudes while the drone drops grenades.
/s in case you need it. This is totally going to suck.
If the operators can select targets and the drone locks on and fires, that crosses into a high-risk moral gray zone, since UIs are susceptible to misclicks and the drone may not consider collateral consequences (such as overpenetration).
If the drone can autonomously target and attack on its own algorithms, add to that the inevitable miscalculations meaning it will eventually kill a target that a soldier would not.
In the meantime, anarchists and revolutionaries should examine how to convince it it's been compromised, so that it self-destructs.
And if it requires a signal from home to auto-destruct, how to block the affirmative signal.
I suspect the GLA is going to develop thick, sticky smoke bombs and signal jammers to make our ground drone blind and isolated. Then it can be neutralized and salvaged for parts.
What is a drone for 500.
OK, but drones are only allowed to shoot drones. Military companies still win, as that's what this is all about, right?
A few weeks later.
Aw man, someone coded the routine that checks if the person is from Hamas to always return true. Woopsie Doodle.
for hamas in crowd:
    open_fire()
With proper ammunition and fire control that could actually work. It's kind of surprising how little development there has been in this space. A civilian smart rifle has been around for almost a decade.