this post was submitted on 04 Jan 2024
-21 points (34.8% liked)

Technology

all 19 comments
[–] TheGrandNagus@lemmy.world 23 points 10 months ago (1 children)

This is just based on a survey of researchers (not even AI researchers specifically) who were asked whether they think it'll happen or not, and who gave back anything from vague predictions to complete gut-feeling guesses.

It's a worthless article. Not that I even needed to click to find that out.

[–] themeatbridge@lemmy.world 5 points 10 months ago* (last edited 10 months ago) (1 children)

83% of statisticians polled say you cannot predict future events from the statistical results of an opinion poll, which means you can still make those predictions 17% of the time. Maybe this is one of the 17%?

[–] hoot@lemmy.ca 3 points 10 months ago (1 children)

67% of statistics are made up

[–] themeatbridge@lemmy.world 5 points 10 months ago

Bartlet: Sweden has a 100% literacy rate, Leo. 100%! How do they do that?

McGarry: Well, maybe they don't and they also can't count.

[–] JoMomma@lemm.ee 8 points 10 months ago* (last edited 10 months ago)

+/-95%

"AI experts’ predictions should not be seen as a reliable guide to objective truth"

[–] hoot@lemmy.ca 7 points 10 months ago (1 children)

"Researchers say"

Researchers also say things they pulled out of their ass 90% of the time.

Source: I researched it

[–] GigglyBobble@kbin.social 1 points 10 months ago

“Researchers say” isn't something researchers say, though; it's journalists or "journalists" (bloggers, etc.).

[–] drdiddlybadger@pawb.social 5 points 10 months ago

Guys whose jobs depend on hype around their field continue to hype their field. News at 11.

All that ad space just to leave out the whys.

[–] originalucifer@moist.catsweat.com 4 points 10 months ago

this is just a subset of the 100% chance humans will cause human extinction.

'by ai' is just one of very many options we're looking into...

[–] HeavyDogFeet@lemmy.world 4 points 10 months ago

What are the chances that it won’t cause extinction, but will just make things a bit (or maybe a lot) worse for a lot of people?

I don’t think any reasonable people are all that concerned with the doomsday scenario as much as with the slow march towards a bullshit future.

[–] silverbax@lemmy.world 3 points 10 months ago

So, 95% chance that humans will cause human extinction.

And humans created AI, so even if AI does in the human race, it will still have been humans.

I guess if humans go extinct, it's close to 100% due to humans.

[–] NOT_RICK@lemmy.world 3 points 10 months ago

We’re good so long as nobody rolls a nat 1

[–] autotldr@lemmings.world 2 points 10 months ago

This is the best summary I could come up with:


In the short term, researchers estimate AI will become significantly more advanced, able to create a Top 40 pop song and write an NYT bestseller before 2030.

“While AI experts’ predictions should not be seen as a reliable guide to objective truth, they can provide one important piece of the puzzle,” said researchers from Berkeley and the University of Oxford who conducted the study in December.

“Their familiarity with the technology and the dynamics of its past progress puts them in a good position to make educated guesses about the future of AI.”

The study highlights the perceived danger around creating a powerful artificial intelligence from the world’s leading researchers.

Anthropic has a constitution laid over its AI systems to ensure they act in alignment with our society’s rules.

Experts anticipate that AI will be able to assemble LEGOs, translate newfound languages, and build video games before 2033.


The original article contains 402 words, the summary contains 148 words. Saved 63%. I'm a bot and I'm open source!

[–] Wiitigo@lemmy.world 2 points 10 months ago

I like them odds

[–] LainOfTheWired@lemy.lol 1 points 10 months ago (1 children)

Didn't a bunch of "researchers" think the world was going to end in 2012 for some reason?

[–] ripcord@kbin.social 5 points 10 months ago

No?

I mean, maybe if you stretch those airquotes pretty hard, but I'm not sure why you'd compare those people to this case.

[–] hal_5700X@lemmy.world 1 points 10 months ago (1 children)

You have nothing to be afraid of, Dave.

Can it happen? Maybe. Will it happen? Beats me, man.

[–] WHYAREWEALLCAPS@kbin.social 2 points 10 months ago

The more I think about it, the more I suspect it'll be completely by accident. Some AI-designed drug will pass all trials, get approved, and be used for years or decades, only for us to find out that some bit of it kind of acts like a generic prion that affects all life. Oh, and that bit also passes right through you, too, so by the time they figure it out, the pseudo-prion is already out there in the wild, infecting fish and other aquatic creatures. And before long the ecosystems of the world's oceans and lakes collapse.

Meanwhile land animals also start dying off due to their drinking supply being polluted by it, so a full-scale ecological collapse begins. As the pseudo-prion sticks around indefinitely, every attempt by nature to evolve new life ends because of it. Eventually it gets buried by all the detritus of time and new life does once again rise, only to have humanity's ticking time bomb waiting in the ground for something to dig it up and start the whole cycle all over again...