this post was submitted on 14 Jul 2024
483 points (96.5% liked)

AI is overhyped and unreliable -Goldman Sachs

https://www.404media.co/goldman-sachs-ai-is-overhyped-wildly-expensive-and-unreliable/

"Despite its expensive price tag, the technology is nowhere near where it needs to be in order to be useful for even such basic tasks"

@technology@lemmy.world

[–] gedaliyah@lemmy.world 57 points 4 months ago (5 children)

I remember saying a year ago when everybody was talking about the AI revolution: The AI revolution already happened. We've seen what it can do, and it won't expand much more.

Most people were shocked by that statement because it seemed like AI was just getting started. But here we are, a year later, and I still think it's true.

[–] Sterile_Technique@lemmy.world 23 points 4 months ago (1 children)

Those people were talking about the kind of AI we see in sci-fi, not the spellchecker-on-steroids we have today. There used to be a distinction, but marketing has muddied those waters. The sci-fi variety has been rebranded as "AGI," so I guess the rest of that talk goes right along with it - the 'AGI singularity' and such.

All still theoretically possible, but I imagine climate change will take us out, or we'll find some clever new way to make ourselves extinct, before real AI ...or AGI... becomes a thing.

[–] jaybone@lemmy.world 11 points 4 months ago

Given AI’s energy needs, it’s already helping to take us out.

[–] OutlierBlue@lemmy.ca 13 points 4 months ago (5 children)

The AI revolution already happened. We’ve seen what it can do, and it won’t expand much more.

That's like seeing a basic electronic calculator in the 60s and saying that computing won't expand much more. Full-AI isn't here yet, but it's coming, and it will far exceed everything that we have right now.

[–] HackyHorse3000@lemmy.world 26 points 4 months ago (1 children)

That's the thing though: the comparison doesn't hold, and it misses the point entirely. "AI" in this context, and in the current conversations about it, specifically means LLMs. They will not improve to the point of general intelligence, because that is not how they work. Hallucinations are inevitable with the current architectures and methods, and the models lack an inherent understanding of concepts in general. It's the same reason they can't do math or logic problems that aren't common in the training set. It's not intelligence. Modern computers are built on the same principles and architectures as those calculators were, just iterated upon extensively. No such leap is possible with large language models. They are entirely reliant on a finite pool of data that they try to mimic as effectively as possible; they are not learning or understanding concepts the way "Full-AI" would need to in order to be reliable or able to generate new ideas.

[–] gedaliyah@lemmy.world 21 points 4 months ago

Oh, I'm not saying that a better technology that can do a lot more won't come along one day. What I'm saying is that the present technology will never do much more than it's already doing. This isn't a matter of refining the technology for more applications; it's a matter of developing a completely new type of technology.

Generative text, summarizing articles and books, writing short portions of code to assist humans, creating simple fan art and throwaway images like avatars and the stock photos at the top of articles, perhaps short animations, improving pattern recognition for things like speech and faces… in all of these areas, AI was very rapidly revolutionary.

Generative AI will not become capable of doing things that it's not already doing. Most of what it's replacing are just worse computer programs. Some new technology will undoubtedly be revolutionary, in the way that computers were a completely new revolution on top of basic function calculators.

People are developing quantum computers and mapping the precise functions of brain cells. If you want, you can download a completely mapped actual nematode brain right now. You can buy brain cells online, even human brain cells, and put them into computers. Maybe they can even run Doom. I have no idea what the next computing revolution will be capable of, but this one has mostly run its course. It has given us some very incredible tools in a very narrow scope, and those tools will continue to improve incrementally, but there will be no additional revolution.

[–] turmacar@lemmy.world 18 points 4 months ago* (last edited 4 months ago) (1 children)

Sure.

GPT-4 is not that. Neither will GPT-5 be. They are language models that marketing is calling AI. They have a very specific use case, and it's not something that can replace any work or workers requiring any level of traceability or accountability. It's just "the thing the machine said".

Marketing latched onto "AI" because blockchain and cloud and algorithmic had gotten stale and media and CEOs went nuts. Samsung is now producing an "AI" vacuum that adjusts suction between hardwood and carpet. That's not new technology. That's not even a new way of doing that technology. It's just jumping on the bandwagon.

[–] aesthelete@lemmy.world 2 points 4 months ago

Marketing latched onto “AI” because blockchain and cloud and algorithmic had gotten stale and media and CEOs went nuts.

Notably, this also coincided with the first higher interest rate environment in the broader economy in over a decade.

[–] ChickenLadyLovesLife@lemmy.world 8 points 4 months ago

That’s like seeing a basic electronic calculator in the 60s and saying that computing won’t expand much more.

"Who would ever need more than 640K of RAM?" -Bill Gates

[–] raspberriesareyummy@lemmy.world -4 points 4 months ago* (last edited 4 months ago)

Full-AI isn’t here yet, but it’s coming, and it will far exceed everything that we have right now.

Go back to school; hopefully your next statement won't sound as dumb.

[–] SlopppyEngineer@lemmy.world 6 points 4 months ago

AI development is indeed a series of S-curves and we're currently nearing the peak of the curve. It's going to be some time before the new S begins.

[–] SeattleRain@lemmy.world 2 points 4 months ago (1 children)

It'll expand but it will take 5-10 years. Just like Web 1.0 and 2.0.

[–] cley_faye@lemmy.world 8 points 4 months ago

Not with the current tech. It can go faster, produce more detailed output, maybe consume less too, but there seems to be a ceiling on what LLMs and their derivatives can do. There has been no improvement in that regard, and people who look into it are pretty confident it won't happen at this point.

[–] Uplink@programming.dev 2 points 4 months ago

I think it all depends on how good our tools for detecting AI-generated content become. If it's not distinguishable, then the internet is probably about to be flooded with AI-generated content, which in turn means AI will be trained more and more on AI content, degrading the models in the process.