this post was submitted on 19 Nov 2023
47 points (79.7% liked)

Unpopular Opinion

6292 readers

Welcome to the Unpopular Opinion community!


How voting works:

Vote the opposite of the norm.


If you agree that the opinion is unpopular, give it an arrow up. If it's something that's widely accepted, give it an arrow down.



Guidelines:

Tag your post, if possible (not required)


  • If your post is a "General" unpopular opinion, start the subject with [GENERAL].
  • If it is a Lemmy-specific unpopular opinion, start it with [LEMMY].


Rules:

1. NO POLITICS


Politics is everywhere. Let's make this about [general]- and [lemmy]-specific topics, and keep politics out of it.


2. Be civil.


Disagreements happen, but that doesn’t provide the right to personally attack others. No racism/sexism/bigotry. Please also refrain from gatekeeping others' opinions.


3. No bots, spam or self-promotion.


Only approved bots, which follow the guidelines for bots set by the instance, are allowed.


4. Shitposts and memes are allowed but...


Only until they prove to be a problem. They can and will be removed at moderator discretion.


5. No trolling.


This shouldn't need an explanation. If your post or comment is made just to get a rise out of people, with no real value, it will be removed. If you do this too often, you will get a vacation away from this community for one or more days to touch grass. Repeat offenses will result in a permanent ban.



Instance-wide rules always apply. https://legal.lemmy.world/tos/

founded 1 year ago

It seems most people are on board with the idea that AI will change the world. While I agree it will have some impact, I also think it is overinflated by marketing. Operating an AI takes huge computing power, which costs heaps of money and energy. So how are people suggesting that exponential improvement is feasible? I do not get it.

Further, aren't we supposed to be reducing energy usage? Why are we burning through what little is left? I hate how this is taking priority over the environment.

I'm creating this post mainly to rant. I thought OpenAI firing Sam Altman was a signal for a reality check, but it seems they are walking it back and trying to rehire him. What a drama.
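To make the "heaps of money and energy" claim concrete, here is a rough back-of-envelope sketch of the electricity cost of running an LLM. Every figure (GPU power draw, electricity price, token throughput) is a hypothetical assumption for illustration, not a measured number.

```python
# Back-of-envelope electricity cost of LLM inference.
# All figures below are illustrative assumptions, not measurements.

GPU_POWER_KW = 0.7             # assumed draw of one datacenter GPU under load
ELECTRICITY_USD_PER_KWH = 0.10 # assumed industrial electricity price
TOKENS_PER_SECOND = 50         # assumed throughput of one model replica

def energy_cost_per_million_tokens() -> float:
    """USD of electricity to generate one million tokens."""
    seconds = 1_000_000 / TOKENS_PER_SECOND
    kwh = GPU_POWER_KW * seconds / 3600
    return kwh * ELECTRICITY_USD_PER_KWH

print(f"~${energy_cost_per_million_tokens():.2f} of electricity per million tokens")
```

Under these assumptions the electricity itself is well under a dollar per million tokens; the larger costs are hardware amortization and the (vastly more energy-hungry) training runs, which the sketch deliberately leaves out.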

all 17 comments
sbv@sh.itjust.works 26 points 11 months ago

Some of it is overinflated marketing, but for organizations trying to cut costs it could have a significant effect on a lot of their employees.

AI doesn't need to be good. It just needs to be cheaper and good enough.

partial_accumen@lemmy.world 8 points 11 months ago (last edited 11 months ago)

Most people are assuming AI will do all the work of a job. Maybe it will someday, but my experience today is that it can do 80% of the work with only 20% human effort put in. So no, it's not doing 100% of the work; it's doing 80%, but it does that 80% in seconds for what used to take me hours or days.

That is a huge improvement over no AI use at all.

stevedidWHAT@lemmy.world 13 points 11 months ago

Improvement for whom?

someacnt@sopuli.xyz 1 point 11 months ago

Yeah, I mostly mean the AGI nonsense. There are jobs where AI is helpful, though IMO it's worth pointing out that not all of the improvement comes purely from the AI itself.

squirmy_wormy@lemmy.world 21 points 11 months ago

I'd argue this isn't unpopular to anyone who knows that "AI" is just pattern matching, marketed to people who don't understand tech.

People should actively be skeptical.

cm0002@lemmy.world 18 points 11 months ago

It's unsustainable right now because hardware and software are not aligned (yet). Software is currently outpacing hardware, but there are loads of companies working on specialist chips that will tackle both the computing power problem and the energy consumption problem through the sheer scale of the optimization benefits.

Plus, software optimizations are also well under way, and models are constantly being fine-tuned to run better and train better with less.

someacnt@sopuli.xyz 2 points 11 months ago

I doubt how good the results could be. I agree that a 10–100× improvement is feasible by optimizing the hardware. But hardware in general still needs to improve, and the speed of light is an impenetrable barrier standing in the way.

And more complete AI systems would require hundreds of thousands of times the computing power. Really, this has the same issue as Bitcoin.

M500@lemmy.ml 5 points 11 months ago

I think the specialized hardware for this task will be better than you expect. Right now it's like using a sledgehammer to carve something. Pretty soon the computer will be given a chisel, and it will be able to do its job much more easily.

someacnt@sopuli.xyz 3 points 11 months ago

I doubt it, since GPUs were already not a bad tool for this job. The generality of GPGPU helped a lot here.

qupada@kbin.social 6 points 11 months ago

Meanwhile...

https://www.theregister.com/2023/10/11/github_ai_copilot_microsoft/

[...] while Microsoft charges $10 a month for the service, the software giant is losing $20 a month per user on average and heavier users are costing the company as much as $80 [...]

Mmm hmmm.

This could be one form of "course correction": few people are going to care to participate if they're forced to pay what it actually costs.

Schal330@lemmy.world 4 points 11 months ago

I suspect this is all part of the long-term plan: provide the service at a reduced fee so people grow reliant on the tech, then increase the cost over time. We see this happen everywhere.

ttmrichter@lemmy.world 2 points 11 months ago

Current AI isn't in any meaningful sense "intelligent". It's all smoke, mirrors, horses, and ponies put on in a fancy performance designed to transfer money from the public purse (directly or indirectly) into the pockets of sociopathic billionaires.

Karlos_Cantana@sopuli.xyz 1 point 11 months ago

The "current gen AI" is the key here. How sustainable it is depends on how quickly it can grow and improve. Technology is growing much faster than in the past. I remember getting a dictation program in 1998. I had to spend 2 hours talking to it so it could learn my voice. Even after all that, it still only had about a 25% success rate in properly transcribing my text. In 2015 I bought my first smart watch. The first voice transcription I made from it was 100% correct with absolutely no learning of my voice at all.

I believe LLMs will quickly give way to a different type of AI. There may be several different approaches to AI before something really takes hold and changes the game.

deafboy@lemmy.world 1 point 11 months ago

Operating an AI takes huge computing power.

For now. There are already plans to accelerate some specific machine learning workloads on next generations of low powered mobile chips. Think ChatGPT on a smartphone.

For other use cases, you don't even need to wait. A Google Coral can do object recognition on your security camera feed using a minuscule amount of power compared to a GPU.

nodsocket@lemmy.world 1 point 11 months ago

This is definitely true, but keep in mind that there is a limit to how far you can optimize a chip. Eventually we could have everything running on ASICs, but electronics do have a maximum speed that we may not be far from reaching.