this post was submitted on 10 Dec 2024
109 points (100.0% liked)

TechTakes

I can't wait for the spectacular implosion

[–] db0@lemmy.dbzer0.com 15 points 2 weeks ago (1 children)

I expect a creative destruction, like what happened with the dotcom bubble. A ton of GenAI companies will go bust, the market will be flooded with cheap GPUs and other AI hardware that gets snapped up on the cheap, and enthusiasts and researchers will use them to build actually useful stuff.

[–] dgerard@awful.systems 11 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

these are compute GPUs that don't even have graphics ports

[–] db0@lemmy.dbzer0.com 15 points 2 weeks ago (2 children)

Yes, my point is that the compute from those chips can still be used. Maybe for actually useful machine learning tools developed later, or for some other technology that can make use of this kind of parallel computing.

I'm waiting on the A100 fire sale next year
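
As a rough illustration of what "using the compute" means: a card with no graphics ports happily runs any embarrassingly parallel job. A minimal CUDA sketch (a toy example, not tied to any real workload):

```cuda
// Minimal sketch: a compute-only card like an A100 needs no display
// output to do useful work -- any embarrassingly parallel job maps
// onto a kernel like this. Toy example, not a benchmark.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one element per thread
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));  // unified memory keeps the demo short
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %f\n", y[0]);  // expect 4.0
    cudaFree(x); cudaFree(y);
    return 0;
}
```

Compile with `nvcc saxpy.cu -o saxpy && ./saxpy`; no monitor attached required.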

[–] JackRiddle@sh.itjust.works 6 points 2 weeks ago

I know of at least one company that uses CUDA for ray tracing in what I believe is ground research, so there are definitely already some useful things happening.
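
I don't know what that company's pipeline looks like, but the core idea is easy to sketch: one thread per ray, each testing an intersection analytically. A hypothetical CUDA example with a single sphere:

```cuda
// Hypothetical sketch of GPU ray casting: one thread per ray, each
// testing intersection against a sphere at the origin. Real renderers
// (or survey codes) trace against far more complex geometry.
#include <cstdio>
#include <cmath>
#include <cuda_runtime.h>

struct Ray { float ox, oy, oz, dx, dy, dz; };

__global__ void intersect_sphere(const Ray *rays, float *t_hit,
                                 int n, float radius) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    Ray r = rays[i];
    // Solve |o + t*d|^2 = radius^2 for t (quadratic; a == 1 for unit directions).
    float b = 2.0f * (r.ox * r.dx + r.oy * r.dy + r.oz * r.dz);
    float c = r.ox * r.ox + r.oy * r.oy + r.oz * r.oz - radius * radius;
    float disc = b * b - 4.0f * c;
    t_hit[i] = (disc < 0.0f) ? -1.0f : (-b - sqrtf(disc)) * 0.5f;  // -1 = miss
}

int main() {
    const int n = 1024;
    Ray *rays; float *t_hit;
    cudaMallocManaged(&rays, n * sizeof(Ray));
    cudaMallocManaged(&t_hit, n * sizeof(float));
    for (int i = 0; i < n; ++i)  // all rays start at z = -5, pointing at +z
        rays[i] = {0.0f, 0.0f, -5.0f, 0.0f, 0.0f, 1.0f};

    intersect_sphere<<<(n + 255) / 256, 256>>>(rays, t_hit, n, 1.0f);
    cudaDeviceSynchronize();
    printf("first hit at t = %f\n", t_hit[0]);  // expect 4.0 (unit sphere)
    cudaFree(rays); cudaFree(t_hit);
    return 0;
}
```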

I mean, there are a lot of applications for linear algebra, although I admit I don't fully know how "AI" uses linear algebra or which other uses overlap with it.
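
For what it's worth, the overlap is pretty direct: a neural-network layer is just a matrix-vector product plus a nonlinearity, the same operation graphics and simulation code lean on. A hand-rolled CUDA sketch (real frameworks call cuBLAS / tensor cores instead of anything like this):

```cuda
// Sketch of why "AI" is mostly linear algebra: one fully connected
// neural-network layer is just y = relu(W*x + b). Production code
// uses cuBLAS / tensor cores, not a hand-rolled kernel like this.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void dense_layer(const float *W, const float *x, const float *b,
                            float *y, int rows, int cols) {
    int r = blockIdx.x * blockDim.x + threadIdx.x;  // one output per thread
    if (r >= rows) return;
    float acc = b[r];
    for (int c = 0; c < cols; ++c)
        acc += W[r * cols + c] * x[c];  // dot product: row of W with x
    y[r] = fmaxf(acc, 0.0f);            // ReLU nonlinearity
}

int main() {
    const int rows = 4, cols = 3;
    float *W, *x, *b, *y;
    cudaMallocManaged(&W, rows * cols * sizeof(float));
    cudaMallocManaged(&x, cols * sizeof(float));
    cudaMallocManaged(&b, rows * sizeof(float));
    cudaMallocManaged(&y, rows * sizeof(float));
    for (int i = 0; i < rows * cols; ++i) W[i] = 0.1f;
    for (int i = 0; i < cols; ++i) x[i] = 1.0f;
    for (int i = 0; i < rows; ++i) b[i] = 0.0f;

    dense_layer<<<1, 32>>>(W, x, b, y, rows, cols);
    cudaDeviceSynchronize();
    printf("y[0] = %f\n", y[0]);  // expect 0.3 (0.1 * 3 inputs)
    cudaFree(W); cudaFree(x); cudaFree(b); cudaFree(y);
    return 0;
}
```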