this post was submitted on 19 Jan 2024
160 points (96.0% liked)

Technology

[–] Oha@lemmy.ohaa.xyz 87 points 8 months ago (9 children)

what the fuck is an "AI PC"?

[–] Sabata11792@kbin.social 46 points 8 months ago (2 children)

They get to put a sticker on it that inflates the value by $600, then fill it with spyware.

[–] assassinatedbyCIA@lemmy.world 14 points 8 months ago (1 children)

‘CApItALiSm BreEdS INnoVAtION’

[–] Oha@lemmy.ohaa.xyz 5 points 8 months ago (1 children)
[–] Sabata11792@kbin.social 7 points 8 months ago

Found the shareholder.

[–] LodeMike@lemmy.today 41 points 8 months ago (1 children)

It means “VC money now 🥺🥺”

[–] Deceptichum@kbin.social 10 points 8 months ago (1 children)

Microsoft is chasing VC money now?

[–] maynarkh@feddit.nl 11 points 8 months ago

Not VC, more like hedge funds and institutional investors. But yes, all public companies work primarily for higher share prices, and everything else comes second. I've seen a public US company spend more than a million USD to save 300k, just so they could put out good press about keeping their promises to shareholders.

[–] LemmyIsFantastic@lemmy.world 19 points 8 months ago

Branding. It's just saying it's capable of handling local models for Copilot.

If I have to deal with Blockchain cloud computing IoT bullshit as a software engineer, I want everyone else to feel my buzzword pain in the tech they use.

[–] fidodo@lemmy.world 7 points 8 months ago

I guess a PC with a graphics card?

[–] cholesterol@lemmy.world 7 points 8 months ago

The new 'VR Ready'

[–] maynarkh@feddit.nl 57 points 8 months ago

Thus, Windows will again be instrumental in driving growth for the minimum memory capacity acceptable in new PCs.

I love that the primary driver towards more powerful hardware is Windows just bloating itself bigger and bigger. It's a grift in its own way: consumers are subsidizing the hardware requirements for Microsoft's idiotic data processing. And MSFT is not alone in this; Google doing away with cookies also conveniently shifts most ad processing from their servers into Chrome (while killing their competition).

[–] cmnybo@discuss.tchncs.de 34 points 8 months ago (1 children)

At least it should result in fewer laptops being made with ridiculously small amounts of non-upgradable RAM.

Requiring a large amount of compute power for AI is just stupid, though. It will probably come in the form of some sort of dedicated AI accelerator that's not usable for general-purpose computing.

[–] throws_lemy@lemmy.nz 17 points 8 months ago (1 children)

And remember that your data and telemetry are sent to Microsoft servers to train Copilot AI. You may also need to subscribe to some advanced AI features.

[–] DontMakeMoreBabies@kbin.social 6 points 8 months ago (2 children)

And that's when I'll start using Linux as my daily driver.

Honestly, installing Ubuntu is almost idiot-proof at this point.

[–] BaardFigur@lemmy.world 2 points 8 months ago
[–] throws_lemy@lemmy.nz 2 points 8 months ago* (last edited 8 months ago) (1 children)

I do agree with you; the obstacle is that many applications are either not available on Linux or not as powerful as they are on Windows. For me it's MS Excel: many of my office clients use VBA in Excel spreadsheets to do calculations.

[–] Reptorian@lemmy.zip 2 points 8 months ago* (last edited 8 months ago)

At least we might finally have a viable replacement for Photoshop soon. GIMP is getting NDE (non-destructive editing), Krita might get a foreground extraction tool at some point, and Pixellator might get better tools, though its NDE support is already solid. The thing is, all of them are missing something, but I'm betting on GIMP after CMYK_Student's arrival to GIMP development.

I tried adding foreground selection based on guided selection, but I was unable to fix the noise in the in-between selections and was unable to build Krita. We would have Krita with foreground selection if it weren't for that.

[–] thecrotch@sh.itjust.works 28 points 8 months ago (1 children)

Microsoft is desperate to regain the power they had in the 00s and is desperately trying to find that killer app. At least this time they're not just copying Apple's homework.

[–] Toribor@corndog.social 2 points 8 months ago (1 children)

They either force it on everyone or bundle it in the enterprise package businesses already pay for and then raise the price.

It never works, but maybe this time it will. I mean it won't... But maybe.

[–] tias@discuss.tchncs.de 2 points 8 months ago* (last edited 8 months ago)

And maybe that's why it isn't working. They try too hard to persuade or force you, giving people icky feelings from the get-go... and they try too little to just make a product that people want.

[–] frankpsy@lemm.ee 25 points 8 months ago (1 children)

Apple: what's wrong with just 8GB RAM?

[–] douglasg14b@lemmy.world 26 points 8 months ago (8 children)

Yeah, and solder it onto the board while you're at it! Who ever needs to upgrade or perform maintenance anyways?

[–] DumbAceDragon@sh.itjust.works 21 points 8 months ago (1 children)

"Wanna see me fill entire landfills with e-waste due to bullshit minimum requirements?"

"Wanna see me do it again?"

[–] archomrade@midwest.social 5 points 8 months ago

All I can think of:

Hi kids, do you like violence? Wanna see me stick nine-inch nails through each one of my eyelids? Wanna copy me and do exactly like I did? Try 'cid and get fucked up worse than my life is?

[–] furzegulo@lemmy.dbzer0.com 14 points 8 months ago (2 children)

No, AI ain't gonna come into my PC.

[–] Shurimal@kbin.social 9 points 8 months ago

Unless it's locally hosted, doesn't scan every single file on my storage and doesn't send everything I do with it to the manufacturer's server.

[–] Secret300@sh.itjust.works 3 points 8 months ago (1 children)

Personally I really want it to, but only locally run AI like LLaMA or whatever it's called.

[–] evranch@lemmy.ca 2 points 8 months ago (3 children)

Do it, it's easy and fun and you'll learn about the actual capabilities of the tech. I started a week ago and I'm a convert on the utility of local AI. I had to go back to Reddit for it, but r/localllama has tons of good info. You can actually run useful models at a conversational pace.

This whole thread is silly because VRAM is what you need; I'm running some pretty good coding and general-knowledge models on a 12GB Radeon. Almost none of my 32GB of system RAM is used, lol. Either Microsoft is out of touch or they're hiding an amazing new algorithm.

Running in system RAM works, but processing on a regular CPU is painfully slow, over 10x slower.
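
For anyone who wants to try it, here's a minimal sketch assuming llama-cpp-python built with GPU support and a quantized GGUF model you've already downloaded; the model path is just a placeholder, and `n_gpu_layers` is what keeps the model in VRAM instead of system RAM.

```python
# Minimal sketch, assuming llama-cpp-python (pip install llama-cpp-python)
# built with GPU support, plus a quantized GGUF model already on disk;
# the model path below is a placeholder, not a real file.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/mistral-7b-instruct.Q4_K_M.gguf",  # hypothetical local file
    n_gpu_layers=-1,  # offload all layers to VRAM; set to 0 to run CPU-only and compare
    n_ctx=4096,       # context window
)

out = llm("Explain in one sentence why VRAM matters for local LLMs.", max_tokens=128)
print(out["choices"][0]["text"])
```

With the layers offloaded, a 7B model at Q4 fits comfortably in 12GB of VRAM and runs at a conversational pace; with `n_gpu_layers=0` the same prompt runs on the CPU and you'll feel the 10x slowdown mentioned above.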

[–] query@lemmy.world 12 points 8 months ago (2 children)

AI PC sounds like something that will be artificially personal more than anything else.

[–] nyakojiru@lemmy.dbzer0.com 11 points 8 months ago* (last edited 8 months ago)

For a long time now they have been making a massive, slow effort to get end users to finally migrate to Linux (and I've been a Windows guy my whole life).

[–] HidingCat@kbin.social 9 points 8 months ago (1 children)

Great, so it'll take AI to set 16GB as the minimum.

I still shudder that there are machines still being sold with 8GB of RAM; that's just barely enough.

[–] douglasg14b@lemmy.world 2 points 8 months ago* (last edited 8 months ago)

It's honestly crazy to think that we used to say the same about 4GB only 5-7 years ago...

And the same about 2GB a measly 10 years ago...

5 years ago I used to think 32GB was great. Now I regularly cap out and start hitting the page file doing my normal day-to-day work on 48GB. It's crazy now.

[–] SomeGuy69@lemmy.world 7 points 8 months ago

A low amount of RAM becomes the AI detox mechanism of this century.

[–] irdc@derp.foo 5 points 8 months ago (1 children)

Ah good. Now I know what specs not to buy.

[–] LemmyIsFantastic@lemmy.world 7 points 8 months ago

You have fun sticking with MS running on your 8GB of RAM, that'll show 'em!

[–] Mango@lemmy.world 4 points 8 months ago

Is that teraflops?
