Intel realized it back then too, but things didn't pan out the way they wanted.
nVidia and AMD were going to merge while ATi was circling the drain. Then Jensen and Hector Ruiz got into their shitfight about who was going to be CEO of the merged AMD/nVidia (it should have been Jensen, Hector Ruiz is an idiot), which eventually scuttled the merger.
AMD, desperately needing a GPU side for their 'future is fusion' plans, bought the ailing ATi at a massive premium.
Intel was waiting for ATi to circle the drain a little more before swooping in and buying them cheap, but AMD beat them to it.
That’s slightly revisionist history. ATI was by no means “circling the drain”; they had a promising new GPU architecture soon to be released, and I remember this because I bought ATI stock about 6 months before the merger.
Intel had strong iGPU performance, a stronger process node, and tons of cash. There's no reason they couldn't have built something from the ground up; they were absolutely dominating the CPU market, and AMD didn't catch up until 2017 or so when they launched the Zen lineup.
Intel sat on their hands raking in cash for 10+ years before actually getting serious, and during that time, Nvidia was wiping the floor with AMD. There's absolutely no reason Intel couldn't have taken over the low-end GPU market with a super strong iGPU and used the same architecture for a mid-range GPU. I bought Intel laptops without a dGPU because the iGPU was good enough for light gaming. I stopped once AMD's APUs caught up (I bought the 3500U), and I don't see a reason why I'd consider Intel for a laptop now.
Intel lost because they sat on their hands. They were late to making an offer on ATI, late to building their own GPUs, and they're still late on anything touching AI. They were dominant for well over a decade, but instead of doing R&D in areas near their core competency (CPUs), they messed around with SSDs and other random stuff.
They needed the IP.
You can't just build a 3D accelerator from scratch; licensing the basic building blocks costs billions of dollars.
The easiest way to get in is to buy your way in.
Yet they were capable of building one over the last decade or so with their Arc GPUs. I'm saying they should have started a decade earlier.