Ask Lemmy
A Fediverse community for open-ended, thought-provoking questions
The correct answer is "whenever you discovered there was an alternative". Windows has always been shit, but as long as you thought there was no alternative, you were used to it; once you started using something different, you grew less tolerant of its problems. It's like someone who's always had a low-end PC and played games on minimum settings at 30 fps: it's "okay", but the moment you play something on maximum at 144 fps, your old normal suddenly feels sluggish and bad (even though nothing about it actually changed).
I think Windows is the same, which is why most people will tell you the last good version of Windows was the one they were using when they migrated over to Linux.
How was Windows XP bad? It did everything I asked of it, it was compatible with all the software I needed, and, in general, it "just worked". I remember trying openSUSE back in the day and being underwhelmed by it. Then I ran Kubuntu for a bit, but even though it had cool software for listening to music and such, I couldn't use it to game. So I went back to Windows, because Linux just didn't have anything for me.
Nowadays, I'd completely agree. Win10 does whatever it wants, whenever it wants, even when it seems mostly tamed. It's not terrible and it "works", but yeah, I'm switching to Arch before Win11 arrives, for real.
Linux has come a long way and Windows has gone down the enshittification route, but it wasn't like this back in the '00s.
XP was the response to Linux. Before that, Windows was a crash fest; remember 98, or Millennium?
Linux was rock stable, so Microsoft had to do something and started to use their server core in the home version of Windows.
They just realized that trying to maintain both the NT and 9x cores was foolish. Trying to put the hardware abstraction layer from Windows 2000 (NT 5.0) into 9x for Millennium Edition was AWFUL. So they scrapped the entire idea of a separate home core, 9x died, and Windows XP (NT 5.1) was born.
But NT was already good. Windows 2000 SP4 was a fantastic OS for its time, as was XP.
Gotta remember that the 9x core versions (95, 98, ME) were (in some ways) practically a separate OS masquerading as Windows.
So you mean "Microsoft developed Windows to be good" like OP said.
I barely remember using Win98; it was the first OS I used when I was very little. But I don't remember it being especially prone to crashing, at least not fatally. Of course, at the time I was just playing around with Paint and shareware games, not doing any serious work, so I wouldn't know how bad it was.
But that still means it isn't as straightforward as "Windows was always bad, Linux was always good".