Yes, Blink is the engine Chromium uses. Since KHTML was released under a copyleft license (the LGPL), anything derived from it has to stay under compatible open source terms, unless it's only linked to as a library. Even in that case though, Blink the engine itself is forced to stay open source even if the browser as a whole isn't. Copyleft licenses like the GPL family are considered infectious because a work that incorporates that code legally has to be distributed under the same terms. So KHTML being unmaintained is irrelevant.
areyouevenreal
Oppenheimer is a mainstream movie though. It's not that geeky.
If I remember correctly it's under a copyleft license, which makes sense given it's ultimately a derivative of KHTML.
Yeah, so I also use CachyOS on a couple of machines, and one of them runs Cachy Browser too.
Don't Firefox and Chromium already have that?
I've seen teachers use this stuff and get actually decent results. I've also seen papers where people use LLMs to hack into a computer, which is a damn sophisticated task. So you are either badly informed or just lying. While LLMs aren't perfect and aren't a replacement for humans, they are still very much useful. To believe otherwise is folly and shows your personal bias.
I am not talking about things like ChatGPT that rely more on raw compute and scaling than some other approaches and are hosted at massive data centers. I actually find their approach wasteful as well. I am talking about some of the open weights models that use a fraction of the resources for similar quality of output. According to some industry experts that will be the way forward anyway as purely making models bigger has limits and is hella expensive.
Another thing to bear in mind is that training a model is more resource intensive than using it, though that's also been worked on.
Bruh, you have no idea about the costs. I doubt you have even tried running AI models on your own hardware. There are literally some models that will run on a decent smartphone. Not every LLM is ChatGPT, enormous in size and resource consumption and hidden behind a veil of closed source technology.
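The arithmetic on why small models fit on a phone is simple: weight memory is just parameter count times bits per weight. A quick sketch (parameter counts here are illustrative, and real runtimes also need KV-cache and overhead on top of this):

```python
# Back-of-envelope memory footprint for quantized model weights.
# Illustrative sizes only; actual apps need extra memory for the
# KV-cache, activations, and runtime overhead.

def weight_size_gb(params: float, bits_per_weight: float) -> float:
    return params * bits_per_weight / 8 / 1e9

# A 3B-parameter model at 4-bit quantization fits in ~1.5 GB:
print(weight_size_gb(3e9, 4))   # 1.5
# The same model in 16-bit floats needs ~6 GB:
print(weight_size_gb(3e9, 16))  # 6.0
```

1.5 GB of weights is well within what a decent modern smartphone can hold in RAM, which is exactly why quantized open-weights models run on-device.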
Also that trick isn't going to work just by looking at a comment. Lemmy compresses whitespace because it uses Markdown; the extra lines only show up when you hit reply.
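For anyone curious why the padding disappears: Markdown treats any run of blank lines as a single paragraph break. Here's a minimal sketch of that collapsing behaviour (not Lemmy's actual renderer, just the rule it follows):

```python
import re

# Minimal sketch of how Markdown renderers collapse whitespace:
# any number of consecutive blank lines is just one paragraph break,
# so padding a comment with empty lines changes nothing once rendered.

def to_paragraphs(source: str) -> list[str]:
    # Split on one-or-more blank lines, then drop empty chunks.
    chunks = re.split(r"\n\s*\n", source)
    return [c.strip() for c in chunks if c.strip()]

print(to_paragraphs("top\n\n\n\n\n\nbottom"))  # ['top', 'bottom']
print(to_paragraphs("top\n\nbottom"))          # ['top', 'bottom'] (same result)
```

Six blank lines or one, the rendered output is identical; the raw text with all the padding only survives in the reply box.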
Can I ask you something? What did Machine Learning do to you? Did a robot kill your wife?
Even if it didn't improve further, there are still uses for the LLMs we have today. And that's only one kind of AI; the kind that makes all the images and videos is completely separate, and that has come a long way too.
From what I heard they do actually put a lot of effort into simulating airplane aerodynamics at least for the smaller planes. So the flying part is kind of important.
I don't think this is strictly true. They do tweak parts of the kernel such as the CPU scheduler to deal with new CPU designs that come out which have special scheduling requirements. That's actually happened quite a bit recently with AMD and Intel both offering CPUs with asymmetric processors with big and little cores, different clock speeds, different cache, sometimes even different instructions on different cores. They also added ReFS not long ago, which may have required some kernel work.
I can understand though if they have few experienced people and way more junior devs. It would probably explain a lot to be honest. A lot of Microsoft stuff is bloated and/or unreliable.
People see AI and immediately think of ChatGPT. This is despite the fact that AI has been around far longer and does way more things including OCR and data mining. It's never been AI that's the problem, but rather certain uses of AI.