I’ve heard of people coercing even my graphics card, an RX 580, into running this stuff. However, I avoid generative AI for ethical reasons and also because Microsoft is trying to shove it down my throat. I really hope that copyright law prevails and that companies will have to be much more careful about what they include in their datasets.
data1701d
On a random note, I once talked about Firefox with a friend, and he texted me a few weeks later about trying it. That was more successful than my Linux spreading, though part of it was that he used a Surface; having gone through that pain on my Surface Go myself, I wasn’t sure I could inflict that suffering on someone else.
Solution 1: Use Synaptic while in the cafe.
Solution 2: If I’m going to get called a hacker (or cracker, if you’re some dude from the FSF), might as well earn it.
I’m trying to remember, but I feel like either the regular Debian installer or the expert installer recently offered a choice between Windows-style (local) time and Unix-style (UTC) time for the hardware clock. Also, though, there is a registry hack for Windows to make it use UTC.
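For reference, the Windows-side hack I’m thinking of is the RealTimeIsUniversal registry value; run from an elevated prompt (the usual caveats about editing the registry apply):

```
reg add "HKLM\SYSTEM\CurrentControlSet\Control\TimeZoneInformation" /v RealTimeIsUniversal /t REG_DWORD /d 1 /f
```

Going the other direction, systemd can make Linux use local time instead with `timedatectl set-local-rtc 1`, though timedatectl itself warns that this mode is not fully supported and can cause DST bugs.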
Luckily, I don’t depend on Adobe stuff, but knowing some professional photographers, you sort of can’t live without Photoshop. I feel like GIMP has severely stagnated: many of the features are there but buried, while non-destructive editing, integral to a modern workflow, seems eons away. (I find this weird, especially considering how good and mostly intuitive a project like Inkscape is; I find a lot of things easier there than in Illustrator.) I kind of want to learn GPU shaders and GPU compute (for reference, I’m mostly a Python guy with up to Calculus II, some Bézier curve know-how, and more math on the way) so I could create a fast open graphics editor as backlash against Adobe’s AI buffoonery, though my project management skills aren’t great at the moment.
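To give an idea of the Bézier know-how involved, here’s a minimal sketch of De Casteljau’s algorithm, the standard way to evaluate a Bézier curve by repeated linear interpolation (function name and control points are my own illustration):

```python
def de_casteljau(points, t):
    """Evaluate a Bezier curve at parameter t (0..1) by repeatedly
    lerping adjacent control points until one point remains."""
    pts = [(float(x), float(y)) for x, y in points]
    while len(pts) > 1:
        pts = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
    return pts[0]

# A symmetric cubic curve: endpoints at (0,0) and (3,0), handles pulling up.
curve = [(0, 0), (1, 2), (2, 2), (3, 0)]
print(de_casteljau(curve, 0.5))  # (1.5, 1.5), the apex of this curve
```

Inkscape paths and font outlines are built from exactly these curves, which is part of why the math carries over to graphics-editor work.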
On streaming, a partial (but admittedly not full) replacement is a sufficiently well-stocked local library. Just get the Blu-ray libraries (libaacs, libbdplus, etc.) set up, throw in a keydb.cfg, and you’ve at least got some stuff. For me, there’s a local library that keeps a good collection of Trek stuff (all the way to Lower Decks season 4 and Prodigy season 1 from 2023), which is almost all I care about.
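On a Debian-ish system, the setup looks roughly like this (package names and the KEYDB.cfg path are what I believe libaacs expects; double-check for your distro, and you have to supply the key database yourself):

```
sudo apt install libaacs0 libbdplus0 libbluray-bdj   # decryption libs + BD-J menu support
mkdir -p ~/.config/aacs
cp /path/to/keydb.cfg ~/.config/aacs/KEYDB.cfg       # libaacs reads keys from here
```

After that, a player like VLC should be able to open at least the discs your keys cover.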
I feel a surge of rage every time I have to touch Windows Update.
I feel that. I also like XFCE. I chose between that and KDE on a proverbial coinflip.
I semi-agree. I did that, switching to Inkscape, Firefox, and LibreOffice in the weeks before I realized I should just make the switch. What actually gave me the experience, though, was running various distros in VirtualBox, which I’d done in various forms since 2017 or so: starting with Ubuntu 16.04, going through each subsequent release up to 20.04, then trying (and ultimately using as my main VM) Debian Buster, Bullseye, and Bookworm (Testing at the time). In the final few weeks of daily-driving Windows, I did some VM distro-hopping with Arch and NixOS before ultimately choosing Debian Bookworm Testing for my first bare-metal install on my main device. (It was originally intended as a test of how I’d do things if I did transition to Linux before it just turned into my main distro; on an unrelated note, I had installed Debian on an old Fujitsu Lifebook before then.) That Testing install has survived to the present day and is currently on Trixie.
It could have changed, but last I checked, AMD cards actually tend to be cheaper than or about the same as Nvidia for the same specs. I’m not a cultish defender of AMD, though; honestly, ROCm support sucks (I’m biased because I’m bitter about Polaris being dropped so quickly).
Your ThinkPad problem sounds more like some sort of power-profile problem than an AMD GPU issue, though it could just be Vega. I have an AMD Cezanne ThinkPad E16 with an AMD iGPU that works very nicely; it’s probably one of the best-working Linux devices I’ve ever owned.
AMD, unless you’re actually running AI/ML applications that need a GPU. AMD is easier than Nvidia on Linux in every way except ML and video encoding (although I’m on a Polaris card that lost ROCm support [which I’m bitter about], and I think AMD cards have added a few video codecs since). In GPU compute, Nvidia holds a near-dictatorship, one I don’t necessarily want to engage with. I haven’t ever used an Intel card, but I’ve heard it seems okay. Anecdotally, graphics support is usable but still improving for gaming. Although its AI ecosystem is immature, I think Intel provides special TensorFlow/PyTorch modules or something, so with a bit of hacking (but likely less than AMD) you might be able to finagle some stuff into running. Where I’ve heard these shine, though, is video encoding/decoding. I’ve heard of people buying even the cheapest card and having a blast in FFmpeg.
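For a taste of what that looks like, here’s the usual FFmpeg VAAPI hardware-encode incantation on Linux (the render-node path is the common default but may differ on your box, and the filenames are placeholders):

```
ffmpeg -vaapi_device /dev/dri/renderD128 -i input.mkv \
       -vf 'format=nv12,hwupload' -c:v h264_vaapi -qp 24 output.mp4
```

The same pattern works for AMD and Intel alike through VAAPI; Intel cards additionally expose their QSV encoders (e.g. `-c:v h264_qsv`) if FFmpeg was built with libvpl/libmfx support.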
Truth be told, I don’t mess with ML a lot, but Google Colab provides free GPU-accelerated Linux instances with Nvidia cards, so you could probably just go AMD or Intel for the best usability and do your AI work in Colab.
I’m just the tiniest bit mad because someone gifted me the 1st edition last Christmas, right before the new one came out. My main hope is that the 2nd edition explains the rules better. I did notice better explanations in the quick-start guide Modiphius has on their website and hope for the same in the book.