WereCat

joined 1 year ago
[–] WereCat@lemmy.world 4 points 1 day ago

Apocalypsis Noctis in my ass

Considering I'm writing this while sitting on the toilet after eating an extra-spicy Korean soup... it weirdly fits

[–] WereCat@lemmy.world 1 points 2 days ago

"The more you buy the more you save" - NVIDIA

Seems like they both went to the same school

[–] WereCat@lemmy.world 1 points 2 days ago

It appears I can in my phone app but not on PC, unless I select "copy link" and paste it into a new tab... Not sure why

[–] WereCat@lemmy.world 3 points 3 days ago (1 children)

Operation Flashpoint had IMO one of the best campaigns

[–] WereCat@lemmy.world 18 points 4 days ago

Artificial Abomination Absolutely Avoid

[–] WereCat@lemmy.world 3 points 4 days ago

Sam is great. Not a fan of the newer games, but I still play the older ones from time to time

[–] WereCat@lemmy.world 14 points 4 days ago

Didn't expect this joke to get such traction but here we are

[–] WereCat@lemmy.world 11 points 5 days ago (1 children)

Indeed, there are plenty of games with way more bugs than this one, but the bugs themselves are not the issue.

Skyrim had and still has many bugs but despite all that it's a good game.

This game will remain bad even if you fix all the bugs.

[–] WereCat@lemmy.world 3 points 1 week ago

But I can only buy whole software...

[–] WereCat@lemmy.world 7 points 1 week ago (1 children)

Most of the time I can't tell the difference, but with orchestral / classical music I can.

Also, most of the time I listen to music while I'm in a factory with a 75dB-80dB noise floor, so it hardly matters how good the headphones and source are.

It's only at home that I can fully enjoy my FLACs with the HD 650... Not that I bother listening to them too often anyway.

I'll take a good 256kbps MP3 master over a bad FLAC master any day, though.

[–] WereCat@lemmy.world 4 points 2 weeks ago (11 children)

No, it's isometric

[–] WereCat@lemmy.world 5 points 2 weeks ago

You dense... You're not very bright, are you?

 

For those who are interested.

DISCLAIMER:

I DON'T KNOW IF THE GAME HAS DRM THAT WILL PREVENT YOU FROM PLAYING ON LINUX OR NOT. I DON'T OWN THE GAME, THE BENCHMARK TOOL IS FREELY AVAILABLE THOUGH AND THAT'S WHAT I'VE TESTED.

  • Fedora 40
  • Ryzen 7 5800X3D
  • RX 6800 XT Sapphire Pulse
  • 4x16GB DDR4 3600MT/s (Quad Rank with manual tune)

I run a minor OC on the GPU: 2600MHz core, no VRAM OC because it's broken on Linux, -50mV on the core, and the power limit set to 312W.
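For anyone curious, an OC like this is applied on Linux through amdgpu's sysfs interface rather than a Wattman-style GUI. A rough sketch under assumptions: the card index and hwmon number vary per system, the `vo` voltage-offset command is RDNA2-specific, and manual overdrive must be enabled via the `amdgpu.ppfeaturemask` kernel boot parameter:

```shell
# Sketch only: paths and indices vary per machine; cat pp_od_clk_voltage
# first to see what your card and kernel actually accept.
GPU=/sys/class/drm/card0/device
echo manual     | sudo tee $GPU/power_dpm_force_performance_level
echo "s 1 2600" | sudo tee $GPU/pp_od_clk_voltage          # max core clock (MHz)
echo "vo -50"   | sudo tee $GPU/pp_od_clk_voltage          # core voltage offset (mV, RDNA2)
echo "c"        | sudo tee $GPU/pp_od_clk_voltage          # commit the changes
echo 312000000  | sudo tee $GPU/hwmon/hwmon*/power1_cap    # power limit (microwatts)
```

These settings reset on reboot, which is also why they need re-applying after updates.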

Results (Motion Blur OFF in all runs):

1440p High - Native

  • AVG = 65 FPS
  • Max = 75 FPS
  • Min = 55 FPS
  • Low 5th = 58 FPS

1440p High - FSR 75% scaling

  • AVG = 87 FPS
  • Max = 106 FPS
  • Min = 73 FPS
  • Low 5th = 78 FPS

1440p High - TSR 75% scaling

  • AVG = 85 FPS
  • Max = 100 FPS
  • Min = 70 FPS
  • Low 5th = 76 FPS

I found TSR more pleasing to my eyes even though it's a bit blurrier, as I find the shimmer of FSR more distracting in motion. In static scenes FSR definitely pulls ahead visually.

The game looks well optimized. You can probably run most settings on Very High if you're targeting just 60FPS with some upscaling (assuming the game performs like the benchmark). The benchmark is also quite GPU-heavy and barely puts any load on the CPU; my 5800X3D was using less than 20W for the entire run. It's possible the actual game is quite a bit more CPU-heavy than that.

You can definitely set Textures to Cinematic quality with barely any performance hit if you have a card with enough VRAM; the textures do look quite nice on Cinematic.

 

I've tried to switch multiple times and always encountered some issue that sent me back to Windows (on my desktop PC).

Last year, after 2 months on Fedora 38 KDE, I'd had enough of the KDE window manager acting weird, broken and unusable VRR on the desktop, and some other smaller but daily issues, so I went back to W11 on my PC.

I prefer GNOME over KDE, but back then there was no VRR support on GNOME, so I had to stick with KDE; now it's a different story.

I still have some minor annoyances which are probably solvable, but I don't know how, as I haven't put enough effort into finding solutions.

Namely:

1.) Sometimes my 2nd monitor remains blank after boot and I have to unplug and replug the DP cable at the graphics card. It typically happens after a kernel update or a restart, but rarely on a cold boot. I've seen others with this issue on Fedora 40, but I haven't seen any solution mentioned.

2.) The Steam UI sometimes hangs for several seconds when navigating through it quickly, especially if it needs to open a different window.

3.) GPU VRAM OC is completely busted; even ±1MHz results in massive artifacting, even on the desktop. Not a big deal, but I'd take the extra 5% boost I get from VRAM OC on Windows :)

4.) After every kernel update I have to run two commands to get my GPU overclock working again. I haven't figured out how to make a script that reads the output of the 1st command and feeds it into the 2nd, so I just do it manually every time, which is roughly once a week.

5.) Free scrolling does not work in Chromium-based browsers :( Luckily Vivaldi has a nice workaround with mouse gestures, but I'd still like free scrolling like on Windows.

And those are about the only annoyances I found worth mentioning.
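Regarding annoyance 4: the "read output of one command and paste it into the other" part is just shell command substitution. A sketch with hypothetical placeholder functions (the actual two commands aren't given, so these only stand in to show the pattern):

```shell
#!/bin/sh
# Sketch only: first_command/second_command are hypothetical placeholders for
# the two real OC commands; substitute your own.
first_command() { echo "card0"; }                # stands in for the command whose output you copy
second_command() { echo "applying OC to $1"; }   # stands in for the command that consumes it

value=$(first_command)    # command substitution captures the 1st command's stdout
second_command "$value"   # ...and passes it straight to the 2nd command
```

A snippet like this, hooked into something that runs after kernel updates, could remove the manual step entirely.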

Gaming works fine.

The apps I use typically work fine on Linux as well. MangoHud is amazing. No issues with audio, unlike last time. Heck, even Discord streams video and audio without issues now, despite me just using the web app. VRR, despite being experimental, works flawlessly on GNOME for me. I'm happy.

8
submitted 3 months ago* (last edited 3 months ago) by WereCat@lemmy.world to c/linux@lemmy.ml
 

SOLUTION:

I was missing this package: sudo dnf install rocm-hip-devel, as per the instructions here: https://fedoraproject.org/wiki/SIGs/HC


Hi, I'm trying to get AMD GPU acceleration working in Blender 4.1, but I can't seem to manage it. From what I've seen it should work fine with ROCm, but I've had no luck.

I'm using Fedora 40 GNOME with Wayland and my GPU is RX 6800 XT.

System is up to date. I've also installed all these packages:

  • sudo dnf install rocminfo
  • sudo dnf install rocm-opencl
  • sudo dnf install rocm-clinfo
  • sudo dnf install rocm-hip

and restarted the system afterwards.

rocminfo gives me this

rocm-clinfo gives me this


 

When they are already finishing each others sentences.

3
submitted 7 months ago* (last edited 7 months ago) by WereCat@lemmy.world to c/linux@lemmy.ml
 

I'm unable to pair my Wacom tablet to my notebook via BT. I think I know the root cause: the BlueZ version is 5.64, and the issue behaves the same as one I had with Fedora on my main PC a few months back when I tried to pair a PS5 DualSense controller.

I was able to fix the issue on Fedora by downgrading BlueZ with "sudo dnf downgrade bluez", but on Pop!_OS this doesn't work, and I also wasn't able to upgrade to one of the newer versions, 5.65 or 5.66 (it says I need to compile them?).

I'm new to Linux, so I'm not sure how I'd go about compiling something to make it work. The 5.65 version seems to have bugfixes for my issue, but I'm perfectly fine with downgrading to anything older as long as it works, if that's simpler to do.
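For reference, building BlueZ from source on an Ubuntu-based distro like Pop!_OS usually follows the standard autotools flow. A sketch only: the tarball URL follows kernel.org's release naming, and the dependency list is an assumption that may be incomplete for a given system, so verify both on bluez.org first:

```shell
# Sketch only: check the current release on bluez.org before running.
sudo apt install build-essential libglib2.0-dev libdbus-1-dev \
                 libudev-dev libical-dev libreadline-dev
wget https://www.kernel.org/pub/linux/bluetooth/bluez-5.66.tar.xz
tar xf bluez-5.66.tar.xz && cd bluez-5.66
./configure --prefix=/usr --sysconfdir=/etc --localstatedir=/var
make
sudo make install                 # overwrites the distro's 5.64 binaries
sudo systemctl restart bluetooth
```

Note that a manually installed BlueZ won't be tracked by apt, so distro updates can later clash with it.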

To describe the issue:

When I put the tablet into pairing mode, it appears in the Pop!_OS system and I can select it to pair. It attempts to pair but then fails.

I've tried to force it to pair with "bluetoothctl trust/pair/connect MacID" commands; it managed to say "connected", but the tablet was still not responding and the LED kept blinking as if it were still in pairing mode. I've also tried the Blueman GTK BT manager, with no luck.

Note: It works perfectly well via USB connection. It also connects instantly to my main Windows PC via BT so the tablet is not at fault.

EDIT:

FIGURED IT OUT

By default the laptop was non-discoverable, and there is no system setting on either Pop!_OS or Fedora to make it discoverable; I had to use Blueman to do it. I only noticed after running "bluetoothctl show" for the umpteenth time and seeing "Discoverable: No" next to one of the MAC IDs. After figuring out that the ID belonged to my laptop, I just had to figure out how to make it discoverable, and it turns out Blueman can do that.
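For anyone else who hits this: the same fix can be done from the terminal with bluetoothctl, no Blueman needed (a sketch; exact output formatting varies between BlueZ versions):

```shell
bluetoothctl show             # controller info; look for "Discoverable: no"
bluetoothctl discoverable on  # make this machine visible to the device
bluetoothctl pairable on      # allow incoming pairing requests
```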

 
  • RED = Thermal Grizzly Carbonaut
  • GREEN = PTM 7950 (no idea if it's original or a fake, but it works; bought from a random seller on AliExpress)
  • before: Arctic MX-4 was hitting the 110C TJ max at 312W (good temps, comparable to Carbonaut, for the first few days after application, then a drastic increase in a short time due to the pump-out issue)
  • stock paste was hitting the 110C TJ max at 255W (the card was used when I got it and already almost 2 years old)

Unfortunately I don't have any data for the MX-4 and stock paste to put in the graph.

Testing was done with Time Spy GPU Test 2 looped 10 times.

 

Can anyone help me? I wasn't able to find any solution for this. The controller works fine via USB-C, but I only have a very short cable; I borrowed the controller from a friend to try it out and don't have the original cable, and I was intending to play via Bluetooth anyway.

Basically, I can find the controller in pairing mode, but when I try to pair it I get an error:

The Setup of Dual Sense wireless Controller has failed.

After that I can see it in available wireless devices, but when I try to connect to it, it immediately disconnects again (checked with bluetoothctl).

Using the on-board Intel AX200 wireless controller.

 

I switched from Windows to Fedora last week and I'm monitoring the stats with Mangohud when playing games. I used to run HWinfo on 2nd monitor when using Windows 11.

I have a 6800 XT. The card maintains higher clocks at lower power most of the time. I've set the same OC as on Windows, with a 2700MHz max clock, and in games on Linux I sit pinned at 2670-2700MHz almost all the time when I don't hit the power limit (312W). On Windows the actual clock barely went over 2600MHz and the card was almost always bouncing off the power limit, resulting in massive clock drops to 2300-2400MHz. On Linux the drops are only about 100-130MHz at most in the same scenarios.

Unfortunately I'd need to install Windows again and do proper testing to compare, but I wonder if anyone else can confirm/deny this for me.

At least at idle I can confirm for a fact that the card uses less power: usually around 30-35W, while on W11 it was more like 40-50W.
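For anyone wanting to sanity-check idle numbers without MangoHud: amdgpu reports board power through hwmon in microwatts. A sketch, assuming the usual power1_average file (the hwmon index varies per system and is an assumption here):

```shell
# Helper: convert amdgpu's microwatt reading on stdin to watts.
uw_to_watts() { awk '{ printf "%.1f W\n", $1 / 1000000 }'; }

# On a real system (hwmon index is an assumption; find yours with
# grep -l amdgpu /sys/class/hwmon/hwmon*/name):
# uw_to_watts < /sys/class/hwmon/hwmon2/power1_average

# Demo with a sample 35 W reading (35000000 microwatts):
echo 35000000 | uw_to_watts
```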

 

But I've spent most of the time tweaking, setting things up, and downloading stuff rather than actually playing. Games seem to work really well. I'm not doing benchmarks, but I really like how stable the framerate is with a frame cap in place. So far everything I've tried has been absolutely buttery smooth.

 

I got a 2nd-hand RX 6800 XT about 3 months ago and immediately had to repaste it, as thermals went well above 100C even at 255W. I used a new tube of MX-4 which I'd kept as a backup; the temps went down significantly for about 2 weeks, but then the hotspot started creeping up until the difference between GPU temp and TJ max was 30C+ at times. With my OC at 300W it was often hitting 105C-109C on TJ max in Time Spy (and even in games the hotspot would sometimes randomly jump over 100C, even when the game was only pulling 200W).

So my theory was thermal-paste pump-out due to thermal cycling, which would explain why the temps climbed so fast after the repaste, but it took me until now to try the Carbonaut pad, which I assumed could fix the issue.

I ran Time Spy GPU Test 2 for 5 loops to get these results for comparison. The GPU was set to 300W and 2600MHz at the stock 1150mV, with the fan speed fixed and the side panel on the case.

I started in the morning, so room temp only climbed while I gathered the results, which means the pad results are slightly better than what I measured.

After replacing the paste with the pad, TJ max did go down by about 6C-9C and I was only hitting about 100C at most, BUT the core temp went up significantly, by almost 20C, from around 78C to 95C.

This was definitely disappointing, as it affected the GPU clocks quite significantly, resulting in around a 250MHz drop.

But because the hotspot went down, I figured there must be insufficient contact or cooler pressure, so I found some rubber washers (or O-rings, or whatever they are) in the garage, took off the retention plate, and installed them. I screwed the plate back on as evenly as possible with just a normal screwdriver and hoped I wouldn't crack the die with too much pressure.

The results are absolutely stellar: almost a 20C drop on the hotspot vs. paste (around a 9C improvement vs. the pad without washers), which puts my Time Spy max at 91C on the hotspot. The GPU temp also went down by more than 20C vs. the pad without washers, and about 5C below the paste, to around 73C in Time Spy.

So all in all I'm quite happy with the results. The washers probably did the most; I think washers + paste would get similar or maybe even better results, but I'm not going to try.

If you decide to go for the pad, I recommend getting one larger than the 32x32mm size I used, as it's only just big enough, with almost no room for error if it shifts during installation.
