bankimu

joined 1 year ago
[–] bankimu@lemm.ee -2 points 1 year ago (1 children)

I mean, yeah. Wtf.

If I install something by compiling it from source, I wouldn't expect anyone else to manage it, right? So why would anyone expect Flatpak, Snap, etc. to all be managed automatically? People even forget how they installed something in the first place; it's ridiculous.

[–] bankimu@lemm.ee -2 points 1 year ago (4 children)

Well I did, but the community on Lemmy is suffocatingly left-leaning for me, so I had to go back to Reddit for fresh air. (Never did I think I'd go back to Reddit because some other community was even further left, but here we are.)

[–] bankimu@lemm.ee 0 points 1 year ago (1 children)

Yeah. But if I ever want or need a Chromium browser, it may be the one.

 

As SSDs get cheaper, I keep finding myself thinking I should buy a bigger one.

But this isn't really a decision based on actual need, and I can tell my mind is playing tricks to convince me: "you need to get it, because what if you want to install a lot of large games?"

I'm thinking a more reasonable criterion might be to look at my current utilization and only buy one when it exceeds some threshold, say ">80% disk utilization => buy a new one" as a rule.
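As a sketch, that rule is simple enough to script (Python; the 80% threshold and the mount point are placeholders to adjust):

```python
import shutil

THRESHOLD = 0.80   # ">80% utilization => buy a new one"; adjust to taste
MOUNT_POINT = "/"  # whichever filesystem the games actually live on

usage = shutil.disk_usage(MOUNT_POINT)
utilization = usage.used / usage.total

print(f"{MOUNT_POINT}: {utilization:.0%} used "
      f"({usage.used / 2**30:.1f} GiB of {usage.total / 2**30:.1f} GiB)")

if utilization > THRESHOLD:
    print("Over the threshold: start shopping for a bigger SSD.")
else:
    print("Under the threshold: no new SSD yet.")
```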

What are your criteria for getting a bigger disk?

 

As I play Diablo 4, I see it take up a ridiculous amount of VRAM, upwards of 20 GB.

The game doesn't seem to manage VRAM very well; I think it simply never frees loaded textures as long as the VRAM size is large.

I use Lutris to run the Battle.net client and Diablo.

Is there any way to limit the VRAM available to the game from Wine, DXVK, or Lutris?
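One thing that might work, if I understand the DXVK config options correctly, is capping how much video memory DXVK reports to the game through a dxvk.conf file; some games base their texture caching on the reported amount. The option name and value below are from memory, so double-check them against the DXVK documentation:

```
# dxvk.conf, placed where the game runs from (or pointed to via the
# DXVK_CONFIG_FILE environment variable, e.g. in Lutris's system options).
# Caps the VRAM DXVK reports to the game, in MiB; 8192 here is arbitrary.
dxgi.maxDeviceMemory = 8192
```

Note this only changes what the game sees, not a hard allocation limit, so it may or may not change Diablo's behaviour.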

 

Are search engines able to index Lemmy?

I still see every search engine linking to Reddit, but nothing from Lemmy shows up in organic search results.

I'm expecting a big push towards Lemmy to happen once Google starts returning Lemmy links.

 

I have an RTX 2080ti.

I still play at 1080p 60 Hz, and the 2080 Ti is plenty for that. But I'm looking to train some ML models, and its 11 GB of VRAM is limiting.
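(For context, the 11 GB is just what the card itself reports; a quick way to check that number, using PyTorch purely as an example framework:)

```python
import torch

# Print what the current GPU reports; purely illustrative.
if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"{props.name}: {props.total_memory / 2**30:.1f} GiB VRAM")
else:
    print("No CUDA device visible")
```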

So I plan to buy a new one. I also don't want an ML-only GPU, because I don't want to maintain two GPUs.

Since I'm upgrading, I need to think about future compatibility. At some point I will move to at least 2K, although I'm still not sold on 4K having any perceivable benefit.

Given all this, I wanted to check with folks who have either card: should I consider the 4090?

 

AirBnB bans a host for knowing the wrong people, without disclosing who the wrong person was, and denies every appeal with cookie-cutter emails ("after careful consideration").
