this post was submitted on 11 Jul 2023
45 points (100.0% liked)
Technology
you are viewing a single comment's thread
What a super weird question. "Cloud computing" is distributed computing. Distributed computing is practically all we have left: Bitcoin/crypto, Kubernetes, BitTorrent, and endless AWS/cloud infra patterns. Then we have our happy little Fediverse here.
I feel the author was trying to ask "is at-home distributed computing dying?" In which case, yes, because mobile took over and you really can't do background compute on phones. Certainly not like how SETI@Home worked.
Aside from personal websites and maybe some Lemmy instances, I can't think of a single application that's NOT using distributed computing. Hell, Lemmy as a concept is still distributed computing even if individual instances aren't necessarily.
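For context on the SETI@Home comparison: the at-home/volunteer model boiled down to a client pulling a work unit from a central coordinator, crunching it on donated CPU time, and sending the result back. A toy sketch of that loop in Go (the endpoint and payload shape are made up for illustration, not the real BOINC/SETI@Home protocol):

```go
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
)

// WorkUnit is a made-up payload shape for illustration only; real
// volunteer-computing work units are far more involved.
type WorkUnit struct {
	ID   string    `json:"id"`
	Data []float64 `json:"data"`
}

func main() {
	// Pull a chunk of work from a hypothetical coordinator.
	resp, err := http.Get("https://example.org/api/work-unit")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var wu WorkUnit
	if err := json.NewDecoder(resp.Body).Decode(&wu); err != nil {
		panic(err)
	}

	// "Crunch" it locally -- this is where the donated CPU time goes.
	var sum float64
	for _, v := range wu.Data {
		sum += v * v
	}

	// Report the result back to the coordinator.
	fmt.Printf("work unit %s -> result %f (would POST back)\n", wu.ID, sum)
}
```

Mobile OSes aggressively killing long-running background work is exactly what makes that loop impractical on phones.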
Lemmy is... not distributed computing.
If each instance is a separate application that must scale on its own, then no distributed computing is occurring.
There is one database, and you can have the instance itself behind a load balancer.
Lemmy is not a distributed program; you can't scale it linearly by adding more nodes. It's severely limited by its access patterns against a single database, and it's not capable of being distributed in its current state. You can put more web servers behind a load balancer, but that's not really "distributed computing", that's just "distributing a workload", which has a lot of limitations that keep it from being truly distributed.
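To make the load-balancer point concrete: "distributing a workload" just means identical web replicas taking turns serving requests while all of them lean on the same single Postgres. A minimal Go sketch of that round-robin pattern (the replica hostnames and port are hypothetical, not anything Lemmy ships):

```go
package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
	"sync/atomic"
)

func main() {
	// Two identical web replicas -- hypothetical hostnames. Both still
	// point at the same single database, which is where the real
	// bottleneck lives.
	backends := []*url.URL{
		mustParse("http://lemmy-web-1:8536"),
		mustParse("http://lemmy-web-2:8536"),
	}

	var next uint64
	proxy := &httputil.ReverseProxy{
		// Round-robin: each request goes to the next replica in line.
		Director: func(r *http.Request) {
			b := backends[atomic.AddUint64(&next, 1)%uint64(len(backends))]
			r.URL.Scheme = b.Scheme
			r.URL.Host = b.Host
		},
	}

	log.Fatal(http.ListenAndServe(":8080", proxy))
}

func mustParse(s string) *url.URL {
	u, err := url.Parse(s)
	if err != nil {
		panic(err)
	}
	return u
}
```

Adding replicas this way adds request-handling capacity, but every write still funnels into the one database.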
Actual distributed applications are incredibly difficult to create at scale, which is why so many faux-distributed applications get made (Lemmy being n-tier on a per-instance basis).
Think of Kafka. Kafka is an actual distributed application.
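What makes Kafka different is that the data itself is partitioned and replicated across brokers, so adding nodes adds real capacity instead of just more front doors to the same database. A rough illustration of keyed partitioning in Go (Kafka's real default partitioner hashes keys with murmur2; FNV-1a here is just a stand-in to keep the sketch dependency-free):

```go
package main

import (
	"fmt"
	"hash/fnv"
)

// partitionFor routes a record key to one of numPartitions partitions.
// Every record with the same key lands on the same partition, which is
// what lets each broker own a slice of the data independently.
func partitionFor(key string, numPartitions int) int {
	h := fnv.New32a()
	h.Write([]byte(key))
	return int(h.Sum32()) % numPartitions
}

func main() {
	// Partitions are spread across brokers (and replicated), so more
	// brokers really does mean more throughput and more storage.
	for _, user := range []string{"alice", "bob", "carol"} {
		fmt.Printf("events for %q -> partition %d\n", user, partitionFor(user, 12))
	}
}
```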