this post was submitted on 05 Feb 2024
113 points (91.9% liked)
There is a lot of fake progress. In computer technology some things have been refined, but the only true technological novelty of the last 20 years was containerization. And maybe AI. The Internet was the previous jump, but it's not really a computer technology, and it affects much, much more than that.
And Moore's law already ended some years ago.
20 years ago, 32-bit systems, CRT monitors, dial-up modems, single-core processors, and HDDs (with most people having 160 GB of storage at most) were common. And laptop battery life and thermal performance were just ridiculous in most cases.
Moore's law is mostly dead for commercial crap, i.e. JS-heavy, third-party-spyware-filled websites with comparably slow and costly backends, and Electron/React Native bloat on desktop/mobile, because a shorter time to market, and thus paying the devs less, is often much cheaper for a lot of companies.
I'd argue free software luckily proves this wrong. There are still a lot of actively maintained, popular programs in C and C++, and a lot of newer ones written in Rust, Dart, or Go.
Moore's law has been dead for a few years now. It's a fact. That doesn't mean performance stopped increasing, but it no longer follows the old law. That's why the industry is shifting to distributed computing.
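For reference, the doubling claim behind Moore's law is easy to sketch numerically (the starting transistor count and the two-year doubling period below are illustrative assumptions, not measurements):

```python
def projected_transistors(base_count, years, doubling_period=2):
    """Project a transistor count assuming it doubles every
    `doubling_period` years, per the classic Moore's-law formulation."""
    return base_count * 2 ** (years / doubling_period)

# Starting from a hypothetical 1-billion-transistor chip, 20 years of
# doubling every 2 years means 2**10 = 1024x growth:
print(round(projected_transistors(1e9, 20) / 1e9))  # → 1024 (billions)
```

That kind of exponential curve is exactly what stopped holding once clock speeds and process shrinks ran into physical limits.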
Clock speeds and some other areas have stagnated, I'd agree, but graphics cards, wireless communication standards, cheap fast SSDs, and power-efficient CPUs have massively impacted end-user performance in the last 10 years. RISC-V is also a major development that is just getting started.
None of those are major breakthroughs. They're just more computing power. It's still the same technology.
Today, LLMs are the prime candidate for a breakthrough. They still have to prove themselves, though, to show that they're not just a fancy, expensive, useless toy like the blockchain.
RISC-V is not meant to be a breakthrough. It's an evolution.
The Internet was a breakthrough. The invention of the mouse was a breakthrough.
Increases in power or disk space, new languages or operating systems: none of those are breakthroughs. None of them changed how computer programs are made or used.
The smartphone is a significant thing. Wi-Fi is not really important, though, because you can't do anything more with Wi-Fi than you can with Ethernet. But the smartphone and its network, that is a big thing.
Sure, not a breakthrough, but they are "real" progress, not fake progress (which is what I was responding to in your earlier comment).