Sure, but also no.
Moore’s law is, at the most fundamental level, an observation about the exponential curve of technological progress.
It was originally about semiconductor transistors, and that is what Moore was specifically looking at, but the observed pattern does 100% apply to other things.
In modern linguistics, the way language is used and perceived determines its meaning, not its origins.
So we should start calling monitors computers, desktop towers modems (or CPUs, or hard drives), wifi the internet, browsers search engines, and search engines browsers. None of this is incorrect, according to the average person.
No. Let me reiterate:
Moore's Law was an observation that semiconductor transistor density doubles roughly every two years.
It is not about technological progress in general. That's just how the term gets incorrectly applied by a small subset of people online who want to sound like they're being technical.
Moore's Law is what I described above. It is not "technology gets better".
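To put numbers on that observation, here's a minimal sketch (the starting density and the exact two-year period are illustrative assumptions, not real figures):

```python
# Moore's observation as a formula: density(t) = density(0) * 2^(t / doubling_period)
def transistor_density(years_elapsed, initial_density=1.0, doubling_period=2.0):
    """Projected relative density after `years_elapsed` years."""
    return initial_density * 2 ** (years_elapsed / doubling_period)

# 2 years -> 2x, 10 years -> 2^5 = 32x, 20 years -> 2^10 = 1024x
for years in (2, 10, 20):
    print(f"{years:>2} years: {transistor_density(years):.0f}x the starting density")
```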
I meant that sentence quite literally: semiconductors are technology. My perspective is that the original “Moore’s law” is only a single example of what many people will understand when they hear the term in a modern context.
At some point we’re debating semantics, and semantics are subjective, local, and sometimes cultural. Preferably I avoid spending energy fighting about such things.
Instead I’ll provide my own line of thinking toward what is, for me, a valid use of the term outside semiconductors. I am open to suggestions if there is better language.
From my own understanding, I observe a pattern where technology (mostly digital technology, though this could be exposure bias) improves at an increasingly fast rate. The mathematical term is exponential.
To me, seeing such a pattern is vital to understanding what’s going on. Humans are not designed to extrapolate exponential curves. A good example is AI, which largely still sucks today, but the historical numbers don’t lie about the potential.
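To illustrate the extrapolation point, here's a rough sketch with a made-up growth rate (not real data): a naive straight-line guess fitted to the early points wildly underestimates where the curve ends up.

```python
# Hypothetical growth factor per step -- chosen for illustration only.
growth_per_step = 1.5
exponential = [growth_per_step ** t for t in range(11)]

# Naive linear extrapolation: extend the slope seen between steps 0 and 2.
slope = (exponential[2] - exponential[0]) / 2
linear_guess = [exponential[0] + slope * t for t in range(11)]

print(f"step 10: exponential = {exponential[10]:.1f}, "
      f"linear extrapolation = {linear_guess[10]:.1f}")
# exponential ~57.7 vs linear 7.25: the straight-line guess is off by ~8x
```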
I have a rather convoluted way of speaking; it’s very impractical.
Language, at best, should just get the message across in an effective manner.
I invoke (reference) Moore’s law to refer to the observation of exponential progress. Usually this gets my point across very effectively (not that such a thing comes up often in my everyday life).
To me, Moore’s law in semiconductors is the first and original example of the pattern. The fact that this interpretation is subjective has never been relevant to getting my point across.
This is technically correct but misleading in this context, given that it falsely implies that the original meaning (doubling transistor density every 2y) became obsolete. It did not. Please take context into account. Please.
Furthermore, you're missing the point. The other comment is not just picking on words; it's highlighting that people invoke "it's Moore's Law" to babble inane predictions about the future. That's doubly true when people assume (i.e. make shit up) that "doubling every 2y" applies to other things, and/or that it's predictive in nature instead of just observational. Cue the OP.
(this is a lil' lemmy thread and I think everyone understands what OP had in mind)
Sure, if you retroactively look for patterns where it matches something, but that isn't a very useful exercise.