this post was submitted on 01 Oct 2023
1111 points (97.5% liked)

Technology

[–] j4k3@lemmy.world 19 points 1 year ago (2 children)

Organic technology is hard. If you can figure out how to grow a compute system, you will take human technology hundreds of years into the future. Silicon tech is the stone age of compute.

The brain has a slow clock rate to keep within its power limitations, but it is a parallel computational beast compared to current models.
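A rough back-of-envelope comparison makes the parallelism point concrete. All of the figures below are order-of-magnitude estimates pulled from commonly cited ranges, not measurements, and the GPU numbers are a generic ballpark for a modern accelerator rather than any specific card:

```python
# Back-of-envelope: brain vs. GPU event throughput and efficiency.
# Every constant here is a rough order-of-magnitude estimate.

SYNAPSES = 1e14          # estimated synapse count in a human brain
AVG_FIRING_HZ = 1.0      # average firing rate (estimates range ~0.1-10 Hz)
BRAIN_WATTS = 20.0       # commonly cited brain power budget

GPU_FLOPS = 1e14         # ~100 TFLOPS, ballpark for a modern accelerator
GPU_WATTS = 400.0        # typical high-end accelerator board power

brain_events_per_s = SYNAPSES * AVG_FIRING_HZ
brain_events_per_joule = brain_events_per_s / BRAIN_WATTS
gpu_ops_per_joule = GPU_FLOPS / GPU_WATTS

print(f"Brain: ~{brain_events_per_s:.1e} synaptic events/s, "
      f"~{brain_events_per_joule:.1e} events/J")
print(f"GPU:   ~{GPU_FLOPS:.1e} ops/s, ~{gpu_ops_per_joule:.1e} ops/J")
```

Even with a "clock rate" of around 1 Hz per neuron, the sheer number of synapses puts the brain's raw event throughput in the same ballpark as a GPU's FLOPS, at roughly an order of magnitude better energy efficiency per event under these assumptions.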

It takes around ten years for new hardware to really take shape in our current age. AI hasn't really established what direction it is going in yet. The open source offline model is the likely winner, meaning the hardware design and scaling factors are still unknown. We probably won't see a good solution for years. We are using video hardware as a stopgap until AI-specific hardware is readily available.

[–] Astroturfed@lemmy.world 2 points 1 year ago

I bet it'd be a whole lot easier to grow an organic computer if you didn't have to worry about pesky things like people thinking you grew genetically engineered slaves.

[–] ricdeh@lemmy.world 2 points 1 year ago

I am so excited for the advances that neuromorphic processors will bring. It's not exactly my field, but adjacent to it. The concept of modelling chips after the human brain instead of traditional computing architectures sounds extremely promising, and I would love to get to work on systems like Intel's Loihi or IBM's TrueNorth! If you think about it, it's a bit ridiculous how corporations like Nvidia are currently approaching AI with graphics processors. I mean, it makes more sense than general-purpose CPUs, but it is at the very least a suboptimal solution.
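The basic unit those neuromorphic chips implement in hardware is a spiking neuron rather than a multiply-accumulate pipeline. Here is a minimal leaky integrate-and-fire (LIF) sketch of that computational model; the `leak` and `threshold` values are purely illustrative and not taken from Loihi or TrueNorth:

```python
# Minimal leaky integrate-and-fire (LIF) neuron, the style of unit
# that neuromorphic chips realize in silicon. Parameter values are
# illustrative only.

def lif_step(v, input_current, leak=0.9, threshold=1.0):
    """Advance one timestep; return (new_voltage, spiked)."""
    v = v * leak + input_current   # membrane leaks toward 0, then integrates input
    if v >= threshold:
        return 0.0, True           # fire a spike and reset the membrane
    return v, False

# Drive the neuron with a constant input and record when it spikes.
v, spikes = 0.0, []
for t in range(20):
    v, fired = lif_step(v, input_current=0.3)
    if fired:
        spikes.append(t)

print("spike times:", spikes)  # → spike times: [3, 7, 11, 15, 19]
```

The point of the model is that computation is event-driven: a neuron only emits (and downstream neurons only process) a spike occasionally, which is where the power savings over always-on dense matrix hardware come from.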