[–] serialandmilk@lemmy.ml 7 points 11 months ago (2 children)

Many of the building blocks of computing come from complex abstractions built on top of less complex abstractions, built on top of even simpler concepts in algebra and arithmetic. If Q* can pass middle school math, then building further abstractions on top of that could be a big leap.
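To make that stacking concrete, here's a toy sketch (my own illustration, nothing to do with Q* internals): each operation is defined purely in terms of the operation one layer down, so exponentiation ultimately reduces to nothing but incrementing.

```python
# Toy illustration: each layer is built only from the layer below it.
def increment(n: int) -> int:
    return n + 1

def add(a: int, b: int) -> int:
    # Addition as repeated incrementing.
    for _ in range(b):
        a = increment(a)
    return a

def multiply(a: int, b: int) -> int:
    # Multiplication as repeated addition.
    total = 0
    for _ in range(b):
        total = add(total, a)
    return total

def power(a: int, b: int) -> int:
    # Exponentiation as repeated multiplication.
    result = 1
    for _ in range(b):
        result = multiply(result, a)
    return result

print(power(2, 10))  # 1024, computed using nothing but increments
```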

Huge computing resources only seem ridiculous, unsustainable, and abstract until they aren't anymore. Like typing messages on bendable glass screens for other people to read...

[–] Aceticon@lemmy.world 3 points 11 months ago (1 children)

The thing is, in general computing it was humans who figured out how to build the support for complex abstractions up from support for the simplest concepts. For this to be AGI, it would have to not just support the simple concepts but actually figure out and build support for complex abstractions by itself.

Training a neural network to do a simple task such as addition isn't all that hard (my impression is that the "breakthrough" here is that they got an LLM, which is a very specific kind of NN built for language, to do it). Getting it to build support for complex abstractions out of support for simpler concepts by itself is something else altogether.
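For a sense of how simple that kind of task is, here's a minimal sketch (assuming PyTorch; the network shape and hyperparameters are just illustrative) of training a tiny feed-forward network to approximate addition:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Random pairs (a, b) in [0, 10) with target a + b.
x = torch.rand(1024, 2) * 10
y = x.sum(dim=1, keepdim=True)

# A small two-layer network: 2 inputs -> 16 hidden units -> 1 output.
model = nn.Sequential(nn.Linear(2, 16), nn.ReLU(), nn.Linear(16, 1))
opt = torch.optim.Adam(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

for step in range(2000):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()

print(model(torch.tensor([[3.0, 4.0]])).item())  # ~7.0 after training
```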

[–] ChrisLicht@lemm.ee 1 points 11 months ago

I know jack shit, but actual mastery of first principles would seem a massive leap in LLM development. A shift from talented bullshitter to deductive extrapolator does sound worthy of notice/concern.

[–] SkyeStarfall@lemmy.blahaj.zone 3 points 11 months ago

With middle school math you can fairly straightforwardly work your way up to linear algebra. Calculus requires a bit of a leap, but that still leaves a lot of the math world available.
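As an illustration of that point (my own sketch, not from the comment): matrix multiplication, a core linear-algebra operation, reduces to nothing but middle-school multiplication and addition.

```python
# Matrix multiplication written with only middle-school arithmetic:
# every entry of the result is a sum of pairwise products.
def matmul(A, B):
    rows, inner, cols = len(A), len(B), len(B[0])
    C = [[0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            for k in range(inner):
                C[i][j] += A[i][k] * B[k][j]  # multiply, then add
    return C

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matmul(A, B))  # [[19, 22], [43, 50]]
```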