this post was submitted on 25 Feb 2024
184 points (83.8% liked)


Don’t learn to code: Nvidia’s founder Jensen Huang advises a different career path

"Don't learn to code," advises Jensen Huang of Nvidia. Thanks to AI, everybody will soon become a capable programmer simply by using human language.

top 50 comments
[–] Sibbo@sopuli.xyz 237 points 9 months ago (16 children)

The founder of a company that makes most of its revenue selling GPUs for machine learning says machine learning is good.

[–] muntedcrocodile@lemmy.world 67 points 9 months ago (2 children)

I worry for the future generations that can't debug because they don't know how to program and just use AI.

[–] NocturnalEngineer@lemmy.world 18 points 9 months ago

Don't worry, they'll have AI animated stick figures telling them what to do instead...

[–] ThePowerOfGeek@lemmy.world 59 points 9 months ago (3 children)

Having used ChatGPT to try to find solutions to software development challenges, I don't think programmers will be at that much risk from AI for at least a decade.

Generative AI is great at many things, including assistance with basic software development tasks (like spinning up blueprints for unit tests). And it can be helpful for filling in code gaps when provided with a very specific prompt... sometimes. But it is not great at figuring out the nuances of even mildly complex business logic.
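To make that concrete, this is the kind of unit-test blueprint it's good at spinning up. A sketch only; `apply_discount` is a function I made up for illustration:

```python
import unittest

# Hypothetical function under test, made up purely for illustration.
def apply_discount(price: float, percent: float) -> float:
    """Return price reduced by percent (0-100)."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class ApplyDiscountTests(unittest.TestCase):
    # The boilerplate cases an LLM will happily generate:
    def test_no_discount(self):
        self.assertEqual(apply_discount(100.0, 0), 100.0)

    def test_half_off(self):
        self.assertEqual(apply_discount(100.0, 50), 50.0)

    def test_invalid_percent_raises(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

    # The business-logic nuance (do discounts stack? does 100% off need
    # manager approval?) is exactly what it won't figure out for you.

if __name__ == "__main__":
    unittest.main()
```

The happy-path cases cost nothing to generate; deciding whether they're the *right* cases is still on you.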

[–] DacoTaco@lemmy.world 23 points 9 months ago* (last edited 9 months ago) (1 children)

This.
I got a GitHub Copilot subscription at work and it's useful for suggesting code in small parts, but I would never let it decide which design pattern to use to tackle the problem we are solving. Once I know the solution I can use AI, then verify its output before it goes into the code.

[–] Schal330@lemmy.world 10 points 9 months ago (2 children)

I'm a junior dev who has been on the job for ~6 months. I found AI useful for learning when I had to build an application in Swift with zero experience in the language. It presented me with some turd responses, but from those it gave me an idea of what to try and what to look into to find answers.

I find that sometimes AI can present a concept to me in a way I can understand where blogs fail. I'm not worried about AI right now; it's a tool to make our jobs easier!

[–] fidodo@lemmy.world 56 points 9 months ago (2 children)

As a developer building on top of LLMs, my advice is to learn programming architecture. There's a shit ton of work that needs to be done to get this unpredictable, non-deterministic tech to work safely and accurately. This is like saying to get out of tech right before the Internet boom. The hardest part of programming isn't writing low-level functions; it's architecting complex systems while keeping them robust, maintainable, and expandable. By the time an AI can do that, all office jobs are obsolete. AIs will be able to replace CEOs before they can replace system architects. Programmers won't go away, they'll just have less busywork to do and will instead need to work at a higher level, but the complexity of those higher-level requirements is about to explode, and we will need LLMs to do the simpler tasks under our oversight to make sure they get integrated correctly.
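To illustrate the kind of guardrail work I mean, here's a minimal validate-and-retry sketch; `chat()` is a hypothetical stand-in for a model call, not any particular vendor's API:

```python
import json

# Hypothetical stand-in for a real LLM call; replace with your provider's client.
def chat(prompt: str) -> str:
    return '{"sentiment": "positive", "confidence": 0.93}'

def classify_sentiment(text: str, max_retries: int = 3) -> dict:
    """Keep asking until the non-deterministic output passes validation."""
    prompt = (
        'Classify the sentiment of this text. Reply with JSON like '
        '{"sentiment": "positive|negative|neutral", "confidence": 0.0-1.0}:\n'
        + text
    )
    for _ in range(max_retries):
        raw = chat(prompt)
        try:
            result = json.loads(raw)
            ok_label = result.get("sentiment") in {"positive", "negative", "neutral"}
            ok_score = 0.0 <= float(result.get("confidence", -1)) <= 1.0
            if ok_label and ok_score:
                return result  # validated; safe to hand downstream
        except (json.JSONDecodeError, TypeError, ValueError):
            pass  # malformed output; fall through and retry
    raise RuntimeError("model never produced valid output")

print(classify_sentiment("I love this product"))
```

None of that validation plumbing writes itself, and every LLM feature needs some version of it.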

I also recommend still learning the fundamentals, just maybe not as deeply as you used to need to. Knowing how things work under the hood still helps immensely with debugging and with creating better, more efficient architectures, even at a high level.

I will say, I do know developers who specialized in algorithms and are feeling pretty lost right now. But they're perfectly capable of adapting their skills to the new paradigm; their issue is more the personal one of deciding what they want to do, since algorithms were what they were passionate about.

[–] Luisp@lemmy.dbzer0.com 48 points 9 months ago

Lmao, do the opposite of whatever this guy says; he just doesn't want his two-trillion-dollar stock market bubble to burst.

[–] eager_eagle@lemmy.world 47 points 9 months ago* (last edited 9 months ago) (2 children)

the day programming is fully automated, other jobs will be too.

maybe it'd make more sense if he suggested becoming a blue-collar worker instead.

[–] Ghostalmedia@lemmy.world 20 points 9 months ago (4 children)

Humans can probably still look forward to back-breaking careers of manual labor that consist of complex, varied movements!

[–] filister@lemmy.world 45 points 9 months ago* (last edited 9 months ago) (5 children)

Remember when everyone was predicting that we were a couple of years away from fully self-driving cars? We are now a full decade past those "couple of years", and I don't see any fully self-driving cars on the road taking over from human drivers.

We are now in the honeymoon phase of AI, and I can only assume there will be a huge downward correction in AI stocks that are overvalued and overhyped, like Nvidia. They are like crypto stocks: on the moon today, back to Earth tomorrow.

[–] SlopppyEngineer@lemmy.world 18 points 9 months ago* (last edited 9 months ago)

Two decades. The DARPA Grand Challenge was in 2004.

Yeah, everybody always forgets the hype cycle and the peak of inflated expectations.

[–] paf0@lemmy.world 10 points 9 months ago (1 children)

Waymo exists and is now moving passengers around in three major cities. It's not taking over yet, but it's here and growing. The timeframe didn't meet the hype, but the technology is there.

[–] filister@lemmy.world 16 points 9 months ago* (last edited 9 months ago)

Yes, the technology is there, but it is not Level 5; it is Level 3.5-4 at best.

The point with a fully self-driving car is that complexity increases exponentially once you reach 98-99%. The last 1-2% is extremely difficult to crack, because there are so many corner cases and situations you can't really predict, and you need a car that drives more safely than humans if you really want to commercialize the service.

Same with generative AI: the leap at first was huge, but the difference between GPT-3.5 and 4, or even 3 and 4, wasn't as great. I can only assume that from now on progress will get exponentially harder, advances will be a lot more modest, and they will require different, yet-unknown algorithms and models.

And I don't know about you, but ChatGPT isn't 100% correct, especially on niche questions or more complex queries; it often hallucinates, and sometimes those hallucinations sound extremely plausible.

[–] Wooki@lemmy.world 45 points 9 months ago* (last edited 9 months ago) (3 children)

This overglorified snake oil salesman is scared.

Anyone who understands how these models work can see plain as day that we have reached peak LLM. It's enshittifying on itself, and we are seeing its decline in real time in the quality of generated content. Don't believe me? Go follow some senior engineers.

[–] Michal@programming.dev 19 points 9 months ago (2 children)

Any recommendations on whom to follow? On Mastodon?

[–] thirteene@lemmy.world 11 points 9 months ago (5 children)

There is a reason they didn't offer specific examples. LLMs can still scale through size, logical optimization, training optimization, and, more importantly, integration. The current implementations are reaching their limits, but growth is also happening very quickly. AI reduces workload, but it is likely going to require designers and validators for a long time.

[–] kescusay@lemmy.world 43 points 9 months ago (1 children)

Well. That's stupid.

Large language models are amazingly useful coding tools. They help developers write code more quickly.

They are nowhere near being able to actually replace developers. They can't tell when their code doesn't make sense (which is often). They can't tell where to integrate new code into an existing application. They can't debug themselves.

Try to replace developers with an MBA using a large language model AI, and once the MBA fails, you'll be hiring developers again - if your business still exists.

Every few years, something comes along that makes bean counters who are desperate to cut costs, and scammers who are desperate for a few bucks, declare that programming is over. Code will self-write! No-code editors will replace developers! LLMs can do it all!

No. No, they can't. They're just another tool in the developer toolbox.

[–] paf0@lemmy.world 11 points 9 months ago (8 children)

I've been a developer for over 20 years, and when I see AutoGen generate code, decide to execute that code, and then fix errors by deciding to install dependencies, I can tell you I'm concerned. LLMs are a tool, but a tool that might evolve to replace us. I expect a lot of software roles in ten years to look more like an MBA who orchestrates AI agents to complete a task. Coding skills will still matter, but not as much as soft skills will.
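For a sense of what spooked me, here's roughly the loop these agents automate. An illustrative sketch only, not AutoGen's actual API; `generate_code()` is a hypothetical stand-in for the model call:

```python
import subprocess
import sys
import tempfile

# Hypothetical stand-in for an LLM call; a real agent would query a model here.
def generate_code(prompt: str) -> str:
    return 'import requests\nprint(len(requests.get("https://example.com").text))'

def generate_run_repair(prompt: str, max_attempts: int = 3) -> str:
    """Generate code, execute it, and react to failures: roughly the loop
    that agent frameworks automate."""
    code = generate_code(prompt)
    for _ in range(max_attempts):
        with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
            f.write(code)
            script = f.name
        result = subprocess.run([sys.executable, script],
                                capture_output=True, text=True)
        if result.returncode == 0:
            return result.stdout
        if "ModuleNotFoundError" in result.stderr:
            # The "decides to install dependencies" step.
            missing = result.stderr.rsplit("'", 2)[-2]
            subprocess.run([sys.executable, "-m", "pip", "install", missing])
        else:
            # Feed the traceback back to the model and ask for a fix.
            code = generate_code(prompt + "\n\nThis failed with:\n" + result.stderr)
    raise RuntimeError("gave up after repeated failures")
```

Once the loop closes like that, "writing the code" stops being the human's job; deciding what to ask for and checking the result becomes the job.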

[–] madcaesar@lemmy.world 34 points 9 months ago

This seems as wise as Bill Gates claiming, back in '98, that 4 MB of RAM is all you'll ever need 🙄

[–] gornius@lemmy.world 31 points 9 months ago (3 children)

It's just as crazy as saying "We don't need math, because every problem can be described using human language".

In other words, that might be true, but only for as long as your problem is simple enough to be fully described in human language.

You want to solve a real problem? It has way too many moving parts to just throw an LLM at it, because solving it takes an actual understanding of the problem.

[–] trolololol@lemmy.world 10 points 9 months ago (1 children)

Ha.

If you ever write code for a living, the first thing you notice is that people can't explain what they need using natural language (which is what English, Mandarin, etc. are), even when they don't need to get into details.

[–] rottingleaf@lemmy.zip 30 points 9 months ago (2 children)

I think this is bullshit as far as LLMs go, but making generative tools ever more high-level and understandable for users is a good thing.

Like the various visual programming tools where you sketch something working out of connected blocks (like Pure Data for sound), or MATLAB, where I think you can use such constructors to generate code for the specific controllers involved in a scheme, or LabVIEW.

Or like HyperCard.

Not that anybody should stop learning anything. There's a niche for every way of doing things.

I just like that class of programs.

[–] AMDIsOurLord@lemmy.ml 29 points 9 months ago (1 children)

Jensen fucking Huang is a piece of shit and chock-full of it too.

Actually, AI can replace this dick at a fraction of the cost instead of replacing developers. Bring out the guillotine, mfs.

[–] gaifux@lemmy.world 18 points 9 months ago (1 children)

Your vulgarity and call to violence are quite convincing, sir. Mayhaps you moonlight as a bard?

[–] Blackmist@feddit.uk 29 points 9 months ago (2 children)

I don't think he's seen the absolute fucking drivel that most developers get handed as software specs.

Most people don't even know what they want, let alone how to describe it. I've often been given a mountain of stuff, only to go back and forth with the customer to figure out what problem they're actually trying to solve, and then solve it in like 3 lines of code, in a way that doesn't break everything else or tie a maintenance albatross around my neck for the next ten years.

[–] howrar@lemmy.ca 29 points 9 months ago (5 children)

I don't see how it would be possible to completely replace programmers. The reason we have programming languages instead of natural language is that the latter is full of ambiguities. If you start having to describe your software's behaviour in natural language, then one of three things can happen (a toy example of the ambiguity follows the list):

  1. either this new natural programming language has to make assumptions about what you intend, and thus will only be capable of outputting a certain class of software (i.e. you can't actually create anything new),
  2. or you need to learn a new way of describing things unambiguously, and now you're back to programming but with a new language,
  3. or you spend forever going back and forth with the generator until it gives you the output you want, and this would take a lot longer to do than just having an experienced programmer write it.
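To make the ambiguity point concrete, here's a toy sketch; `split_bill` is made up for illustration:

```python
# "Split the bill evenly" sounds unambiguous in English, but the code has
# to decide what happens to the leftover cent; the English never said.
def split_bill(total_cents: int, people: int) -> list[int]:
    base, remainder = divmod(total_cents, people)
    # One programmer's choice: the first `remainder` people pay an extra
    # cent. An equally "even" reading: the last person absorbs it.
    return [base + 1 if i < remainder else base for i in range(people)]

print(split_bill(10_000, 3))  # [3334, 3333, 3333]
```

Even a spec this simple forces a decision the natural-language version never made.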
[–] ReplicaFox@lemmy.world 14 points 9 months ago

And if you don't know how to code, how do you even know if it gave you the output you want until it fails in production?

[–] Evotech@lemmy.world 25 points 9 months ago (1 children)
[–] SuckMyWang@lemmy.world 16 points 9 months ago

Why would he lie? Other than to pump the company's shares.

[–] JeeBaiChow@lemmy.world 22 points 9 months ago

I mean, why have a CS degree when an AI subscription costs $30/month?

/s

[–] realharo@lemm.ee 20 points 9 months ago

I can kind of see his point, but the fields he suggests instead (biology, chemistry, finance) don't make sense, for several reasons.

Besides the obvious "why couldn't AI just replace those people too" (even if that takes an extra few years), there is also the question of how many people can actually develop deep enough expertise to make meaningful contributions there, if we're talking about a massive increase in the number of people going into those fields.

[–] zoltraak@lemm.ee 15 points 9 months ago

After using Copilot and other AI code tools, it's easy to see their limitations; programming is a lot more than just writing "OK" code.

[–] swayevenly@lemm.ee 14 points 9 months ago (1 children)

I think the Jensen quote loosely implies we don't need to learn a programming language, but the logic was flimsy. Same goes for the author, who backtracks a few times. Not a great article, in my opinion.

[–] DudeDudenson@lemmings.world 24 points 9 months ago* (last edited 9 months ago)

Jensen's just trying to ride the AI bubble as far as it'll go. Next he'll tell you to forget about driving or studying.

[–] RagingSnarkasm@lemmy.world 13 points 9 months ago (1 children)

There’s good money to be made in selling leather jackets.

[–] 3volver@lemmy.world 12 points 9 months ago

Don't tell me what to do. Going to spend more time learning to code from now on, thanks.

[–] OleoSaccharum@lemm.ee 11 points 9 months ago (16 children)

Nvidia is such a stupid fucking company. It's just slapping different designs onto TSMC chips. All our "chip companies" are like this. In the long run they are all going to get smoked. I won't tell you by whom. You shouldn't need a reminder.

[–] Modern_medicine_isnt@lemmy.world 10 points 9 months ago (1 children)

It's not really about the coding; it's about the process of solving the problem. And AI is very far from being able to do that. The language you learn to code in is probably not the one you'll use for much of your life; it will just be replaced by whichever AI you use to code.

[–] Dkarma@lemmy.world 10 points 9 months ago

Yep. The best guy on my team isn't the best coder. He's the best at visualizing the complete solution and seeing pinch points in his head.

[–] JeeBaiChow@lemmy.world 10 points 9 months ago* (last edited 9 months ago) (1 children)

So TIL 'prompt engineer' is now a thing. We're doomed, aren't we?
