this post was submitted on 07 Jul 2023
43 points (100.0% liked)

Programming


All things programming and coding related. Subcommunity of Technology.


This community's icon was made by Aaron Schneider, under the CC-BY-NC-SA 4.0 license.


I’m a dev. I’ve been one for a while. My boss does a lot of technology watch. He brings in a lot of cool ideas and information. He’s down to earth. Cool guy. I like him, but he’s now convinced that AI LLMs are about to swallow the world, and the pressure to inject this stuff everywhere in our org is driving me nuts.

I enjoy every part of making software, from discussing with the clients and the future users to coding to deployment. I am NOT excited at the prospect of transitioning from designing an architecture and coding it to ChatGPT prompting. This sort of black box magic irks me to no end. Nobody understands it! I don’t want to read yet another article about how an AI enthusiast is baffled at how good an LLM is at coding. Why are they baffled? They have "AI" twelve times in their bio! If they don’t understand it, who does?!

I’ve based twenty years of my career on being attentive, inquisitive, creative and thorough. By now, an in-depth understanding of my tools and, more importantly, of my work is basically an urge.

Maybe I’m just feeling threatened, or turning into "old man yells at cloud". If you ask me I’m mostly worried about my field becoming uninteresting. Anyways, that was the rant. TGIF, tomorrow I touch grass.

[–] MagicShel@programming.dev 16 points 1 year ago (4 children)

Having an AI help you code is like having a junior developer who is blazing fast, enthusiastic, and listens well. However, it doesn’t think about what it writes. It does no testing and it doesn’t understand the big picture at all. For very simple tasks, it gets the job done very fast, but for complex tasks, no matter how many times you explain it, it is never going to get it. I don’t think there’s any worry about AI replacing developers any time in the foreseeable future.

[–] shadowolf@lemmy.ca 3 points 1 year ago (2 children)

In fairness, this is more a limitation of the current technology. You're looking at GPT-4 and going "not an expert." But what about GPT-5 or 6? Or some of the newer ideas, like Microsoft's plan for a 1 million token model using a dilated attention mechanism. The point being, we are still on the ground floor. And these models have emergent functionality.

[–] MagicShel@programming.dev 4 points 1 year ago

That’s out of the scope of the foreseeable future though. It’s speculating on the capabilities of things that are completely theoretical at this point. It’ll be interesting to see what happens. I’m not holding my breath.

[–] hallettj@beehaw.org 2 points 1 year ago

Lol this is what I was thinking too. The junior dev is also a black box. AI automation seems more like delegating than programming to me.

[–] mkhoury@lemmy.ca 2 points 1 year ago

But you can work with it to write all the tests/acceptance criteria and then have the AI run the code against the tests. We spent a lot of time developing processes for humans writing code, we need to continue integrating the machines into these processes. It might not do 100% of the work you're currently doing, but it could do maybe 50% reliably. That's still pretty disruptive!

[–] Naate@beehaw.org 1 points 1 year ago

This is a pretty apt analogy, I think.

We've been using copilot at work, and it's really surprised me with some slick suggestions that "mostly work". But I don't think it could have written anything beyond the boilerplate my team has done.

(I also spend way too much time watching Copilot and Intellisense fight, and it pisses me off to no end.)