this post was submitted on 05 Feb 2024
197 points (84.1% liked)

Asklemmy

A loosely moderated place to ask open-ended questions


Ok, let's give a little bit of context. I will turn 40 in a couple of months and I have been a C++ software developer for more than 18 years. I enjoy coding, and I enjoy writing "good" code, readable and so on.

However, for a few months now I have become really afraid for the future of the job I like, given the progress of artificial intelligence. Very often I don't sleep at night because of this.

I fear that my job, while not completely disappearing, will become a very boring one consisting of debugging automatically generated code, or that the job will disappear altogether.

For now, I'm not using AI. I have a few colleagues who do, but I don't want to because, one, it removes a part of the coding I like, and two, I have the feeling that using it is sawing off the branch I'm sitting on, if you see what I mean. I fear that in the near future, people who don't use it will be fired because management sees them as less productive...

Am I the only one feeling this way? I get the impression that all tech people are enthusiastic about AI.

(page 2) 50 comments
[–] z00s@lemmy.world 6 points 10 months ago

It won't replace coders as such. There will be devs who use AI to help them be more productive, and there will be unemployed devs.

[–] tunetardis@lemmy.ca 6 points 10 months ago (4 children)

As a fellow C++ developer, I get the sense that ours is a community with a lot of specialization that may be a bit more difficult to automate out of existence than web designers or what have you? There's just not as large a sample base to train AIs on. My C++ projects have ranged from scientific modelling to my current task of writing drivers for custom instrumentation we're building at work. If an AI could interface with the OS I wrote from scratch for said instrumentation, I would be rather surprised? Of course, the flip side to job security through obscurity is that you may make yourself unemployable by becoming overly specialized? So there's that.

[–] ParsnipWitch@feddit.de 6 points 10 months ago

Your fear is justified insofar as some employers will definitely aim to reduce their workforce by introducing AI workflows.

When you have worked for the same employer all this time, perhaps you don't know this, but a lot of employers do not give two shits about code quality. They want cheap and fast labour, and having fewer people churn out more is a good thing in their eyes, regardless of (long-term) quality. That may sound cynical, but it is my experience.

My prediction is that the income gap will increase dramatically because good pay will be reserved for the truly exceptional few. While the rest will be confronted with yet another tool capitalists will use to increase profits.

Maybe very far down the line there is a blissful utopia where no one has to work anymore. But between then and now, AI would have to get a lot better. Until then it will mainly be used by corporations to justify hiring fewer people.

[–] purpleprophy@feddit.uk 6 points 10 months ago (1 children)

This might cheer you up: https://visualstudiomagazine.com/articles/2024/01/25/copilot-research.aspx

I don't think we have anything to worry about just yet. LLMs are nothing but well-trained parrots. They can't analyse problems or have intuitions about what will work for your particular situation. They'll either give you something general copied and pasted from elsewhere or spin you a yarn that sounds plausible but doesn't stand up to scrutiny.

Getting an AI to produce functional large-scale software requires someone to explain precisely the problem domain: each requirement, business rule, edge case, etc. At which point that person is basically a developer, because I've never met a project manager who thinks that granularly.

They could be good for generating boilerplate, inserting well-known algorithms, generating models from metadata, that sort of grunt work. I certainly wouldn't trust them with business logic.

[–] olbaidiablo@lemmy.ca 6 points 10 months ago

AI allows us to do more with less just like any other tool. It's no different than an electric drill or a powered saw. Perhaps in the future we will see more immersive environment games because much of the immersive environment can be made with AI doing the grunt work.

[–] howrar@lemmy.ca 5 points 10 months ago* (last edited 10 months ago)

If your job truly is in danger, then not touching AI tools isn't going to change that. The best you can do for yourself is to explore what these tools can do for you and figure out whether they can help you become more productive, so that you're not first on the chopping block. Maybe in doing so, you'll find other aspects of programming that you enjoy just as much and that don't get automated away by these tools. Or maybe you'll find that they're not all they're hyped up to be, and that will ease your worry.

[–] Lath@kbin.social 5 points 10 months ago

If you are, it would be because you're working for the wrong people: those who don't understand what's what and only seek profit religiously.

Thanks for the readable code though.

[–] arthur@lemmy.zip 5 points 10 months ago

Man, it's a tool. It will change things for us, and it is very powerful; but it's still a tool. It does not "know" anything; there's no true intelligence in the things we now call "AI". For now, it's really useful as a rubber duck: it can make interesting suggestions, help you explore big code bases faster, and even be useful for creating boilerplate. But the code it generates is usually not very trustworthy and is of lower quality.

The reality is not that we will lose our jobs to it, but that companies will expect more productivity from us using these tools. I recommend you try ChatGPT (the best in class for now) and try to understand its strengths and limitations.

Remember: this is just autocomplete on steroids; it does more than the regular version, but it makes the same kinds of errors.

[–] CanadaPlus@lemmy.sdf.org 4 points 10 months ago

Give Copilot or something similar a try. AI is pretty garbage at the more complex aspects of programming, but it's great at simple boilerplate code. At least for me, that doesn't seem like much of a loss.

[–] ulkesh@beehaw.org 4 points 10 months ago* (last edited 10 months ago) (1 children)

I'm less worried and disturbed by the current thing people are calling AI than I am by the fact that every company seems to be jumping on the bandwagon with zero idea of how it can and should be applied to their business.

Companies are going to waste money on it, drive up costs, and make the consumer pay for it, causing even more unnecessary inflation.

As for your points on job security: your trepidation is valid, but premature by numerous decades, in my opinion. The moment companies start relying on these LLMs to do their programming for them is the moment they will inevitably end up with countless bugs and no one smart enough to fix them, including the so-called AI. LLMs seem interesting and useful on the surface, and a person can show many examples of this, but at the end of the day, they are regurgitating fed content based on rules and measures with knob-tuning. I do not yet see objective, strong evidence that they can effectively replace a senior developer.

[–] knightly@pawb.social 3 points 10 months ago

> Companies are going to waste money on it, drive up costs, and make the consumer pay for it, causing even more unnecessary inflation.

The "AI" bubble will burst this year, I'd put money on it if I had any.

The last time we saw a bubble like this was "Web3" and we all know how that turned out.

[–] bruhduh@lemmy.world 3 points 10 months ago

Imagine it's like having an intern under you who helps you with everything; the quality of the code will still be on you regardless.

[–] Damage@feddit.it 3 points 10 months ago

If this follows the path of the industrial revolution, it'll get way worse before it gets better, and not without a bunch of bloodshed

[–] FaceDeer@kbin.social 3 points 10 months ago (5 children)

I'm in a similar place to you career-wise. Personally, I'm not concerned about becoming just a "debugger." What I'm expecting this job to look like in a few years is "the same as now, except I've got a completely free team of 'interns' that do all the menial stuff for me." Every human programmer will become a lead programmer, deciding what stuff our AIs do for us and putting it all together into the finished product.

Maybe a few years further along the AI assistants will be good enough to handle that stuff better than we do as well. At that point we stop being lead programmers and we all become programming directors.

So think of it like a promotion, perhaps.

[–] yogthos@lemmy.ml 3 points 10 months ago

I'm not really losing any sleep over this myself. The current approach to machine learning is really no different from a Markov chain. The model doesn't have any understanding in a meaningful sense. It just knows that certain tokens tend to follow certain other tokens, and when you have a really big token space, that produces impressive-looking results.
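To illustrate what I mean by that, here's a toy sketch of a bigram "Markov chain" text model. Real LLMs are obviously far more sophisticated than this, but the next-token-prediction spirit is the same:

```typescript
// Toy bigram Markov model: count which token follows which, then sample.
// Purely illustrative of the "tokens follow tokens" analogy above.
function buildBigramModel(corpus: string): Map<string, string[]> {
  const tokens = corpus.trim().split(/\s+/);
  const followers = new Map<string, string[]>();
  for (let i = 0; i < tokens.length - 1; i++) {
    const next = followers.get(tokens[i]) ?? [];
    next.push(tokens[i + 1]);
    followers.set(tokens[i], next);
  }
  return followers;
}

function generate(model: Map<string, string[]>, start: string, length: number): string {
  const out = [start];
  let current = start;
  for (let i = 0; i < length; i++) {
    const candidates = model.get(current);
    if (!candidates || candidates.length === 0) break;
    // Duplicates in the list make this an implicitly frequency-weighted pick.
    current = candidates[Math.floor(Math.random() * candidates.length)];
    out.push(current);
  }
  return out.join(" ");
}
```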

However, a big part of the job is understanding what the actual business requirements are, translating those into logical steps, and then into code. That part of the job can't be replaced until we figure out AGI, and we're nowhere close to doing that right now.

I do think that the nature of the work will change; I kind of look at it as a sort of pair programming session. You can focus on what the logic is doing, and the model can focus on writing the boilerplate for you.

As this tech matures, I do expect that it will result in fewer workers being needed to do the same amount of work, and the nature of the job will likely shift towards something closer to a business analyst, where the human focuses more on the semantics rather than the implementation details.

We might also see new types of languages emerge that leverage the models. For example, I can see a language that allows you to declaratively write a specification for the code, and to encode constraints such as memory usage and runtime complexity. Then the model can bang its head against the spec until it produces code that passes it. If it can run through thousands of solutions in a few minutes, it's still going to be faster than a human coming up with one.
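As a rough sketch of that idea (nothing like this exists as a real language today; `generateCandidate` is a completely hypothetical stand-in for whatever model you'd plug in), the core of it would just be a generate-and-test loop driven by a declarative spec:

```typescript
// Hypothetical spec-driven synthesis loop: describe constraints declaratively,
// then keep asking a code generator for candidates until one satisfies them.
interface Spec<In, Out> {
  description: string;
  cases: Array<{ input: In; expected: Out }>;
  maxMillisPerCase: number; // crude stand-in for a runtime-complexity constraint
}

function synthesize<In, Out>(
  spec: Spec<In, Out>,
  generateCandidate: (description: string, attempt: number) => (input: In) => Out,
  maxAttempts = 1000,
): ((input: In) => Out) | null {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const candidate = generateCandidate(spec.description, attempt);
    const passes = spec.cases.every(({ input, expected }) => {
      const start = Date.now();
      const result = candidate(input);
      const fastEnough = Date.now() - start <= spec.maxMillisPerCase;
      return fastEnough && JSON.stringify(result) === JSON.stringify(expected);
    });
    if (passes) return candidate; // first candidate satisfying the spec wins
  }
  return null; // nothing passed within the attempt budget
}
```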

[–] dependencyinjection@discuss.tchncs.de 3 points 10 months ago (1 children)

Our company uses AI tools as just that, tools to help us do the job without having to do the boring stuff.

Like, I can now just write a comment about state for a modal and it will auto-generate the repetitive code instead of me having to write const [isModalOpen, setIsModalOpen] = useState(false); myself.
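Something like this, for instance; an assistant such as Copilot typically fills in the useState line (and often the handlers) from the comment alone. The component name here is just an illustration, not from my actual code:

```tsx
// Sketch of the workflow described above: you type the comment,
// and the assistant completes the state boilerplate.
import { useState } from "react";

export function SettingsPanel() {
  // state for whether the settings modal is open
  const [isModalOpen, setIsModalOpen] = useState(false);

  return (
    <button onClick={() => setIsModalOpen(!isModalOpen)}>
      {isModalOpen ? "Close settings" : "Open settings"}
    </button>
  );
}
```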

Or if I write something in one file, it can reason that I am going to be using it in the next file, so it can generate the code I would usually type. I still have to solve problems; it's just that I can do it quicker now.

[–] cosmicrookie@lemmy.world 4 points 10 months ago

But this is OP's point. People are getting fired from tech companies because they don't need as many people any more. Work is being done faster and cheaper by using AI.

[–] AlteredStateBlob@kbin.social 2 points 10 months ago

I am on the product side of things and have created some basic proof-of-concept tools with AI that my bosses wanted to sell. No way, no how will I be able to service or maintain them. It's incredibly impressive that I could even get this output.

I am not saying it won't become possible, but I lack the fundamental knowledge and understanding to make anything beyond the most minor adjustments, and AI is still quite bad at addressing only specific issues or, god forbid, expanding code, without fully rewriting the whole thing and breaking everything else.

For our devs, I see it as a much-improved and less snide Stack Overflow and Google. The direct conversational nature really speeds things up with boilerplate code, and since they actually know what they are doing, it's amazing. Not only that, but we used to have devs copy-paste from online searches without fully understanding the snippets. Now the AI can explain them in context.
