this post was submitted on 05 Feb 2024
197 points (84.1% liked)

Asklemmy


A loosely moderated place to ask open-ended questions




Ok, let's give a little bit of context. I will turn 40 in a couple of months and I have been a C++ software developer for more than 18 years. I enjoy coding, and I enjoy writing "good" code, readable and so on.

However, for a few months now, I have become really afraid for the future of the job I like because of the progress of artificial intelligence. Very often I can't sleep at night because of this.

I fear that my job, while not disappearing completely, will become a very boring one consisting of debugging automatically generated code, or that the job will disappear altogether.

For now, I'm not using AI. A few of my colleagues do, but I don't want to, because one, it removes a part of the coding I like, and two, I have the feeling that using it is sawing off the branch I'm sitting on, if you see what I mean. I fear that in the near future, people not using it will be fired because management will see them as less productive...

Am I the only one feeling this way? I have the feeling all tech people are enthusiastic about AI.

(page 3) 50 comments
[–] L0rdMathias@sh.itjust.works 2 points 9 months ago

It doesn't matter what you think about AI. It's very clear that this technology is here to stay and will only improve. From this point on, AI will become deeply integrated into human culture and technology; after all, we've been fetishizing it for almost 100 years now. Your only logical option as a developer is to learn how to use it and abuse it. Choosing not to do so is career suicide, possibly even societal suicide depending on how quickly adoption happens.

You're probably right: in the near future, people who can't use it will be fired. And to that point, they should be fired. Why the fuck would I allow my accountants to do their financial work on paper when Excel exists?

Welcome to the future.

[–] Hestia@lemmy.world 1 points 9 months ago

I've been messing around with running my own LLMs at home using LM Studio and I've got to say, it really helps me write code. I'm using Code Llama 13b, and it works pretty well as a programmer assistant. What I like about using a chatbot is that I go from writing code to reviewing it, and for some reason this keeps me incredibly mentally engaged. This tech has been wonderful for undoing some of my professional burnout.
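
For anyone curious about the setup: LM Studio can serve whatever model you have loaded through an OpenAI-compatible endpoint on localhost, so the write-then-review loop is easy to try from a plain Python script. A minimal sketch; the port is LM Studio's default, and the model name and the buggy C++ snippet are just illustrative placeholders, not anything from this thread:

```python
# Ask a locally served model (e.g. Code Llama via LM Studio's local server)
# to review a snippet. Assumes the server is running on its default port;
# the model name below is a placeholder for whatever model is loaded.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="not-needed")

snippet = """
int sum(const std::vector<int>& v) {
    int total;                                   // bug: uninitialized
    for (size_t i = 0; i <= v.size(); i++)       // bug: off-by-one
        total += v[i];
    return total;
}
"""

response = client.chat.completions.create(
    model="codellama-13b-instruct",  # placeholder model identifier
    messages=[
        {"role": "system", "content": "You are a careful code reviewer."},
        {"role": "user", "content": f"Review this C++ function and list any bugs:\n{snippet}"},
    ],
    temperature=0.2,
)
print(response.choices[0].message.content)
```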

If what keeps you mentally engaged does not include a bot, then I don't think you need any other reason not to use one. As much as I really like the tech, anyone who uses it is still going to need to know the language and enough about the libraries to fix the inevitable issues that come up. I can definitely see this tech getting better to the point of being unavoidable, though. Did you hear that Microsoft is planning on adding an AI button to their upcoming keyboards? Like, that kind of unavoidable.

[–] eugenia@lemmy.ml 1 points 9 months ago (2 children)

I disagree with the other posters here who say you're overreacting. I think that AI will replace most jobs (maybe as many as 85% at some point). Consider becoming a plumber or an electrician. Until robots become commonplace, 20 years from now, you will have a job that AI won't be able to touch much. And people won't run out of asses or gaming. So those will be stable professions for quite a while. You can still code in your free time, as a hobby. And don't cry over the lost income of being a programmer, because that will happen to everyone affected by AI. You'll just have another job while the others won't. That's the upside.

I understand that this comment is not what people want to hear with their wishful thinking, so they'll downvote it. But I gotta say it how I see it. AI is the biggest revolution since the industrial revolution.

[–] ParsnipWitch@feddit.de 2 points 9 months ago

With the difference that the industrial revolution created a lot of new jobs with better pay, while AI doesn't. I see people suggesting that this has happened before and that it will soon turn the economic situation into something much better, but I don't see that at all. Just because it's also a huge revolution doesn't mean it will have the same effects.

As you have written, people will have to switch to manual jobs like laying bricks and wiping butts. The pay in these jobs won't increase just because more people have to work them.

[–] TheControlled@lemmy.world 1 points 9 months ago

🙄 No, I'm sure you're the only one

[–] coolin@beehaw.org 1 points 9 months ago

I think your job in its current form is likely in danger.

SOTA foundation models like GPT4 and Gemini Ultra can write code, execute it, and debug it with special chain-of-thought prompting techniques, and large-scale process verification on synthetic data plus RL search for correct outputs will make this 10x better. The silver lining is that I expect this to require an absolute shit ton of compute, constantly generating LLM output hundreds of times for each internal prompt across multiple prompts, and possibly taking longer than an ordinary software engineer would. I suspect early full-stack developer LLMs will mainly be used for a few very tedious coding tasks, and SWEs will be cheaper for a fair length of time.
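
To make the "chain of thought" prompting part concrete, a debugging prompt in that style can look roughly like the sketch below; the buggy function, the failing test, and the wording are just an illustration, not anything specific to GPT4 or Gemini:

```python
# Rough illustration of a chain-of-thought style debugging prompt: the model
# is asked to reason step by step about a failing test before proposing a fix.
# The function and test are invented for the example.
DEBUG_PROMPT = """You are debugging the following Python function.

def moving_average(xs, window):
    return [sum(xs[i:i + window]) / window for i in range(len(xs))]

Failing test: moving_average([1, 2, 3], 2) should return [1.5, 2.5],
but it returned [1.5, 2.5, 1.5].

Think step by step:
1. Trace what the code does on the failing input.
2. Identify the exact expression responsible for the extra element.
3. Propose a minimal fix and show the corrected function.
"""

# In an agent-style loop, the model's proposed fix would be run against the
# tests and any new failure appended to the next prompt, repeating until the
# tests pass or an iteration limit is reached.
```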

I expect it will be 2-3 years before this happens, so for that short period I expect workers to be "super-productive" by using LLMs in the coding process. But I expect the crossover point, when the LLM becomes better than the human, to come quite soon, perhaps in the next 5 years as compute requirements go down.

[–] BolexForSoup@kbin.social 1 points 9 months ago

To answer your question directly: The debate has been going on in the broader public since ChatGPT 3 dropped

To answer how you're feeling: that's valid, because a lot of deep pockets seem not to care at all about the ethical considerations.

[–] spez_@lemmy.world 1 points 9 months ago
[–] werefreeatlast@lemmy.world 0 points 9 months ago

I love LLMs! I'm using them to answer all sorts of bullshit to become a manager... like, "here's a bunch of notes, make me a manager's review of Brian." LOL.

I think Google is struggling to control the flood of bullshit from the Internet, and so AI is about to eat their lunch. Like, I've already decided that all AIs are just bullshit and the only really useful ones are those that can actually search the Internet live. Perplexity AI was doing this for a while, but someone chopped off its balls. I've been looking for a replacement ever since.

I also use it for help with Python, with Linux, with Docker, with SolidWorks, and for stuff around the house like taxes, kombucha, identifying plants, and stupid stuff like that.

But I can definitely see the future where the police are replaced with robo dogs with laser heads that can run at 120 mph and shoot holes through cars. The only benefit being that the hole doesn't get infected and there's no pool of blood. That future is coming. I'm going to start wearing reflective aluminum shield armor.
