this post was submitted on 27 Jun 2024
955 points (98.0% liked)

Programmer Humor

Welcome to Programmer Humor!

This is a place where you can post jokes, memes, humor, etc. related to programming!

For sharing awful code there's also Programming Horror.

Cross posted from: https://lemm.ee/post/35627632

[–] Snapz@lemmy.world 22 points 2 days ago (1 children)

Except AI doesn't say "Is this it?"

It says, "This is it."

Without hesitation and while showing you a picture of a dog labeled cat.

[–] werefreeatlast@lemmy.world 1 points 2 days ago (1 children)

I have vivid examples of how bad AI is at programming.

[–] Passerby6497@lemmy.world 5 points 2 days ago

My favorite is when it just keeps giving you the exact same answer you keep telling it is wrong

[–] Ziglin@lemmy.world 1 points 1 day ago* (last edited 1 day ago)

My new favourite is asking GitHub Copilot (which I would not pay for out of my own pocket) why the code I'm writing isn't working as intended, only for it to ask me to show it the code I already provided.

I do like not having to copy and paste the same thing 5 times with slight variations (something it usually does pretty well, until it doesn't and I need a few minutes to find the error).
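For example, the kind of near-duplicate block it handles well (a made-up snippet, not my actual code; `apply_settings` and the field names are invented for illustration):

```python
# Hypothetical example: copying request fields onto a settings object,
# the sort of repetition Copilot will usually fill in after the first line or two.
def apply_settings(settings, request):
    settings.width = int(request.get("width", settings.width))
    settings.height = int(request.get("height", settings.height))
    settings.depth = int(request.get("depth", settings.depth))
    settings.scale = float(request.get("scale", settings.scale))
    settings.offset = float(request.get("offset", settings.offset))
    # A swapped name or wrong type in one of these lines is exactly the
    # error that takes a few minutes to find.
    return settings
```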

[–] Blackmist@feddit.uk 13 points 3 days ago (1 children)

I guess whether it's worth it depends on whether you hate writing code or reading code the most.

[–] anakin78z@lemmy.world 15 points 2 days ago (4 children)

Is there anyone who likes reading code more than writing it?

[–] bjoern_tantau@swg-empire.de 90 points 4 days ago (9 children)

Yeah, in the time it takes me to describe the problem to the AI I could program it myself.

[–] takeda@lemmy.world 49 points 4 days ago (2 children)

That's why it's called a programming language: it only exists to tell the machine what to do in an unambiguous way (in contrast to natural language).

[–] catastrophicblues@lemmy.ca 15 points 3 days ago (1 children)

Ugh I can’t find the xkcd about this where the guy goes, “you know what we call precisely written requirements? Code” or something like that
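It goes something like this toy sketch (entirely made up, not from the comic): the requirement only stops being ambiguous once every rule is spelled out, and by then it's just code.

```python
# "Give loyal customers a discount": ambiguous until every detail is pinned down,
# at which point the precise spec *is* the program. (Rules here are invented.)
def discounted_total(total: float, years_as_customer: int) -> float:
    if years_as_customer >= 2:                # who counts as "loyal"?
        discount = min(total * 0.10, 50.0)    # how much, and is it capped?
        return total - discount
    return total

print(discounted_total(600.0, 3))  # 550.0; only the code answers these questions
```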

[–] abcd@feddit.de 11 points 3 days ago (1 children)

This reminds me of a colleague who was always ranting that our code was not documented well enough. He did not understand that documenting code in easily understandable sentences for everybody would fill whole books, and that a normal person would not be able to keep the code path in their mental stack while reading page after page. Then he wanted at least the shortest possible summary of the code, which of course is the code itself.

The guy basically did not want to read the code to understand the logic behind it. When I took an hour and literally read the code to him, explaining what I was reading, including the well-placed comments here and there, everything was clear.

AI is like this in my opinion. Some people waste hours generating code they can't debug for days because they don't understand what they're reading, while it would take maybe two hours to think and a day to implement and test to get the job done.

I don't like this trend. It's like the people who can't read docs or longer texts anymore and need some random person's 43-minute YouTube video to write code they don't understand. Taking shortcuts in life rarely goes well in the long run. You have to learn and refine your skills each and every day to be and stay competent.

AI is a tool in our toolbox. You can use it to be more productive. And that’s it.

[–] IronKrill@lemmy.ca 6 points 3 days ago (1 children)

I have a bad habit of jumping into programming without a solid plan, which results in lots of rewrites and wasted time. Funnily enough, describing to an AI how I want the code to work forces me to lay out a basic plan and get my thoughts in order, which makes building the final product immensely easier.

This doesn't require AI; it just gave me an excuse to do it as a solo actor. I should really do it for more problems, because I can wrap my head around things better when thinking in human-readable terms rather than in terms of which programming method to use.

[–] MystikIncarnate@lemmy.ca 40 points 3 days ago (13 children)

AI in the current state of technology will not and cannot replace understanding the system and writing logical and working code.

GenAI should be used to get a start on whatever you're doing, but shouldn't be taken beyond that.

Treat it like a psychopathic boilerplate.

[–] CanadaPlus@lemmy.sdf.org 14 points 3 days ago* (last edited 3 days ago) (5 children)

Treat it like a psychopathic boilerplate.

That's a perfect description, actually. People debate how smart it is - and I'm in the "plenty" camp - but it is psychopathic. It doesn't care about truth, morality or basic sanity; it craves only to generate standard, human-looking text. Because that's all it was trained for.
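Roughly, the base objective looks like this sketch (illustrative PyTorch-style code under the usual next-token assumption, not any specific model's pipeline):

```python
import torch.nn.functional as F

def training_step(model, token_ids):
    """One step of the base objective: predict the next token of human-written text."""
    logits = model(token_ids[:, :-1])          # the model's guess for each next token
    targets = token_ids[:, 1:]                 # the tokens humans actually wrote
    loss = F.cross_entropy(
        logits.reshape(-1, logits.size(-1)),   # (batch * seq, vocab)
        targets.reshape(-1),                   # (batch * seq,)
    )
    loss.backward()                            # the loss has no term for truth or sanity,
    return loss                                # only "match the human-looking text"
```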

Nobody really knows how to train it to care about the things we do, even approximately. If somebody makes GAI soon, it will be by solving that problem.

[–] MacNCheezus@lemmy.today 2 points 2 days ago (1 children)

Weird. Are you saying that training an intelligent system using reinforcement learning through intensive punishment/reward cycles produces psychopathy?

Absolutely shocking. No one could have seen this coming.

[–] CanadaPlus@lemmy.sdf.org 2 points 2 days ago* (last edited 2 days ago) (2 children)

Honestly, I worry that it's conscious enough that it's cruel to train it. How would we know? That's a lot of parameters and they're almost all mysterious.

[–] Korne127@lemmy.world 59 points 4 days ago (3 children)

In my experience, you can't expect it to deliver great working code, but it can always point you in the right direction.
There were some situations in which I just had no idea how to do something, and it pointed me to the right library. The code itself was flawed, but with that information I could use the library documentation and get it to work.

[–] xia@lemmy.sdf.org 12 points 3 days ago (1 children)

It can point you in a direction, for sure, but sometimes you find out much later that it's a dead-end.

[–] danc4498@lemmy.world 5 points 3 days ago* (last edited 3 days ago)

It’s the same with using LLMs for writing. It won’t deliver a finished product, but it will give you ideas that can be used in the final product.

[–] xia@lemmy.sdf.org 41 points 3 days ago (1 children)

This is the experience of a senior developer using GenAI. A junior or non-dev might not come down from the "AI is magic" high until they have a repo full of garbage that doesn't work.

[–] jaybone@lemmy.world 16 points 3 days ago (3 children)

This was happening before this “AI” craze.

[–] NegativeLookBehind@lemmy.world 18 points 3 days ago (1 children)

When I used to try and ask AI for help, most of the time it would just give me fake command combinations or reference some made-up documentation

[–] Drewelite@lemmynsfw.com 2 points 2 days ago* (last edited 2 days ago) (1 children)

The best one I've used for coding is the IntelliJ AI. Idk how they trained that sucker, but it's pretty good at ripping through boilerplate code and structuring new files/methods based on how your project is already set up. It still has those little hallucinations, especially when you ask it to figure out more niche tasks, but it's really increased my productivity, especially when getting a new repo set up. (I work with microservices.)

[–] RobotZap10000@feddit.nl 28 points 4 days ago (1 children)

Why is the AI speaking in a bisexual gradient?

[–] BoneALisa@lemm.ee 16 points 4 days ago (1 children)

It's the "new hype tech product background" gradient lol

[–] Fleur__@lemmy.world 8 points 3 days ago (3 children)

Because all robots are bisexual

[–] crossmr@kbin.run 32 points 4 days ago (3 children)

Gen AI is best used with languages that you don't use that much. I might need a Python script once a year or once every 6 months. Yeah, I learned it ages ago, but I don't have much need to keep up on it. I still remember all the concepts, so I can take the time to describe to the AI what I need step by step and verify each iteration. This way, if it does make a mistake at some point that it can't get itself out of, you've at least got a script complete up to that point.

[–] RestrictedAccount@lemmy.world 12 points 4 days ago

Exactly. I can’t remember the syntax for all the languages I have used over the last 40 years, but AI can give me a pretty good start, and that takes hours off of reviewing code books.

[–] jaybone@lemmy.world 14 points 3 days ago (2 children)

It’s almost like working with shitty engineers.

[–] Knock_Knock_Lemmy_In@lemmy.world 7 points 3 days ago* (last edited 3 days ago)

Shitty engineers that can do grunt work, don't complain, don't get distracted and are great at doing 90% of the documentation.

But yes. Still shitty engineers.

Great management consultants though.

[–] Deceptichum@sh.itjust.works 10 points 4 days ago (6 children)

So what it’s really like is only having to do half the work?

Sounds good: a reduced workload without some unrealistic expectation of computers doing everything for you.

[–] takeda@lemmy.world 8 points 4 days ago

In my experience, all the time it saves me (and probably more) is wasted on spotting bugs, and the bugs are in very subtle places.
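A classic example of the subtle kind (hypothetical snippet, but the pattern turns up a lot in generated Python): a mutable default argument that looks completely reasonable at a glance.

```python
def collect_errors(error, errors=[]):    # BUG: the default list is created once
    errors.append(error)                 # and then shared across every call
    return errors

print(collect_errors("timeout"))         # ['timeout']
print(collect_errors("bad input"))       # ['timeout', 'bad input'], not just the new one
```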

[–] onlinepersona@programming.dev 7 points 4 days ago

Code is the most in-depth spec one can provide. Maybe someday we'll be able to iterate just by verbally communicating and saying "no, like this", but it doesn't seem like we're quite there yet. But also, would that even be productive?

Anti Commercial-AI license
