this post was submitted on 08 Jul 2023
305 points (96.9% liked)

Slow June, people voting with their feet amid this AI craze, or something else?

[–] weeahnn@lemmy.world 9 points 1 year ago (3 children)

I still use it sometimes, but ohhh boy it can be a wreck. Like I've started using the Creation Kit for Bethesda games, and you can bet your ass that anything you ask it, you'll have to ask again. Countless times it's a back-and-forth of:

Me: Hey ChatGPT, how can I do this or where is this feature?

ChatGPT: Here is something that is either not relevant or just does not exist in the CK.

Me: Hey that's not right.

ChatGPT: Oh sorry, here's the thing you are looking for. (And then it's still a 50-50 chance of it being real or fake.)

Now I realize that the Creation Kit is kinda niche and the info on it can be a pain to look up, but it's still annoying to wade through all the shit that it's throwing in my direction.

With things that are a lot more popular, it's a lot better, though (still not as good as some people want everyone to believe).

[–] cassetti@kbin.social 8 points 1 year ago

Lol, Chat has its pros and cons. For helping me write or refine content, it's extremely helpful.

However, I did try to use it to write code for me. I design 3D models using a programming language (OpenSCAD), and the results are hilarious. It knows the syntax (kinda), and if I ask it to do something simple, it will essentially write the code for a general module, declaring the key variables for the design, and then call a random module that doesn't exist. It once called a module "lerp()", which is absolutely not a module in OpenSCAD. This magical module mysteriously does 99% of the design... but ChatGPT won't give it to me. When I ask it to write the code for lerp(), it gives me something random like this

```
module lerp() { splice(); }
```

It simply calls up yet another module that absolutely does not exist. The code doesn't compile and doesn't work as intended. It is completely wrong.
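For what it's worth, the interpolation helper it kept hallucinating is a one-liner in any language. A sketch in Python of what a lerp() presumably should compute (the name and signature are my assumption of what the model had in mind):

```python
def lerp(a, b, t):
    """Linear interpolation: returns a when t == 0, b when t == 1."""
    return a + (b - a) * t

# e.g. halfway between two coordinates
print(lerp(0.0, 10.0, 0.5))  # 5.0
```

In OpenSCAD the equivalent would be a one-line `function`, not a `module`, which is part of why the generated call never made sense.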

But I think people are working it out of their system - some found novelty in it that wore off fast. Others like myself use it to help embellish product descriptions for ebay listings and such.

[–] american_defector@lemmy.world 6 points 1 year ago (3 children)

I’ve been building a tool that uses ChatGPT behind the scenes and have found that that’s just part of the process of building a prompt and getting the results you want. It also depends on which chat model is being used. If you’re super vague, it’s going to give you rubbish every time. If you go back and forth with it though, you can keep whittling it down to give you better material. If you’re generating content, you can even tell it what format and structure to give the information back in (I learned how to make it give me JSON and markdown only).

Additionally, if you're using the API and have control over that kind of thing, you can give ChatGPT a description of its role alongside the prompt. I've found that can help shape the responses up nicely right out of the box.
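As a rough sketch of what that looks like with the chat API's message format (the role text here is invented for illustration, not my actual prompt):

```python
def build_messages(user_input):
    """Bundle a fixed system role with whatever the user typed.

    The system message shapes tone and output format before the
    model ever sees the user's text.
    """
    system_role = (
        "You are a product-description assistant. "
        "Reply with JSON only: no prose, no markdown fences."
    )
    return [
        {"role": "system", "content": system_role},
        {"role": "user", "content": user_input},
    ]

messages = build_messages("Describe a vintage camera for an eBay listing.")
```

The list of messages is what gets sent to the chat completions endpoint; the user never sees the system half.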

ChatGPT is very, very much a “your mileage may vary” tool. It needs to be set up well at the start, but so many companies have haphazardly jumped on using it without putting in enough work prepping it.

[–] cassetti@kbin.social 4 points 1 year ago (1 children)

Have you seen the JollyRoger Telco? They've started using ChatGPT to help have longer conversations with telemarketing scammers. I might actually re-subscribe to the jolly roger (used them previously) if the new updated bots perform well enough.

[–] american_defector@lemmy.world 2 points 1 year ago

Lol, that is a brilliant use of it. I’ll have to check that out.

[–] seal_of_approval@sh.itjust.works 1 points 1 year ago (1 children)

If you don't mind me asking, does your tool programmatically do the "whittling down" process by talking to ChatGPT behind the scenes, or does the user still talk to it directly? The former seems like a powerful technique, though tricky to pull off in practice, so I'm curious if anyone has managed it.

[–] american_defector@lemmy.world 1 points 1 year ago* (last edited 1 year ago) (1 children)

Don’t mind at all! Yeah, it does a ton of the work behind the scenes. I essentially have a prompt I spent quite a bit of time iterating on. Then from there, what the user types gets sent bundled in with my prompt bootstrap. So it reduces the work considerably for the user and dials it in.

Edit: adding some more context/opinions.

I think the error that a lot of tools make is that they don’t spend enough time shaping their instructions for the AI. Sure, you can offload a lot of the work to it, but you have to write your own guard rails and instructions. You can tell it things like you would a human, and it will sometimes even fill in the gaps.

For example, I asked it to give me a data structure back that included an optional “title”. I found that if you left the title blank, ChatGPT took it upon itself to generate a title for you based on the content it wrote.

A lot of the things I got it to do took time and a ton of test iterations. I was even able to give it a list of exactly how it should structure the content it gave back. Things that I would otherwise do on the programming side, I was able to simply instruct ChatGPT to handle instead.

Ah, interesting. I myself have made my own library to create callable "prompt functions" that prompt the model and validate the JSON outputs, which ensures type-safety and easy integration with normal code.
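A minimal sketch of that kind of prompt function (the field names, expected types, and the stand-in model call are all assumptions for illustration, not the actual library):

```python
import json

def call_model(prompt):
    # Stand-in for a real chat API call; a real model can return
    # malformed or mistyped JSON, which is what validation catches.
    return '{"title": "Widget", "price": 9.99}'

# The shape the caller is promised: field name -> expected type.
EXPECTED = {"title": str, "price": float}

def prompt_function(prompt):
    """Call the model and type-check the JSON it returns."""
    raw = call_model(prompt)
    data = json.loads(raw)  # raises ValueError on invalid JSON
    for field, ftype in EXPECTED.items():
        if not isinstance(data.get(field), ftype):
            raise TypeError(f"bad or missing field: {field}")
    return data
```

Wrapping every prompt this way means downstream code can trust the dictionary it gets back, or fail loudly at the boundary instead of deep inside normal logic.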

Lately, I've shifted more towards transforming ChatGPT's outputs. By orchestrating multiple prompts and adding human influence, I can obtain responses that ChatGPT alone likely wouldn't have come up with. Though, this has to be balanced with giving it the freedom to pursue a different thought process.

[–] 80085@lemmy.world 1 points 1 year ago

What method did you use to generate only JSON? I'm using it (gpt-3.5-turbo) in a prototype application, and even with giving it an example (one-shot prompting) and telling it to only output JSON, it sometimes gives me invalid results. I've read that the new function-calling feature is still not guaranteed to produce valid JSON. Microsoft's "guidance" (https://github.com/microsoft/guidance) looks like what I need, but I haven't gotten around to trying it yet.
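One common workaround (no guarantees, and orthogonal to function calling or guidance) is to slice out the first {...} object from the reply before parsing, since the model often wraps otherwise-valid JSON in prose or markdown fences. A sketch:

```python
import json

def extract_json(text):
    """Pull the first {...} object out of a chatty model reply.

    Handles JSON wrapped in prose or code fences; it does not
    repair genuinely malformed JSON, so parse errors still raise.
    """
    start = text.find("{")
    end = text.rfind("}")
    if start == -1 or end <= start:
        raise ValueError("no JSON object found in reply")
    return json.loads(text[start:end + 1])

reply = 'Sure! Here is your data:\n```json\n{"name": "test", "ok": true}\n```'
print(extract_json(reply))  # {'name': 'test', 'ok': True}
```

Pairing this with a retry (re-prompting on a parse failure) catches most of the remaining invalid results in practice.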

[–] maiskanzler@feddit.de 4 points 1 year ago (1 children)

I recently asked it about Nix Flakes, which were very niche and new during ChatGPT's training. It was able to give me a reasonable answer in English, but when I first asked it in German, it couldn't do it. It could reasonably translate the English answer, though, after it had generated it. Depending on what language you use to prompt it, you get very different answers, because it doesn't transfer ideas and concepts between languages or, more generally, between disconnected bodies of source text.

It is somewhat obvious if you know about the statistical nature of the models they use, but it's a great example of why these things don't KNOW things; they just regurgitate what they read in context before.

[–] MasterCelebrator@feddit.de 1 points 1 year ago

I agree. And I think it's actually far from being "intelligent". However, it is a very helpful tool for many tasks.