this post was submitted on 04 Jan 2024
358 points (97.6% liked)
Technology
And that was the bait and switch of OpenAI! They sold themselves as a non-profit simply doing research, for which it would be perfectly legal to consume and reproduce large quantities of data... And then, once they had the data, they started selling access to the models trained on it.
I would say that, between that and the fact that they function as gatekeepers to the technology (one does not simply purchase the model from OpenAI, after all), they are hardly free of culpability... But it definitely depends on the person trying to use their black box, too.
Huh? What does being non-profit have to do with it? Private companies are allowed to learn from copyrighted work. Microsoft and Apple, for example, look at each other's software and copy ideas (not code, just ideas) all the time. The fact that Linux is non-profit doesn't give it any additional rights or protection.
They're not gatekeeping LLMs, though; there are publicly available models and data sets.
If it's publicly available, why didn't Microsoft just download and use it rather than paying them for a partnership?
(And where at?)
IIRC they only open-sourced some old stuff.
Stable Diffusion is open source. You can run local instances with the freely provided models and generate your own outputs.
https://stability.ai/
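For reference, a minimal sketch of what "running a local instance" looks like with Hugging Face's `diffusers` library — the specific model ID, prompt, and hardware settings here are illustrative assumptions, not details given in the thread:

```python
# Minimal local Stable Diffusion sketch using the diffusers library.
# Assumes `pip install diffusers transformers torch` and enough VRAM/RAM;
# the model ID below is an illustrative choice, not one named in the thread.
import torch
from diffusers import StableDiffusionPipeline

# Download (once) and load the openly published model weights locally.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # or "cpu" (much slower)

# Generate an image entirely on your own hardware — no API gatekeeper.
image = pipe("a watercolor painting of a lighthouse").images[0]
image.save("output.png")
```

Once the weights are cached, nothing in this loop touches a remote service, which is the point being made above.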