Technology
This is the official technology community of Lemmy.ml for all news related to the creation and use of technology, and for facilitating civil, meaningful discussion around it.
Ask in a DM before posting product reviews or ads; otherwise, such posts are subject to removal.
Rules:
1: All Lemmy rules apply
2: Do not make low-effort posts
3: NEVER post naziped*gore stuff
4: Always post article URLs or their archived versions as sources, NOT screenshots. This helps blind users.
5: Personal rants about Big Tech CEOs like Elon Musk are unwelcome (this does not include posts about their companies affecting a wide range of people)
6: No advertisement posts unless verified as legitimate and non-exploitative/non-consumerist
7: Crypto-related posts, unless essential, are disallowed
If you have trained the model, or part of it, yourself, I'd say yes, some copyright should apply to that, depending of course on the images that were used. The copyright should apply less to the generated image and more to the model itself. You should also be able to copyright images that have been sufficiently altered after the initial generation, IMO.
I think they should be copyrightable. AI is just a tool for the artist, like a paintbrush, an art program (some of which now have AI tools built in), or filters on photos. Even when others' original works are used to train the AI, the result should be transformative, which is already a mechanism that exists within US copyright fair use.
As AI image generation methods improve, it will become difficult, if not impossible, to tell whether an image was generated by AI, made with the help of AI, or made without it. Even if the stance universally becomes "no", how could it actually be enforced? What sort of objective validation could happen that always gets it right?
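To make the enforcement problem concrete, here is a minimal sketch (not from the original comment) of what an "objective" check might look like in practice: reading a PNG's embedded text metadata for the generation-parameters chunk that some Stable Diffusion front-ends write. The filenames are hypothetical, and the check collapses the moment the metadata is stripped, e.g. by simply re-saving the file.

```python
# Minimal sketch: look for AI-generation metadata in a PNG's text chunks.
# Assumes Pillow is installed; "image.png" is a hypothetical file.
from PIL import Image

img = Image.open("image.png")

# Some Stable Diffusion front-ends store the prompt/settings in a
# "parameters" text chunk; Pillow exposes PNG text chunks via img.info.
params = img.info.get("parameters")
if params:
    print("Generation metadata found:")
    print(params)
else:
    print("No generation metadata found (which proves nothing either way).")

# Stripping the evidence is trivial: re-saving without passing the text
# chunks along drops them entirely.
img.save("image_stripped.png")
print("parameters" in Image.open("image_stripped.png").info)  # -> False
```

Any validation scheme built on embedded metadata or watermarks has this shape: easy to check when present, impossible to rely on when absent, which is the crux of the enforcement question.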
Furthermore, how much would someone need to change the end product for it not to be considered "AI-created" anymore? How transformative must it be?
Regardless of the answer now, it is almost certain that the answer in the future will be "yes".