this post was submitted on 28 Jun 2024
396 points (100.0% liked)

196

[–] Hjalamanger@feddit.nu 9 points 4 months ago

It's very much possible; in fact, it's such a common problem that it can happen by mistake when the training data set is too small (see overfitting). A model trained to output just this one image will learn to do so, and over time it should reach essentially 100% accuracy. The model would simply learn to ignore whatever arbitrary inputs you've given it.
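The "ignore the input, memorize the output" behaviour is easy to demonstrate. Here is a toy sketch (not from the thread; the linear model, sizes, and learning rate are all illustrative assumptions): a linear map `y = W @ x + b` is trained so that every random input maps to the same fixed target "image". Overfitting to that single target drives the input weights `W` toward zero while the bias `b` memorizes the target, so the input stops mattering entirely.

```python
import numpy as np

# Toy demonstration of overfitting to a single fixed output.
# All names and dimensions here are illustrative, not from the thread.

rng = np.random.default_rng(0)
target = rng.random(4)              # stand-in for the one training image
W = rng.standard_normal((4, 8)) * 0.1
b = np.zeros(4)
lr = 0.05

for _ in range(2000):
    x = rng.standard_normal(8)      # arbitrary input, different every step
    err = (W @ x + b) - target      # mean-squared-error gradient
    W -= lr * np.outer(err, x)
    b -= lr * err

# After training, the output barely depends on the input:
# W has collapsed toward zero and b alone reproduces the target.
print(np.allclose(W, 0, atol=1e-2))
print(np.allclose(W @ rng.standard_normal(8) + b, target, atol=1e-2))
```

Because the target never varies, the only gradient signal left once the bias matches it is "make the input contribution smaller", which is exactly the memorization failure mode the comment describes.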