If enforcement means big tech companies have to throw out models because they used personal information without knowledge or consent, boo fucking hoo
A) This article isn't about a big tech company; it's about an academic researcher. B) He had consent to use the data when he trained the model. The participants later revoked their consent to have their data used.
It's not an assumption. There are academic researchers at universities working on developing these kinds of models as we speak.
I'm not wasting time responding to straw men.