this post was submitted on 15 Jul 2024
278 points (98.9% liked)

[–] hotpot8toe@lemmy.world 50 points 4 months ago* (last edited 4 months ago) (2 children)

For the people who didn't read the article, here's a TL;DR: when you open a Google Doc, a Gemini sidebar appears so you can ask questions about the document. In this case, it summarized a document without the user asking.

The article title makes it seem like they are using your files to train AI, but there is no proof of that (yet).

[–] GolfNovemberUniform@lemmy.ml 27 points 4 months ago* (last edited 4 months ago) (1 children)

At the very least, the data is sent to Gemini's servers. That alone might be illegal, but I'm not sure. What I'm more confident about is that they do use the data to train the models.

[–] poVoq@slrpnk.net 28 points 4 months ago

Since it is Google Docs, the data is already on Google's servers. But yeah, it doesn't exactly inspire confidence in the confidentiality of documents on Google Docs.

[–] sunzu@kbin.run 8 points 4 months ago (1 children)

Thank you for the service!

I see your point re: training, but isn't the whole reason they want peasants using their models so they can train them more?

[–] eRac@lemmings.world 10 points 4 months ago (1 children)

Generative AI doesn't get any training in use (see the sketch after this list). The explosion in public AI offerings falls into three categories:

  1. Saves the company labor by replacing support staff
  2. Used to entice users by offering features competitors lack (or as catch-up after competitors have added it for this reason)
  3. Because AI is the current hot thing that gets investors excited
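
A minimal sketch of that first point, assuming PyTorch and a tiny stand-in model (both hypothetical, purely for illustration): serving a model to users is just a forward pass with gradients disabled and weights frozen, so nothing in the request/response path changes the model.

```python
# Sketch: inference does not train the model.
import torch
import torch.nn as nn

model = nn.Linear(16, 16)  # stand-in for a real LLM
model.eval()               # inference mode

user_prompt = torch.randn(1, 16)  # stand-in for a tokenized user request

with torch.no_grad():              # no gradients computed or stored
    response = model(user_prompt)  # forward pass only; weights untouched

# Actually training the model would be a separate, offline pipeline:
# a loss against labeled targets, loss.backward(), optimizer.step().
# None of that happens when a user chats with the deployed model.
```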

To make a good model you need two things:

  1. Clean data that is tagged in a way that allows you to grade model performance
  2. Lots of it

User data might meet need 2, but it fails at need 1. Running random data through neural networks to make it more exploitable (more accurate interest extraction, etc) makes sense, but training on that data doesn't.
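
A minimal sketch of those two needs, again assuming PyTorch and toy data (hypothetical): a supervised update only works when every example carries a trusted label to grade the model against, which raw user data doesn't have.

```python
# Sketch: training needs labeled data; raw user data has volume but no labels.
import torch
import torch.nn as nn

model = nn.Linear(16, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Need 1: clean, tagged data -- inputs paired with known-correct labels.
curated_inputs = torch.randn(8, 16)
curated_labels = torch.randint(0, 2, (8,))  # hand-verified targets

loss = loss_fn(model(curated_inputs), curated_labels)
loss.backward()
optimizer.step()  # gradable and improvable only because labels exist

# Need 2 alone isn't enough: lots of raw user inputs, zero ground truth,
# so there is nothing meaningful to compute a loss against.
raw_user_inputs = torch.randn(1000, 16)
```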

This is clearly demonstrated by Google's search AI, which learned lots of useful info from Reddit but also learned absurd lies with the same weight. Not just overtuned-for-confidence lies, straight up glue-the-cheese-on lies.

[–] sunzu@kbin.run 1 points 4 months ago (2 children)

Thank you for explaining this.

Ok so what is ChatGPT's angle here, providing these services for "free"?

What do they get out of it? Or is this just a Google-style play: get you in the door, then data mine?

[–] GravitySpoiled@lemmy.ml 4 points 4 months ago

Probably market dominance

[–] eRac@lemmings.world 4 points 4 months ago

They have two avenues to make money:

  1. Sell commercial services such as customer support bots. They get customers thanks to the massive buzz their free services generated.
  2. Milk investors, the real way to make money.