[–] theluddite@lemmy.ml 117 points 4 months ago* (last edited 4 months ago) (2 children)

"Investment giant Goldman Sachs published a research paper"

"Goldman Sachs researchers also say that"

It's not a research paper; it's a report. They're not researchers; they're analysts at a bank. This may seem like a nitpick, but journalists need to (re-)learn to carefully distinguish between the thing that scientists do and corporate R&D, even though we sometimes use the word "research" for both. The AI hype in particular has been absolutely terrible for this. Companies have learned that putting out AI "research" that's just them poking at their own product, dressed up in a science-lookin' paper, leads to an avalanche of free press from lazy, credulous morons gorging themselves on the hype.

I've written about this problem a lot. For example, in this post, I covered how Google wrote a so-called paper about how their LLM performs compared to doctors, only for the press to uncritically repeat (and embellish) the results all over the internet. Had anyone in the press actually fucking bothered to read the paper critically, they would've noticed that it's actually junk science.

[–] tal@lemmy.today 17 points 4 months ago (1 children)

A big part of the problem (and this is not a new issue; it goes back decades) is that a lot of terms in AI-land don't correspond to concrete capabilities, so it's easy to claim that you do X when X is generally perceived to be something much more sophisticated than what you're actually doing, even if your thing technically qualifies as X by some definition.

None of this in any way conflicts with my position that AI has tremendous potential. But if people are investing money without a solid understanding of what they're investing in, there are going to be people out there misrepresenting their products.

[–] scrubbles@poptalk.scrubbles.tech 3 points 4 months ago (1 children)

Just like it's no coincidence that they've changed the definition of AI to AGI.

[–] MalReynolds@slrpnk.net 1 points 4 months ago

It'll be ASI before people acknowledge AGI.

[–] dev_null@lemmy.ml 11 points 4 months ago (1 children)

Same with all cryptocurrencies having a "white paper", as if it were anything other than marketing crap formatted like a scientific paper.

[–] verstra@programming.dev 4 points 4 months ago (1 children)

The term started out referring to actual unpublished technical descriptions of the underlying technology.

[–] CanadaPlus@lemmy.sdf.org 1 points 4 months ago* (last edited 4 months ago)

Yeah, I've seen some good ones. Sad to hear the term has gone to shit.