this post was submitted on 05 Aug 2024
57 points (88.0% liked)
you are viewing a single comment's thread
I understood [reference] and am continually amazed that news sources don't have some kind of automated review process to catch stupid errors like that.
Newspapers used to employ teams of sub-editors to fix up the articles. I used to do that job for a major newspaper, and it was surprising to see how bad some of the stuff coming from journalists was. Sometimes you'd basically have to rewrite the whole article from scratch. With the decline in quality of what gets published, I can only assume that when paper sales collapsed and revenues dropped they all decided to cut costs by firing the sub-editors.
But this is just some website that probably never had any quality control to start with.
I'd love to have human editors fixing up stories, but even without them we have the technology now. There are FOSS tools like redpen that help with spelling and grammar, and AI tools ought to do a somewhat reasonable job of appraising a piece of text. And yeah, a second human ought to sign off before publishing. I'd have thought content management systems would have review stages, like code review in software development: authors could accept or override suggestions, but would be required to acknowledge them. Why isn't 'journops' a thing?
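Just to sketch what I'm picturing (a made-up toy, not a real tool; the patterns and names are whatever I invented for the example): a tiny check that runs over a draft and refuses to pass it while obvious placeholders or trailing-off sentences are still in there.

```python
# Pre-publication lint sketch: flag leftover placeholders and sentences that
# trail off, before an article goes live. Patterns and names are illustrative.
import re
import sys

PLACEHOLDER_PATTERNS = [
    r"\[(?:reference|citation needed|insert [^\]]*)\]",  # bracketed stubs
    r"\bTODO\b|\bTK\b",                                   # editorial markers
    r"\{\{[^}]*\}\}",                                     # unrendered template tags
]

def review(text: str) -> list[str]:
    """Return a list of human-readable problems found in the draft."""
    problems = []
    for pattern in PLACEHOLDER_PATTERNS:
        for match in re.finditer(pattern, text, flags=re.IGNORECASE):
            problems.append(f"placeholder left in copy: {match.group(0)!r}")
    # crude check for sentences that stop without completing the thought
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        if sentence.strip().endswith((":", ",", "such as")):
            problems.append(f"possibly incomplete sentence: {sentence.strip()!r}")
    return problems

if __name__ == "__main__":
    issues = review(sys.stdin.read())
    for issue in issues:
        print(issue)
    # non-zero exit so a CMS review stage can block publishing
    sys.exit(1 if issues else 0)
```

Wire something like that into the CMS as a required review stage and the author at least has to acknowledge every flag before the publish button works.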
With this article I wonder whether we're seeing a content-management screwup. It looks almost like it's rendering the metadata markup associated with text instead of the text itself.
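As a toy illustration of what I mean (completely hypothetical template syntax and field names), imagine a template layer that silently emits the raw placeholder whenever the field behind it was never filled in:

```python
# Toy illustration of a template layer leaking markup into published text
# when a field is missing. Template syntax and field names are hypothetical.
import re

ARTICLE_FIELDS = {
    "object_name": "the nebula",
    # "distance_ly" was never filled in by the author
}

TEMPLATE = "An image of {object_name}, roughly {distance_ly} light-years away, was released this week."

def render(template: str, fields: dict) -> str:
    # buggy behaviour: a missing field falls back to the placeholder itself
    return re.sub(r"\{(\w+)\}",
                  lambda m: str(fields.get(m.group(1), m.group(0))),
                  template)

print(render(TEMPLATE, ARTICLE_FIELDS))
# -> "An image of the nebula, roughly {distance_ly} light-years away, was released this week."
```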
For posterity, here are the relevant sentences which were left incomplete in the article:
What the hell is this nonsense? The article is written so generically, as if someone prompted an AI to write a template article full of speculation.
An AI wouldn't make mistakes like this. This sort of screwup requires a human touch.
It was just the first article on the topic I found that seemed somewhat coherent. But yeah, quality journalism is really hard to come by these days.
I think you might be right. Another article by the same author seems like it could be entirely made up, only citing Wikipedia for things like the definition of the word 'confidence'. I don't know what would prompt it to leave these 'fill in the blank' sections though.
Maybe they just wanted to leave it. You know, a sort of Mad Libs make-your-own-article thing, for fun. Can't be any worse than existing internet misinformation sources.