Because this article is posited (with its title and the little blurb at the top about the author) to be about the safety of AI.
Unless the title and blurb have changed, this is just wrong.
The title says nothing about safety: “How AI’s booms and busts are a distraction - However current companies do financially, the big AI safety challenges remain.”
Likewise the blurb says nothing about safety: “Kelsey Piper is a senior writer at Future Perfect, Vox’s effective altruism-inspired section on the world’s biggest challenges. She explores wide-ranging topics like climate change, artificial intelligence, vaccine development, and factory farms, and also writes the Future Perfect newsletter.”
What are you going on about? You’re mad because you couldn’t tell this was an Op/Ed?
(Sidenote: I didn’t notice that “effective altruism” thing before. Barf.)
The blurb suggests that this person writes specifically altruist articles (a suggestion that the writing is for someone’s benefit, which by proxy suggests that it’s telling the truth). Because opinions are subjective, that conflicts pretty harshly with the context of the piece. It gives the impression that this may be an opinion based on fact when it simply isn’t, because it cites no quantifiable factual data whatsoever. This is literally how misinformation is spread. It doesn’t have to be outright lies in order to be damaging.
The article talks about how new safety measures could be developed. It’s in the text. It just doesn’t conclude anything or discuss any specifics. That’s really my problem with it. What good is the author’s opinion? What are they basing it on? There’s no substance to this writing at all.
Possibly because you read the article. But whatever I guess. It is just my opinion, after all.