I mean, yeah that’s true, but harm reduction is also a thing that exists. Usually it’s mentioned in the context of drugs, but it could easily apply here.
Interesting take: addiction to the convenience provided by AI driving the need for more. I suppose at the end of the day it’s probably the same brain chemistry involved. I think that’s what you’re getting at?
In any case, this tech is only going to get better and more commonplace. Take it, or run for the hills.
deleted by creator
Ah, so more like self-harm prevention, gotcha.
I guess like any tool, whether it is help or harm depends on the user and usage.
deleted by creator
Oh, right. Microsoft is a corp. They don’t care about the harm they do until it costs them money.
e: also, I love to bash on MS, but they’re not the problem here. These things are being built all over the place… in companies, in governments, in enthusiasts’ backyards. You can tell Microsoft, Google, and Apple to stop developing the code, you can tell Nvidia to stop developing CUDA. It’s not going to matter.
deleted by creator
I suppose it could be harm reduction. Like peeling a band-aid off slowly instead of ripping it off.
They’re here. They might not be everywhere yet, but they’re here to stay, as much as photoshopped images or trick photography are. Just more lies to hide the truth.
All we can do now is get better at dealing with them.
deleted by creator
I’m heading for the hills then. I’m perfectly capable of thinking for myself without delegating that to some chatbot.
Everyone is. As time and tech progresses, you’re going to find that it becomes increasingly difficult to avoid without going off-grid entirely.
Do you really think corps aren’t going to replace humans with AI the moment they can profit by doing so? That states aren’t eventually going to do the same?