• hendrik
    1 month ago

    At some point I’d like to see that “addictiveness” quantified. I read that claim a lot, and I do think there is something to it. But unless we do some studies and measure it, it’s just speculation. It sounds reasonable but might be entirely false.

    Also, being addicted requires some negative effects (by definition). I’d like to have that quantified too. Do people waste more time with chatbots than the average person does watching TV, watching YouTube, playing single-player computer games, or doing other non-social things?

    This creates an echo chamber of affection that threatens to be extremely addictive.

    I’m not sure that applies to many people. I’ve heard from several people that they don’t like text chats with chatbots or doing roleplay with AI. The same applies to things like pen-and-paper roleplaying games: that mainly works for people who also like to read books and picture an imaginary world in their minds. It’s not palpable. And I don’t think it really replaces socializing with real people. There is so much missing: non-verbal cues, hearing someone laugh when you crack a joke, someone telling you you’re wrong so you get to learn something. I’ve tried it, and I definitely can’t feel love or affection as part of engaging with AI. And I don’t think a moving avatar on some screen will change that for me.

    And there are other factors at play. This week I read an article saying we’re having substantially less sex than 10 years ago. I’m not sure whether the number of singles and people who don’t want to engage in a relationship was on the rise anyway. That dynamic predates the availability of chatbots to the general public, so it’s not caused by them. We also have some countries where lots of people don’t have a partner, and some niche cultures in western countries as well.

    Plus we already have parasocial relationships with Twitch streamers, influencers and stars, and lots of people watch that. I’d argue that’s a similar thing, entirely without AI.

    Now AI gets added to the mix. But it’s not the cause of any of the things I just mentioned. I think it’s wrong to just say chatbots are a problem. The interesting question is: what do they do when added to the mix? Do they alleviate some pain? Do they make it worse? Are they able to contribute anything to the lives of their users?
    None of that is known as of now. And it has to be judged in context.

    And a hypothetical risk doesn’t do it for me. Sure, maybe AI can pose a big risk regarding addiction. But I could also literally die while driving to the supermarket. Just telling hypothetical worst-case scenarios doesn’t count for much.

    Btw, I think it’s a good article. It contains lots of references and links to other interesting content, and I learned something, for example what “sycophancy” is.

    “dark patterns”

    is a good example, too. Why worry about addictive AI when we allow TikTok and all the major internet platforms to design their services to be maximally addictive?

    This “regulation by design” approach could seek to make interactions with AI less harmful by designing the technology in ways that make it less desirable as a substitute for human connections […]

    And how would that fly in a free country? Do we also outlaw other addictive or harmful things? Books that could be misused? MMORPGs? Social media? Alcohol? Tobacco?

    In the end I have zero interest in that techno-dystopia where my AI mommy decides what is healthy or addictive for me and provides me with a dull, cushioned world. I’m an adult and want to be treated like one. I want to drink alcohol, ride my bike down a steep hill, and eat an unhealthy Big Mac if I like. Or have an AI waifu if I want one. And I don’t want some computer to dictate what’s good for me.

    But we definitely need more science and studies on the effects of AI. And solid numbers, to decide what the right thing to do is.