The author acknowledges that AI-based relationships, such as those offered by Replika, may provide a sense of intimacy and companionship for people who struggle to form connections in real life, possibly even alleviating loneliness or helping some users eventually find long-term relationships. However, the author is conflicted about the rise of these “girlfriend-like AI experiences” (GFAIE), arguing that they may also create negative externalities, such as reducing the incentive to seek out real relationships, destabilizing existing ones, and potentially leading to greater loneliness and social isolation. While AI bots may offer a cheap and convenient alternative to human intimacy, the author suggests that they can never truly replicate the mutuality and emotional depth of real relationships, and thus may ultimately do more harm than good.

by Llama 3 70B

  • @retrospectology
    7 months ago

    I don’t think the people getting into AI relationships are going to be people who would be in relationships in the absence of AI.

    • @pavnilschandaOPM
      7 months ago

      That has been addressed in the article:

      Assuming all data is secure and private – granted, not always a safe assumption – the main concern is that the widespread availability of quasi-intimacy on demand will result in fewer people bothering to find a real relationship. The result will be greater loneliness, weaker social bonds and slower population growth – none of which is good for the economy or humanity. Again, to analogise to GFE, I know from my research that some people would stay single with or without it, while others eventually will get over their issues and find human companionship. The concern is that AI would make it too easy for people never to risk intimacy.

  • SatansMaggotyCumFart
    7 months ago

    If you’re in a relationship and you seek out an AI companion, you are cheating.

    • @lemick24
      7 months ago

      Thank you for your wisdom, satansmaggotycumfart