The memes make themselves

  • TonyOstrich
    3 months ago

    I’m right there with you. I’m mildly on the spectrum, though functional enough that no one would assume it unless they really got to know me. While I am capable of a moderate amount of success in the initial stages of dating, compatibility and styles of communication always end up being the primary issue, and that’s not necessarily a failing of either party. The fact remains, though, that there do not seem to be many people out there that I am compatible with.

    I am fortunate in that I am generally very content being alone and pursuing things that interest me for very long periods of time (the only thing I really noticed during the Covid lockdowns was that my commute to work was faster, while my much more social partner at the time suffered a lot mental-health-wise). Despite that, I am, unfortunately, still human and desire a certain amount of intimacy and connection with another person. I’m not sure if it will ever happen, but if you have seen the new Blade Runner movie, I could absolutely see myself with a “companion” like the holographic one the main character has in his apartment.

    • @rottingleaf
      3 months ago

      Well, my notable spectrum-related personal trait is that I want to know the truth about things, so falling in love with pictures just won’t work; I won’t be able to expel from my mind the fact that it’s not real in any regard.

      Unless a machine consciousness of the human kind becomes real.

      Haven’t seen the new one yet.

      • TonyOstrich
        3 months ago

        That’s completely understandable.

        I’m not sure if you are speaking generally or to what I said specifically, but it may be worth adding a little more context to what I said originally. It is highly unlikely I would fall in love with an AI in the scenario I described above. It would be more about scratching an itch for certain kinds of interactions that I may not otherwise be able to have. I’m not sure it’s a perfect analogy, but it might be similar to the way I care about a character in a book or game, or maybe how I feel about a pet.

        I think if we got to the point where an AI had human levels of general intelligence and emotion, this conversation would be somewhat moot. The world would be so drastically different that I don’t even know what kinds of assumptions to make in order to have a productive conversation about it.