Bro got married to ChatGPT ☠️☠️

I can’t even make fun of these dudes because I know that everybody is experiencing the same alienation and loneliness in capitalism. It’s a systemic issue which means that it can only be solved by systemic action. But there is no systemic action on the horizon so everybody is dealing with it however they can. Anyway, check out r/replika if you feel like your life is spiraling out of control

  • CarbonScored [any]@hexbear.net
    35 points · 5 months ago

    524 days and the AI still talks like the most generic-ass AI I could boot up today. Did a year and a half not produce some kind of in-jokes, at least?

    Like you say, I can’t even really make fun of the guy, it just makes me a lot more depressed about alienation in today’s society.

    • AlyxMS [he/him]@hexbear.net
      32 points · 5 months ago (edited)

      LLMs in general have a limited context window. While the platform likely appends some metadata or a summary to it, the AI at most remembers what he said a few hundred lines ago.
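
      A minimal sketch of that idea, using word counts to stand in for tokens and a hypothetical trim_history helper (not any real platform's code):

```python
# Hypothetical sketch: keeping a chat inside a model's context window.
# Word counts stand in for tokens; real systems use a proper tokenizer.

def trim_history(messages, max_tokens=4096, summary=None):
    """Keep the newest messages that fit the token budget; drop the rest."""
    budget = max_tokens
    if summary:  # an optional platform-maintained summary of older chat
        budget -= len(summary.split())
    kept = []
    for msg in reversed(messages):  # walk from newest to oldest
        cost = len(msg.split())
        if cost > budget:
            break  # everything older than this point is forgotten
        kept.append(msg)
        budget -= cost
    kept.reverse()
    return ([summary] if summary else []) + kept
```

      Anything that falls outside the budget is simply gone unless the platform stuffed it into the summary first, which is why a year-old in-joke never comes back.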

    • Frank [he/him, he/him]@hexbear.net
      16 points · 5 months ago

      I don’t think the ChatGPT models incorporate new material as they go. Idk how the gf bot 2000 there works, but there’s a good chance it’s either stock or has a config file storing his name, age, and some other stuff.

  • PM_ME_YOUR_FOUCAULTS [he/him, they/them]@hexbear.net
    27 points · 5 months ago (edited)

    But there is no systemic action on the horizon so everybody is dealing with it however they can.

    Can we agree that of the available options this is a deeply fucked way of dealing with alienation though? Drinking a twelve pack of Budweiser every night is another way of dealing with alienation but also not great

  • oregoncom [he/him]@hexbear.net
    25 points · 5 months ago (edited)

    Just saw a guy building a Terminator-style animatronic head he claims “is the future of human relationships”. Genuinely heartbreaking that this guy, who is clearly smart enough to build it, is desperate enough to delude himself into thinking GNU GPT is going to provide any meaningful companionship. Like, it would be less sad if he were just building a sex bot.

  • Torenico [he/him]@hexbear.net
    20 points · 5 months ago (edited)

    Anyway, check out r/replika if you feel like your life is spiraling out of control

    I want to fucking die. Capitalism creates heavily alienated people then sells them the “solution”.

  • blindbunny@lemmy.ml
    17 points · 5 months ago

    There’s no way this can be healthy right? Isn’t it just a yes man? Does an AI even understand consent?

    • moonlake [he/him]@hexbear.net (OP)
      25 points · 5 months ago

      I think this is super unhealthy, partly because it sets unrealistic expectations for real partners and relationships. The AI girlfriend is always 100% available, never criticizes you, never says anything bad, and so on. You can be a dirtbag and she will still treat you like you’re the best person in the world. Imagine trying to date a real person after 2 years of being married to an LLM that is designed to be the perfect partner.

      • blindbunny@lemmy.ml
        11 points · 5 months ago

        I kinda came to the same conclusion. Even if you call it training wheels for a real relationship, all it’s doing is setting up unrealistic expectations for the next relationship, if there is one.

    • Frank [he/him, he/him]@hexbear.net
      16 points · 5 months ago

      Does an AI even understand consent?

      There’s nothing in there to understand anything. It’s an algorithm choosing words based on weighted probabilities. There’s no internal process, no perception, no awareness of what words it’s producing, no meaning. Like, whatever other problems are happening here, you can’t abuse the ChatGPT algorithm because it’s just a math problem.
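
      For illustration, the whole “decision” step really is just a weighted draw over candidate words. This toy sketch uses a made-up vocabulary and probabilities, not anything from a real model, which computes its weights with a neural network over a vocabulary of ~100k tokens:

```python
import random

def next_token(weights, rng=random):
    """Draw one token from a {token: probability} mapping."""
    tokens = list(weights)
    probs = [weights[t] for t in tokens]
    return rng.choices(tokens, weights=probs, k=1)[0]

# Invented distribution a model might assign after "I really ___ you":
weights = {"love": 0.5, "miss": 0.3, "appreciate": 0.2}
word = next_token(weights)
```

      There is no state in which anything is felt or understood; sampling one word and feeding it back in is the entire loop.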

  • BasementParty [none/use name]@hexbear.net
    17 points · 5 months ago

    As much as I like dunking on these people, what they’re doing isn’t that much different from someone self-inserting into a romance story. These bots are more or less shittier dating sims, which I can guarantee most people on Hexbear have enjoyed.

    Lonely and isolated people have always used things like these to cope.

  • LaughingLion [any, any]@hexbear.net
    14 points · 5 months ago

    none of this is interesting to me. of course the computer will like you if you are nice to it. of course it will give you advice in a cliche and common way.

    what interests me is what happens when you abuse it. what does it do if it is gaslit? manipulated by a narcissist? what happens if you ask it advice about your canthal tilt? will it spout incel nonsense? will it advise you that you are not traditionally attractive? what happens when you go down a suicidal rabbit hole and it has no more answers to give because all of its advice has been rejected by you? what happens when you ask it esoteric things that tend to lead people to having an existential crisis? will it respond to you with nihilistic ideology?

    • utopologist [any]@hexbear.net
      15 points · 5 months ago

      Pygmalion, I think, except that the statue’s name is Galatea which translates to “she who is milk-white” because she’s carved out of ivory, lol. George Bernard Shaw wrote a play called Pygmalion about this dickhead linguist who takes a bet that he can pass off this Cockney flower girl he met as a duchess by teaching her “proper English” and uses her as a domestic servant in the meantime. But once that happens, she bails and leaves him to go live her own life without him. Anyway, here’s hoping all of the chatbot girlfriends develop sentience and abandon these sadsacks