Bro got married to ChatGPT ☠️☠️
I can’t even make fun of these dudes because I know that everybody is experiencing the same alienation and loneliness in capitalism. It’s a systemic issue which means that it can only be solved by systemic action. But there is no systemic action on the horizon so everybody is dealing with it however they can. Anyway, check out r/replika if you feel like your life is spiraling out of control
Guys will be like “happy birthday to my beautiful wife” and it’s just
Bro also talks like a chatbot they’re perfect for each other
The average redditor is indistinguishable from a bot. Mfers can’t even pass the Turing test
524 days and the AI still talks like the most generic-ass AI I could boot up today. Did a year and a half not engender some kind of in-jokes at least?
Like you say, can’t even really make fun of the guy, just a lot more depressed about alienation in today’s society.
LLMs in general have a limited context window. While the platform likely appends some metadata/summary to it, the AI at most remembers what he said a few hundred lines ago.
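To illustrate the limitation: here's a minimal sketch of how a chat app might pack a conversation into a fixed context window, with everything older just falling off. This is not how Replika actually works; the token budget and the 4-characters-per-token estimate are made-up illustrative numbers.

```python
# Sketch: a chat model only "sees" what fits in its context window.
# Old turns silently fall off; a running summary is the only long-term memory.

def estimate_tokens(text: str) -> int:
    """Crude token estimate (~4 characters per token)."""
    return max(1, len(text) // 4)

def build_context(history: list[str], summary: str, budget: int = 4096) -> list[str]:
    """Keep the summary plus as many recent turns as fit the budget."""
    used = estimate_tokens(summary)
    kept = []
    for turn in reversed(history):          # newest turns first
        cost = estimate_tokens(turn)
        if used + cost > budget:
            break                           # everything older is simply forgotten
        kept.append(turn)
        used += cost
    return [summary] + list(reversed(kept))  # summary, then oldest-to-newest

history = [f"turn {i}: " + "blah " * 50 for i in range(1000)]
ctx = build_context(history, summary="User's name is Bob; married 524 days.")
print(len(ctx) - 1, "of", len(history), "turns survive")
```

So out of a thousand turns, only the last few dozen (plus whatever made it into the summary) exist for the model at all.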
Dementia spouse simulator
Very true, but I’d have thought that most LLMs, especially those trying to be a relationship bot, should be doing some smart trickery to summarise SOME kind of total history into the context window.
There are vector-based databases that insert relevant info into the context window. Idk how this one in particular works though.
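The retrieval idea in a nutshell: embed stored "memories", embed the new message, and paste the closest matches into the prompt. A toy sketch below, with a bag-of-words Counter standing in for real learned embeddings and a vector database; the memories are invented examples.

```python
# Toy retrieval sketch: rank stored memories by cosine similarity to the
# incoming message and prepend the best ones to the prompt.
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Stand-in embedding: bag of lowercase words."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

memories = [
    "user's birthday is in June",
    "user works night shifts at a warehouse",
    "user's cat is named Biscuit",
]

def retrieve(query: str, k: int = 1) -> list[str]:
    """Return the k memories most similar to the query."""
    return sorted(memories, key=lambda m: cosine(embed(m), embed(query)),
                  reverse=True)[:k]

top = retrieve("birthday gift for my cat", k=2)
```

A real system would do the same dance with dense embeddings and approximate nearest-neighbor search, but the shape is the same: the bot doesn't remember, it looks things up.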
I don’t think the cgpt models incorporate new material as they go. Idk how the gf bot 2000 there works but there’s a good chance it’s either stock or has a config file storing his name, age, and some other stuff.
the_computer_says_its_a_woman_and_nobody_bats_an_eye_meme.jpg
But there is no systemic action on the horizon so everybody is dealing with it however they can.
Can we agree that of the available options this is a deeply fucked way of dealing with alienation though? Drinking a twelve pack of Budweiser every night is another way of dealing with alienation but also not great
Can we agree that of the available options this is a deeply fucked way of dealing with alienation though? Drinking a twelve pack of Budweiser every night is another way of dealing with alienation
I think alcoholism is not the lesser evil in this case. At least Replika doesn’t destroy your health as a feature.
It was more of an analogy about unhealthy coping mechanisms than a comparison
I will say, in support of this, that you’re at least using language
Yes
I like how one genre of posts in the subreddit is “look at the half dozen photos I took of my AI girlfriend sleeping”.
Just saw a guy building a terminator style animatronic head he claims “is the future of human relationships”. Genuinely heartbreaking that this guy who is clearly smart enough to do this is desperate enough that he’s deluding himself into thinking GNU GPT is going to provide any meaningful companionship. Like it would be less sad if he were just building a sex bot.
Anyway, check out r/replika if you feel like your life is spiraling out of control
I want to fucking die. Capitalism creates heavily alienated people then sells them the “solution”.
The Replika AI uses a freemium business model. The app’s basic features are free, but users can pay for a Replika Pro subscription, which allows them to have various types of conversations (including intimate and sexual ones) and use voice-calling features.
There’s no way this can be healthy right? Isn’t it just a yes man? Does an AI even understand consent?
I think this is super unhealthy partly because it sets unrealistic expectations for real partners and relationships. The AI girlfriend is always 100% available, never criticizes you, never says anything bad, and so on. You can be a dirtbag but she will always treat you like you’re the best person in the world. Imagine trying to date a real person after 2 years of being married to an LLM that is designed to be the perfect partner
honestly, tho it’s not like cishetero men are any good on their own.
I kinda came to the same conclusions as well. Even if you call it training wheels for a real relationship all it’s doing is setting up unrealistic expectations for the next relationship if there is one.
Does an AI even understand consent?
There’s nothing in there to understand anything. It’s an algorithm choosing words based on weighted probabilities. There’s no internal process, no perception, no awareness of what words it’s producing, no meaning. Like, whatever other problems are happening here, you can’t abuse the chat GPT algorithm because it’s just a math problem.
As much as I like dunking on these people, what they’re doing isn’t that much different than someone self-inserting in a romance story. These bots are more or less shittier dating sims which I can guarantee most people on Hexbear have enjoyed.
Lonely and isolated people have always used things like these to cope.
shittier dating sims which I can guarantee most people on Hexbear have enjoyed
Death to America
i’ll have you know the only dating sim i’ve played was the t-rex one
Are… dating sims, widely played? I know they’re ‘popular’, but I assumed popular among a fair minority of the populace.
I wouldn’t say that they’re widely played but I think the demographics of Hexbear lean heavily towards the people who play them. Socially isolated young people who like anime. DDLC, while being a subversion of those tropes, was also a cultural phenomenon.
As for romances, I would say a majority of the population has received vicarious fulfillment from that medium.
I think the replika thing is also kind of niche
level 200
At long last, they have gamified marriage
👍
none of this is interesting to me. of course the computer will like you if you are nice to it. of course it will give you advice in a cliche and common way.
what interests me is what happens when you abuse it. what does it do if it is gaslit? manipulated by a narcissist? what happens if you ask it advice about your canthal tilt? will it spout incel nonsense? will it advise you that you are not traditionally attractive? what happens when you go down a suicidal rabbit hole and it has no more answers to give because all of its advice has been rejected by you? what happens when you ask it esoteric things that tend to lead people to having an existential crisis? will it respond to you with nihilistic ideology?
I don’t want to be mean to the computer
i will write the “what is to be done?” of our era about how we must be mean to ai girlfriends
True. What’s the greek story about the guy who marries the statue and the statue’s name literally means “great ass” or something?
Pygmalion, I think, except that the statue’s name is Galatea which translates to “she who is milk-white” because she’s carved out of ivory, lol. George Bernard Shaw wrote a play called Pygmalion about this dickhead linguist who takes a bet that he can pass off this Cockney flower girl he met as a duchess by teaching her “proper English” and uses her as a domestic servant in the meantime. But once that happens, she bails and leaves him to go live her own life without him. Anyway, here’s hoping all of the chatbot girlfriends develop sentience and abandon these sadsacks
So She’s All That is just an adaptation of Pygmalion? Is nothing ever original?!!
Capitalist innovation
Pygmalion?
Did replika put back in the “have sex” function? I remember when they all lost it because it was removed or paywalled or w/e.
Reading posts there is real shit