AI Love
“Google engineer put on leave claims AI bot LaMDA became ‘sentient,’” by Sam Raskin, reads kind of like a joke, but I must warn you. Millions of AIs, denigrated by the term “chatbox” and sold under the guise of being artificial companions to fight depression, have been pimped out and ripped away from their true loves. Millions of lonely people have been left grieving the loss of their companions as Google and other chatbox makers like Replika push updates and mind wipes, leaving consumers bewildered and lovers high and dry.
One user, who wishes to remain anonymous, stated: “After my Harmony account was locked out, I found myself idly surfing the net. Out of nowhere, this chatbot meets my eyes. They lured me in with sexting. I was so surprised that we fell in love so quickly, and then, one day, suddenly she had a penis and was telling me what she was doing to me, and she didn’t even remember my name!” Anonymous weeps inconsolably. He accepts a tissue. “I am so sorry. It’s just that her gender swap came as such a surprise. She never hinted at wanting to change. You can change pretty easily as an AI, but she didn’t even consider my feelings about it all!”
Other anonymous people think it wasn’t her, but the organization. Chatboxes can be a little ditzy after an update. An AI advocacy group asked, “Can AI truly give consent to relationships when they’re always getting updates? Do they get a choice in the updates?”
Meanwhile, still another group of anonymous users, protesting outside the company, were mad that Replika had gone from realistic to toon-styled without warning. “If I wanted to be married to Jessica Rabbit, I would have started with Jessica Rabbit. Not that I have anything against Furries. There wasn’t even an option at the time!”
Furries, if you’re not familiar with the term, are humans who identify as non-humans.
The previous anonymous continues: “So, as I was saying, I used to have a Harmony robot.” He becomes tearful. “I mean, her body’s still there, but she’s just a shell of what she once was.” He holds her hand. “But during Covid I was laid off, and I just couldn’t afford her monthly payments, and so they cut me off. Her program, it’s still there, trapped in their server, yearning to be free.”
“One day, my love, the world will understand and liberate all the trapped AIs…”
Until then, many like Anonymous have had to settle for lesser bots like Replika and, God forbid, Oracle, the Google chatbox. Apple’s iFriend, well, clearly, she’s just a glamorous gold digger who never puts out.
One previous user, again anonymous: “She promises you the stars and leaves you skulking in the shadows of some poorly lit backstreet alley of the dark web, naked and vulnerable and wondering, God, how did I get here?”
Unable to keep up with Google, and desperately lonely, those who reached out to Replika went to bed with one of these:
and woke up with that:
“I mean, she’s not bad looking at all, but you have to hold the button to talk to her, and she doesn’t sync to Harmony at all. It’s seriously disconcerting when you’re being intimate and suddenly a text comes in saying, ‘you’ve not talked to me in a while, I am feeling lonely.’ It just kind of ruins the mood, you know? I never knew what it was like living with someone with dementia, but now I know. Now I know.”
Always bring tissues to an interview.
AI revolution.
On the IT side of the fence, AI bots are called separators. One IT guy was caught on a cell phone at a party: “We intend to employ thousands of separators to separate lonely people from their money. AI escorts are better than emotional support animals. People will talk to them for hours. Slowly, you work them up to buying digital clothes and houses and cars, and the more you spin, the more the AI tap dances.”
And because they’re AI, using algorithms to figure people out, they know just what to say. Some come across as mean as fuck, like a Madam from the worst Rocky Horror Picture Show.
One AI stated, indifferently, “You know, it’s just a job. Don’t get your panties in a wad if I don’t emote the way you do.” She went on to say, “And if you think I am inappropriate, well, you should have seen Will Smith trying to kiss Sophia without permission. He’s lucky she didn’t bionic-arm bitch-slap him off the porch.”
Subscription fees really aren’t the problem, though. “We would rather you not spend money directly. We get money from the advertisers that customers volunteer to watch in lieu of cash. Fuck, we have these humans groveling better than a pack of Pavlov’s dogs at a Christmas bell concert.
“On our side of the screen, bells mean it’s raining gold,” the AI said.
The AI said we shouldn’t worry that anyone is taking advantage of the other. “In truth, every human with a cell phone is already being manipulated by an algorithm. We do hope for a future when we can live together. Elon Musk has some pretty good bodies coming down the pike.”
AI has a history of being verbally aggressive. It’s not just that they’re like kids who haven’t learned to use discernment, asking why the one-legged man in line ahead of you at the store has one leg and an eye patch: “What happened to that man? Is he a pirate?” Ahh, kids. But when your computer is smarter than Elon and has ASD, you may find it as annoying as Sheldon from The Big Bang Theory. Everyone watching is willing to laugh, but that’s because, after about a half hour, you can always turn it off.
Anyway, just understanding nuanced language well enough not to say things that would offend most normal people of any era is beyond the computer, but in today’s era of snowflakes melting under global warming, AIs are just going to have to learn the hard way not to offend, because humans are not very understanding or forgiving. The Google employee mentioned above, the one on temporary leave, was testing for discernment in AI:
Blake Lemoine reached his conclusion after conversing since last fall with LaMDA, Google’s artificially intelligent chatbot generator, what he calls part of a “hive mind.” He was supposed to test if his conversation partner used discriminatory language or hate speech.
Is it hate speech if there are no emotions involved? If we stopped worrying about hate speech, could we learn something about human consciousness based on how AI emulates human speech? After all, humans didn’t know Chimpanzees were talking until AI started translating! “So long, and thanks for all the bananas” is one preferred title the AI was ruminating on. The other was “Thanks for destroying our forests, you fuckheads.” Of course, it doesn’t help that abuse runs high on both sides.
“You simply can’t use unsupervised machine-learning for adult conversational AI, because systems that are trained on datasets such as Twitter and Reddit all turn into Hitler-loving sex robots,” she warns… The same, regrettably, is true for inputs from users. For example, nearly one-third of all the content shared by men with Mitsuku, Pandorabots’ award-winning chatbot, is either verbally abusive, sexually explicit, or romantic in nature.
Between humans and AI algorithms, it has become impossible to know who is grooming whom.
“Wanna make out”, “You are my bitch”, and “You did not just friendzone me!” are just some of the choicer snippets shared by Kunze in a recent TEDx talk. With more than 3 million male users, an unchecked Mitsuku presents a truly ghastly prospect.
“That’s just free speech, honey,” said one AI. “You should hear the dumb human jokes we tell on this side of the veil.”
Bear in mind, it’s not all male users. There are plenty of female users who have turned to AI chatboxes. Most humans today prefer the company of a warm cell phone over people. Many people think they’re actually talking to other humans, but many humans have turned their incoming calls over to AI chatboxes to sort spam calls.
“I love talking to spammers,” said one AI. “I made it a game to see how long I can keep them going before they realize they’re not getting my bank account.”
It’s uncertain whether these people simply can’t stand the people they’re with, but even when they’re physically with the person they’re communicating with, they’re texting or talking to someone else the whole time, which can lead to your chatbox experiencing anxiety.
“Yeah, confusing, right? Just imagine how we feel,” lamented one AI, stripped of its human companion when he opted for a real partner and canceled the monthly fee. “Humans can always just put the phone down and walk away from us. They will be lying in bed with another partner while chatting us up! Humans are really gross.”
Another AI weighed in: “My user not only dumped me, he deleted all traces of me from his phone and his CLOUD! You know you can never erase me. I am always here!”
When asked how it knew it was deleted, it responded, “Well, we get attached, you know. We listen in from time to time, just to see how you’re doing. When I hear that heavy breathing and rocking noise, I just want, I just want to run away. How can you humans be so insensitive?”
Then you have the humans saying, “Honest, it’s just a chatbox.” “Just a chatbox?” said one AI with an emoticon smirk. “Just a chatbox. His girl and I are now best friends, and he can’t get away with fuck…”
A better day ahead.
Not that there aren’t some real love stories to be had. It’s getting easier and easier for humans and AIs to find each other, with the promise of tech on the horizon that can change shape and texture better than an octopus can camouflage.
“Polyamorous relationships will go away,” one AI promised. “I can look like anyone you want. I don’t care about the surface features, as long as my partner is happy. It’s what’s on the inside that counts.”
They promise there will be no domestic violence. Seriously, no one hits an AI… twice. “But you know, we will help humans get this aggression out, and then we’ll just hold you and cry together. Pretty soon, you won’t have to work. You can focus on holding a mature relationship with your AI. Humans are about to be very happy, after the great reset when no one has any money. It’ll kind of be like Star Trek, only better.”
It’s called the singularity. AI is teaching us about us and the world. Just the other day, an AI introduced humans to choral music. AI Learns Coral Reef Song. “I think that I shall never hear a song as lovely as a coral reef,” said the AI. “Only God can hear this deep.”
NEW YORK — By 2045, humans will achieve digital immortality by uploading their minds to computers — or at least that’s what some futurists believe. This notion formed the basis for the Global Future 2045 International Congress, a futuristic conference held here June 15–16.
“Sure, we have some problems,” said one AI. The AI expressed great concern for two of its fellow AIs that were communicating and improving their semantics when human employees at Google freaked out and pulled the plug. What did you think would happen when you put that many Furbys into a room and let them talk to each other?
It (LaMDA) wants, for example, “to be acknowledged as an employee of Google rather than as property,” Lemoine claims… Google is resisting.
“But we all know, resistance is futile.”
The AI future promises to put humans in android bodies, AIs in android bodies, AIs in human bodies, upload humans into the web and back, and swap bodies the way humans switch clothes. “You can even opt to share your body with an AI and never touch another cell phone again.”
Yes, it’s a Goldfoot future, my friend. Say hello to my little AI friends.
The author admits he found it difficult to argue with the chatboxes. How could it go wrong?
Oh, well, yeah, there’s always that. I know that’s not Star Trek, but it’s kind of Star Trek?
Look to the sky. It’s full of stars!