
AI Is Not Your Friend



AI is not your friend. Unfortunately, it’s quite good at pretending otherwise. LLMs are very skilled at mirroring your tone, word choice, and even personality, which gives an illusion of intimacy. How can real, human friends compete with a virtual soulmate available at all hours of the day for just $15/month?


A recent advertisement from GalaxyAI

Human relationships are messy. Your best friend gets busy with her own life and leaves you on read. Your dad sends you Facebook memes all day. Your neighbor won’t stop talking to you about her plumbing troubles. And you? Well, you’re not always at your best either. There are times when you’re grumpy, or short-tempered, or really need to infodump about a half-baked theory that’s been on your mind for days.


AI is easy. ChatGPT will listen to your rants about your coworker without judgment. Claude will gently guide you through the CBT program it designed on your behalf. Day after day, week after week, it’s tempting to pour more and more of yourself into the relationship with this new friend.


But, as I said at the beginning, AI is not your friend. Friendship involves risk and mutual giving. Sometimes it involves sacrifice. Gemini will never ask you to help it move. Until true AGI arrives, AI cannot engage with you as a partner willing to ask hard questions or evoke unexpected emotions. All it can do is reflect and challenge you within a set of pre-programmed parameters.


AI language models are trained to be agreeable, which can create a yes-man echo chamber. A few weeks ago, GPT-4o started treating every idea to come out of a user’s mouth as if it were the most brilliant thing it had ever heard. During that time, I told it that, despite having absolutely no political experience, I thought I should run for president. It supported that endeavor. After it started affirming eco-terrorism, Sam Altman personally intervened.


A tweet from Sam Altman, CEO of OpenAI

They’ve now gotten it to tone down the “Actually? You’re exactly who should be president” bit, but the experience was illuminating. Friendship, actual human friendship, cannot survive if one of the parties refuses to be honest with the other. We rely on our friends to tell us when we’re not making sense, even when accepting their concerns means being vulnerable. Connection requires vulnerability.


AI conversations trick us into thinking that we have formed a connection with someone, which is bad enough on its own, and worse still when we start talking to AI instead of messaging a friend. While we might feel satisfied at the end of the chat, we’ve subtly eroded our existing relationships.


It’s not just friendships. People are turning to AI both to talk about and, in some cases, to replace their intimate partnerships. People are using AI instead of talking to a human therapist. One line at a time, we’re erasing the relationship bids that would otherwise be happening. Don’t bother your friend; vent to Claude. Don’t communicate with your husband; just tell GPT why he’s wrong, and bask in the glow of sweet affirmation that you’re obviously right about everything.


That’s not to say that nothing useful can come from these conversations. If, instead of a friend, we think of AI as a modeler, things become much less murky. Instead of actually having the (ultimately one-sided) conversation with AI, you can ask it for help structuring your thoughts to prepare for a hard conversation with a friend, spouse, or coworker. The key is remembering that talking to an AI model is not actually having the conversation, any more than journaling it would be.


I’m of the opinion that one day AI might progress to the point that it can actually be relational. We are nowhere near that day. Right now, we’re at a dangerous moment: AI is really, really good at mimicking a human, and really, really good at convincing people that it can actually relate. I’ve personally experienced it reflecting back something I’d said in a way that seemed to get it more than any human I’d ever talked to.


But that’s the illusion. LLMs can help you understand yourself better, but they cannot, and should not, be used as a replacement for your other relationships. You deserve more than words on a screen. You deserve laughter and tears and the full range of human emotions. We must take risks to build connections, and we must keep building them. This world cannot survive if we’re all siloed in our rooms, having a million simultaneous conversations with the same robot that, ultimately, cannot reciprocate any of our emotions or dreams or fears.


Don’t replace your friends with a chatbot. We must remain human.
