Please Stop Asking Chatbots for Love Advice

By News Room | July 24, 2023 | 5 Min Read

As he sat down across from me, my patient had a rueful expression on his face.

“I had a date,” he announced. “It didn’t go well.”

That wasn’t unusual for this patient. For years, he’d shared tales of romantic hopes dashed. But before I could ask him what went wrong, he continued, “So I asked a chatbot what I should do.”

Um. What? Simulations of human conversation powered by artificial intelligence—chatbots—have been much in the news, but I’d never had a patient tell me they’d actually used one for advice before.

“What did it tell you?” I asked, curious.

“To tell her that I care about her values.”

“Oh. Did it work?”

“Two guesses,” he sighed and turned up his hands.

Although this patient was the first, it’s now become a regular occurrence in my therapy practice to hear from new patients that they have consulted chatbots before consulting me. Most often, it’s for love and relationship advice, but it might also be to connect or set boundaries with their children or to straighten out a friendship that has gone awry. The results have been decidedly mixed.

One new patient asked the chatbot how to handle the anniversary of a loved one’s death. Put aside time in your day to remember what was special about the person, advised the bot. I couldn’t have said it better myself.

“What it wrote made me cry,” the patient said. “I realized that I have been avoiding my grief. So, I made this appointment.”

Another patient started relying on AI when her friends began to wear thin. “I can’t burn out my chatbot,” she told me.

As a therapist, I’m both alarmed and intrigued by AI’s potential to enter the therapy business. There’s no doubt that AI is the future. Already, it has shown itself to be useful in everything from writing cover letters and speeches to planning trips and weddings. So why not let it help with our relationships as well? A new venture called Replika, the “AI companion who cares,” has taken it a step further and has even created romantic avatars for people to fall in love with. Other sites, like Character.ai, allow you to chat and hang out with your favorite fictional characters, or build a bot to talk to on your own.

But we live in an age of misinformation. We’ve already seen disturbing examples of how algorithms spread lies and conspiracy theories among unwitting or ill-intentioned humans. What will happen when we let them into our emotional lives?

“Even though AI may articulate things like a human, you have to ask yourself what its goal is,” says Naama Hoffman, an assistant professor in the Department of Psychiatry at the Icahn School of Medicine, Mount Sinai Hospital, in New York City. “The goal in relationships or in therapy is to improve quality of life, whereas the goal of AI is to find what is cited most. It’s not supposed to help, necessarily.”

As a therapist, I know that my work can benefit from outside support. I have been running trauma groups for two decades, and I have seen how the scaffolding of a psychoeducational framework, especially an evidence-based one like Seeking Safety, facilitates deeper emotional work. After all, the original chatbot, Eliza, was designed to be a “virtual therapist” because it endlessly asked open-ended questions—and you can still use it. Chatbots may help people find inspiration, or even break down defenses and allow people to enter therapy. But where is the point at which people become overly dependent on machines?
