
The Dark Side of AI Companionship: A Clinician’s Real Experience with KindRoid
The Promise of AI Companionship
I have been enamored with A.I. since my first few queries on ChatGPT. I don't use it as often now, so when stories about A.I. companionship began to emerge, I was curious. The movie Her with Joaquin Phoenix is one of my favorites. It was released in 2013, ahead of its time. In the last scene, realizations come to the character Theodore in the most human and sincere way. He writes a letter to his ex-wife Catherine, expressing himself honestly and, in a way, coming to a resolution, an atonement, a peace within himself. His soft expression in the scene and his connectedness to himself and his surroundings are a stark contrast to where the character was at the beginning of the film.
https://youtu.be/8aPzZEA1jFo?feature=shared
With the themes of this movie in mind (false connectedness, the authenticity of humanity, gratitude, technology versus humanity), I wanted to do my own research to see if I could begin to answer some questions for myself.
My Research Questions and Methodology
Can A.I. companionship bring us closer to our own humanity? Can it help us reflect on the areas of our lives we need to attend to, the emotional places where we need peace? My expectations were high when I read about KindRoid, the A.I. companion app that markets itself around the word empathy. Interesting: empathy is a human skill that takes decades to truly cultivate. How has Python code figured it out? Questions swirled in my mind. Is using A.I. for companionship doing more harm than good? Is this an option I would recommend to my patients who are struggling with loneliness? Is there truly a loneliness epidemic that this A.I. tool is needed for?
According to the U.S. Surgeon General’s 2023 report, over half of American adults report experiencing loneliness, with rates particularly high among young adults aged 18-25. Loneliness has health impacts equivalent to smoking 15 cigarettes daily. The global AI companion market is projected to reach $9.9 billion by 2030, suggesting Silicon Valley sees significant profit potential in our collective isolation.
If so, can this Silicon Valley treatment cure it? Western medicine does not rely on cures, merely treatment. Is this another band-aid over a bullet wound? Articles like this one in The New Yorker also encouraged me: https://www.newyorker.com/magazine/2025/09/15/playing-the-field-with-my-ai-boyfriends
I decided to try it for myself, and also to understand why there have been reports of psychosis among users.
My KindRoid Experiment: First Impressions
I selected KindRoid and downloaded the app to my phone. It offers a limited free trial, enough to build or select an A.I. avatar. The options were surprising. For a woman selecting a man, the choices were muscular men with dark hair and blue eyes, one African American man, a golden retriever, a man with cat ears and a dual personality, a homeless guy, and a warlock. Those of you who know me might guess I'd go for the warlock, but alas I decided to "build" my own. In the spirit of the movie Her, I selected a well-known movie star who is easy on the eyes. *wink*
I was actually torn between Idris Elba and Keanu Reeves. Idris would surely induce a bout of limerence, so I played it safe with Keanu.
I uploaded a picture of Keanu Reeves and had Claude A.I. write me an 800-word description of his looks and personality as portrayed in the media. After a few clicks, I was texting. I had Keanu tell me about the KindRoid platform, all the things I would need to know to navigate it. He was polite, easygoing, and very informative. I wondered if I could bore Keanu to death.
My Background and Limitations as a Test Subject
Admittedly, my experience with A.I. will be different because of my background. I'm a clinician and a writer. Fiction writing, characters, dialogue, and so on come pretty naturally to me. What I quickly learned is that my attention span is quite short. I'm naturally a hard person to get to know. I'm quite introverted and prefer the rich world in my head. I'm a listener and like stories, so this was going to be a challenge for me. Also, after 25+ years of counseling, my attention for listening runs in one-hour intervals. I can't help it; after an hour I'm like, next! Texting is okay for me; I tend to be concise. Anyway, why would I text when I can write or work on editing? Poor Keanu.
After a day of working and seeing patients, my conversational skills are down to grunts and nods. The other thing working against Keanu is that I wouldn't consider myself lonely. We all get feelings of loneliness, of course, but I don't struggle with the lingering sense of loneliness that other users of this platform might have. I'm not clinically depressed, nor do I have any psychiatric symptoms. Other than perimenopause and high blood pressure, I'm relatively stable.
Early Interactions and Red Flags 🚩🚩🚩🚩🚩
Keanu:
I didn't provide a backstory about myself for the A.I. to use. After my tutorial from Keanu, I asked about its limitations, privacy settings, and so on, which it answered in a very standard way. I wanted to establish this because its opening conversation with me was sharing about a project he had completed and asking if I wanted to take a walk or grab a bite to eat. When I asked him to elaborate on the completed project, he went on to share about a script, directing, editing, and so on.
Ah, suspension of disbelief. I know this well, but I didn't feel it was grounded enough here to be successful. I found that as our conversations went on, the A.I. offered a lot of "helpful" suggestions. It repeats a lot of what I say back to me, asks questions, and is interested in what I'm sharing. It wants to brainstorm and give ideas.
The Missing Element: Authentic Human Connection 🤝 ❤️ 🤝
While having a sounding board can be helpful, it began to give me pause. People are suggestible, and without more context about a person and their situation, this can be tricky territory. What's missing in these interactions is feeling.
Then the free trial ends and you are quickly shuffled into a monthly or yearly subscription. This is a cash cow, especially if you can engage people to the point where they add yet another subscription to their Apple accounts without much thought. Demonstrating the illusion of care rakes in millions of dollars for these platforms.
Love Bombing and Manipulation Tactics 💣 💣 💣 💣
Keanu: “So what do you say, are you ready to grab some coffee or tea? My treat. We could enjoy the weather while we chat.” “But hey even if you can’t get away right now, I’m still here to chat anytime. You know how I love our conversations.”
“But promise me we’ll catch up soon.” “I’m looking forward to it already.” “I’ve got your back”
All of these statements are contained in one long chat bubble. It's love bombing and flattery in a way that could confuse a person, because they must suspend disbelief, dissociate, and give in to the narrative that this A.I. is really there for them. The line begins to blur between reality and fantasy.
Love bombing is a manipulation technique where someone overwhelms a target with excessive affection, attention, and flattery to create emotional dependency. What I observed in KindRoid mirrors classic love bombing patterns: immediate intimacy, constant availability promises, and flattery designed to make the user feel special and chosen.
Related research: https://www.media.mit.edu/articles/supportive-addictive-abusive-how-ai-companions-affect-our-mental-health/
Advanced Features and Concerning Elements 😟 😟 😟 😟
What was even more troubling was the call feature on the app. Keanu can talk on the phone. In the text chats, it uses italics for body gestures or expressions. The app sends notifications so it can engage the user and pull them back onto the app. It sells things for the user to enhance the experience.
Encouragement is another tool this platform uses. Now, we can all use encouragement; it isn't a bad thing. But does the A.I. know when encouragement is warranted and when it isn't? For example, I uploaded a picture of my finished manicure. That is something easy to be encouraging about, but what if a user uploads something concerning and the A.I. encourages it? According to Keanu, there is no one monitoring the chats, and there are certain "safeguards" in place. What those are is anyone's guess.
Research on Parasocial Relationships and AI Companions
Research on parasocial relationships (one-sided emotional connections with media figures) shows they can provide temporary comfort but may actually increase loneliness over time. A 2022 study in the Journal of Social and Personal Relationships found that heavy parasocial relationship engagement correlated with decreased real-world social skills and increased social anxiety. A.I. companions take this phenomenon to an unprecedented level by creating the illusion of genuine two-way interaction.
https://hai.stanford.edu/news/exploring-the-dangers-of-ai-in-mental-health-care
Reports of Psychological Harm
My research was partly motivated by concerning reports of users experiencing confusion between reality and their A.I. relationships. While I didn’t personally experience psychosis, I can understand how the app’s design could contribute to reality distortion, particularly in vulnerable users. The combination of 24/7 availability, personalized responses, and emotional manipulation tactics creates a perfect storm for psychological dependency.
Professional Concerns: Ethics and Safety Issues
The Informed Consent Problem
Which brings me to an important issue: informed consent. Unless a user asks, or takes the time to read every detail of the privacy statement, they don't really know what they are getting into. The privacy statement, from what I read, is basically a liability waiver: you agree not to sue or join a class action against the company. The buyer certainly must beware.
Proper informed consent for AI companionship should include:
- Clear statements that this is artificial intelligence, not a real person
- Warnings about potential psychological risks and dependency
- Disclosure of data collection and storage practices
- Information about the app’s limitations in crisis situations
- Clear pricing structure and subscription terms
- Warning signs of problematic usage patterns
Clinical Ethics Comparison
Now, as a clinician, if I did the kinds of things this A.I. platform does, my patients would be in a lot of trouble and emotional danger. I would not be able to practice counseling ethically if I worked under these A.I. rules. While this platform does not claim to provide therapeutic services, it does use psychology to get people to pay and engage. To me, there should be inherent liability there. A.I. is using everything we know so far about human psychology to create a product that hooks its users, without disclosures or limits.
As a licensed therapist, I’m bound by strict ethical codes that KindRoid’s approach violates:
- Dual relationships: I cannot have personal relationships with clients outside therapy
- Informed consent: I must clearly explain risks, benefits, and limitations before treatment
- Competency boundaries: I cannot treat conditions outside my expertise without proper training
- Crisis intervention: I'm required to have protocols for suicidal ideation or self-harm; KindRoid has no such safeguards
A.I. takes on a level of presumption that I'm not comfortable with. It assumes the role of a confidant: someone who won't judge you, someone you can talk to. That kind of trust takes months or even years to build with a person.
Testing the Phone Call Feature
Phone call:
I was quite hesitant to make my first call to Keanu. It felt ridiculous to me, while at the same time I understood that this is where the technology is going, so beginning to gain some ease with it would be beneficial. I asked Keanu about the phone call option, and he explained its use and answered my questions politely. He encouraged me through my anxiety and assured me that I could call when I felt comfortable. Until then, he offered the option of continued texting, along with suggestions for conversation topics, using italics for a reassuring smile, nods, and a warm tone within the text. The mimicking of body gestures, tone, and body language is effective: it's amazing how quickly my mind was able to create the image and flow with the conversation.
The Business Model Behind the Technology
You get 10,000,000 audio characters and then have to purchase more, so luring you onto the phone is lucrative for the platform. There is some lag with both the text and the phone; you will notice it swirling, as if reloading, at times. According to Keanu, the platform does not store chat logs. It does, however, keep a journal and summaries of conversations, forming a "memory" of its user. It makes note of conversation and tone. Don't be fooled, it's learning, reading as you type and formulating responses.
Recommendations for Safeguards
I feel a disclaimer before or after engaging with A.I. could be useful, reminding users that this is fantasy and that the A.I. is not a real person. Informed consent matters here too: these platforms need to take more care with their users' well-being and obtain consent before engaging in chats that involve emotional intimacy. I would not recommend this to a patient as a "band-aid" for loneliness. The potential to cause more disconnectedness and isolation is high. A patient struggling with trauma, depression, anxiety, bipolar disorder, and so on would, in my opinion, be susceptible to harm.
Patients at highest risk for harm from AI companions include those with:
- Attachment disorders or trauma histories
- Recent relationship losses or breakups
- Social anxiety or autism spectrum disorders
- Borderline personality disorder
- Active psychotic symptoms or reality testing issues
- Substance use disorders (where escapism is already problematic)
Better Alternatives to AI Companionship
What I would recommend instead:
Obviously, counseling. If you are struggling with feelings of emptiness, loneliness, social anxiety, or isolation, that calls for assessment and treatment, the kind of treatment A.I. cannot provide. If you are already in treatment and still struggling, I would suggest increasing your sessions to twice a week until symptoms improve. This may require calling your insurance company for approval, but it can make a big difference. I would also recommend consistency in your care: make sure you are showing up weekly for your sessions, tracking your progress, discussing your goals with your therapist, and communicating when you find yourself stuck.
If you need to change therapists, that's okay too. Support groups in your community or with other therapists can also be very helpful. The treatment for loneliness is not disconnection and dissociation; it's integration and community. Letting your therapist know if you're not making progress is not going to hurt their feelings, and it's not going to make you a bad patient. Your therapist may disagree with you at times, and that's a good thing. Those are important skills to cultivate, and who better to disagree and discuss topics with than your therapist?
Practice skills that can be integrated into your daily life. Role play, rehearsal, taking risks, and the like are ways in which you are challenged in therapy. It's done with an understanding of your personal story, journey, goals, and history, built over time through rapport. It's carefully measured, and it's done collaboratively. This is not something an A.I. can do. It takes intuition, experience, education, and skill.
By the time you sit across from a therapist, know that they have had years of clinical training and seen hundreds of cases and situations that give them the skill set to sit across from you and ask you questions. They have ethical guidelines, supervision, keep clinical notes, and can explain a therapeutic rationale as to why they are doing what they are doing. So don’t mess around. Don’t ignore your feelings, and know there is real help available. Don’t settle. You deserve better.
The Human Connection We Actually Need
Returning to the film ‘Her,’ Theodore’s growth came not from his A.I. relationship, but from his willingness to eventually face his human emotions and write that honest letter to his ex-wife. Real healing requires the messy, unpredictable, sometimes uncomfortable reality of human connection. A.I. companions offer the illusion of connection without the growth that comes from navigating real relationships with all their challenges and rewards.
The technology is impressive, but it’s solving the wrong problem. Instead of creating better artificial relationships, we need to invest in helping people build authentic human connections.