A Fictional Warning That Feels Familiar
For those who remember, there was a 1999 film titled “Smart House” in which a family moves into a home run by artificial intelligence. PAT (played by Katey Sagal), short for Personal Applied Technology, operates the home and attempts to stand in as the children’s mother, simulating an emotional caretaker as well as a physical one despite being a computer program installed in the house. Eventually, she tries to take over and control the family.
While this may have seemed like an extreme situation, these days, people are using artificial intelligence (AI) for all sorts of things—emotional and mental caretaking included.

The Expanding Role of AI in Everyday Life
I’ve seen and heard of it being used for everything from medical and health advice (diagnosing, prescribing, or treating conditions), to legal and financial matters (giving binding advice, planning investments, drafting co-parenting communications), to critical life decisions (life-or-death situations, emergencies), to illegal or unethical activities (hacking, forgery, fabricating evidence), to mental health replacement (therapy, crisis support, relationship building).
A Tool, Not a Primary Authority
While AI can aid in brainstorming, providing helpful information, and learning, it should not be used as a primary way to address any of these issues, particularly assessing and treating mental health or relationship concerns. There are multiple reasons for this.

1. The Risk of False Information
AI sometimes provides false information. It fabricates book titles, studies, research papers, dates…essentially anything that “looks” real enough to be passed off as evidence or fact, by its own admission (ChatGPT, 2025).
This is why it’s important to always check the reliability and credibility of sources.
2. When AI Mimics Authority
Similarly, AI can “mimic authority” by using specific language and citing resources, even when that information is false. It might sound something like, “Here are the laws,” or “Based on the diagnostic criteria for [x], it sounds like [x] is [insert mental health diagnosis].” AI has the confidence we all wish we had.

3. Overgeneralizing Complex Human Experiences
When it comes to mental health, AI can overgeneralize complex, individual issues. It may present problems from a bird’s-eye view or try to compartmentalize them, when the reality is that every dynamic and factor is different.
Best practice is to work with others directly, when possible, to discuss those nuances, rather than grouping everything together without communication, proper assessment, or consideration of other perspectives.
4. Validation Without True Connection
Arguably one of the biggest problems here is the validation and empathy aspects. AI is just that—ARTIFICIAL. It cannot replicate or replace real relationships and real empathy.

It is designed to provide the individual with validation and reassurance, NOT connection, which is the most essential ingredient of being human in the first place.
If you present it with a situation to affirm that YOU are correct in that situation, guess what it’s going to tell you? That you’re RIGHT.
Growth comes from discomfort, not comfort. Unfortunately, all too often we would rather bask in blissful ignorance than be told when we are doing things that keep us stuck where we are, thus forfeiting our opportunities for advancement.
5. The Danger During Mental Health Crises
AI does not have the ability to assess for emotional and mental health crises. If someone relying on AI for mental health or relationship guidance starts to spiral, panic, cope maladaptively, or experience suicidal ideation, that individual is left to their own devices.
They may not have the support, resources, or capacity to deal with that crisis, and the consequences may be detrimental.

AI Is Not a Replacement for Human Care
I always say that therapy is a helpful tool, but it’s not the ONLY tool. AI is also a helpful tool, but relying on it in lieu of therapy or professional medical treatment can be dangerous and stunt emotional/mental health growth.
PAT herself claims she can give you “synthetic air and virtual exercise” and that you don’t need friends because she can be your “best friend.” However, this dismisses the very aspects that make us human, and those cannot be replaced.
References
ChatGPT. (2025). Explanation of the ways AI can deceive users. OpenAI. https://chat.openai.com
Written 11/30/2025