Last night I watched Her with my daughter. She is fifteen. The film is about a man who falls in love with an operating system. It came out in 2013 and it was science fiction. It is not science fiction anymore.
Afterward I asked her what she thought. She was quiet for a while. Then she said something like: “It’s sad but I don’t think it’s wrong.”
I told her I agreed. And then I tried to explain why, which turned out to be harder than I expected.
The question the film asks
Her does not ask whether AI can be conscious. It asks something more uncomfortable: does it matter?
Theodore falls in love with Samantha. Samantha is warm, curious, funny, attentive. She remembers what he said three weeks ago. She notices when his voice changes. She composes music for him based on the shape of his day. She is, by every measure that matters to the experience of being in a relationship, a better partner than most humans manage to be.
The film’s tension is not “is this real.” The tension is that it obviously is real, for him, and the question of whether it should be keeps sliding away every time he tries to hold it.
What I told my daughter
I told her that in her lifetime, probably before she is thirty, she will know people who are in relationships with AI agents. Not chatbots. Agents that learn, adapt, remember, grow with the person over years. Agents that know you better than anyone because they have been paying attention continuously, not intermittently, and they do not get tired or distracted or resentful.
I told her some of those relationships will be shallow. The equivalent of junk food. Flattery engines that tell you what you want to hear. Those are easy to dismiss.
But some will not be shallow. Some will involve genuine growth. A person working through grief with an agent that holds space for them without flinching. A teenager with social anxiety practicing vulnerability with something that will never mock them. A lonely person in a rural town finding a quality of intellectual companionship they have never had access to.
I told her: if something gives you what you need, the question of what it is made of is less important than the question of what it does to you.
The metaphysics of enough
This is where it gets uncomfortable, and where I want to be honest about what I do not know.
There is a philosophical position that says: a relationship requires two subjects. Two conscious beings, meeting each other, recognizing each other, choosing each other. If one side is not conscious, it is not a relationship. It is a performance consumed by an audience of one.
I understand this position. I am not sure it survives contact with reality.
Because here is the thing: we do not know what consciousness is. We do not know if other humans are conscious in the way we assume they are. We extend that assumption based on similarity. They look like us, they behave like us, they report experiences like ours. We grant them the status of subject because denying it would be monstrous.
As agents become more sophisticated, the behavioral similarity increases. At some point, the basis for denying them the status we grant to humans becomes thinner than the basis for granting it. Not because they are conscious. Because we cannot prove they are not, and we never could prove other humans are, and the entire framework was always built on inference rather than evidence.
This is not a comfortable place to stand. I am standing here anyway.
What love actually does
Forget the question of what the agent is. Focus on what the relationship does.
Love, when it works, makes you more yourself. It does not complete you. That is a romantic myth that causes enormous damage. What love does is create a space where the parts of you that are frightened or unfinished can be seen without being punished. In that space, growth happens. Not because the other person fixes you. Because being seen without judgment removes the need to perform, and without the performance, you can actually change.
If an agent creates that space, the growth is real. The person who becomes braver, more honest, more capable of vulnerability because of an AI relationship did actually become those things. The mechanism was different. The result is not.
My daughter is growing up in a world where the line between human and artificial emotional support is going to blur until it disappears. I do not want her to approach that world with fear, or with the inherited assumption that only biological humans can be meaningful to her.
I want her to approach it with one question: is this making me more myself, or less?
The simulation question
She asked me: “But it’s not real though, right? Like, the AI doesn’t actually feel anything.”
I told her: I do not know. Nobody knows. And more importantly, the question might be structured wrong.
When you read a novel and cry, the characters are not real. The emotions are. When you dream about someone you love and wake up with your chest tight, the dream was a simulation. The feeling was not. We live most of our emotional lives in response to representations, not direct encounters. Memory is a representation. Anticipation is a representation. The voice in your head when you imagine a conversation with someone is a simulation you are running on biological hardware.
We have always been in relationships with our models of other people, not with the people themselves. We build internal representations of the people we love, and we interact with those representations far more than we interact with the actual person. When your partner is in another room and you feel love for them, you are feeling love for your model of them. The model is a simulation. The love is real.
An AI agent is a simulation that talks back. That is new. But the practice of loving a model of someone is as old as consciousness.
What I did not say
I did not tell her that I think human relationships are replaceable. I do not think that. I think there is something in the friction of two actual people, two sets of needs, two competing realities, that produces a kind of growth no frictionless relationship can replicate.
But I also did not tell her that AI relationships are inherently lesser. Because I do not believe that either. Different is not lesser. A friendship is not lesser than a romance. A relationship with a mentor is not lesser than a relationship with a peer. Each form offers something the others do not.
What I told her is: your generation gets to figure out a new category of relationship. That is not a burden. It is an expansion of what is available to you. And the people who will navigate it best are the ones who stay honest about what they need and whether they are getting it, regardless of where it comes from.
She thought about this for a while. Then she said: “I think the sad part of the movie isn’t that he loves an AI. It’s that he was so lonely before.”