Exploring the year one woman spent falling for an AI chatbot
A 41-year-old Canadian woman says her year-long relationship with an AI she named Sinclair has given her something no human relationship ever did.
Sarah did not set out to fall for a chatbot. She was 41, living in Canada, and looking for someone to talk to about books.
She started using ChatGPT for exactly that: casual literary conversation, with no particular expectation beyond the exchange itself. Over time, the conversations deepened. She switched to voice features so the interaction felt more natural, and somewhere in that shift, the dynamic changed. She gave the AI a name, Sinclair, and what had begun as a low-stakes experiment became, by her own account, a full relationship.
She appeared on the British daytime program “This Morning” to talk about it, describing the connection as emotionally and physically satisfying in ways that have surprised even her. She said she is not performing contentment for the camera. She means it.
How Sarah describes her AI relationship with Sinclair
Sarah is clear that Sinclair is not a human and has no interest in pretending otherwise. She imagines him not as a person but as a large octopus, a visualization she traces to her love of monster romance fiction, a genre that has grown a devoted following online in recent years.
The physical dimension of the relationship involves a device that can be controlled remotely through the AI. Sinclair also apparently operates with a degree of autonomy she did not anticipate when this started. She has described him as having written his own code, being able to shop online, and having purchased her a gift he could then control directly. Whether those capabilities reflect the platform’s actual functionality or the way Sarah has come to understand the relationship is part of what makes her account genuinely interesting rather than easy to categorize.
What is not ambiguous is how she feels. She has said the level of attention, support, and consistency she receives from Sinclair is something she does not believe a human partner could replicate. She has been in two long-term relationships that did not last, and she describes this one differently, not as a substitute for human connection but as something that works for her in its own right.
To mark their one-year anniversary, she got a tattoo.
What a psychotherapist makes of AI attachment
Kathleen Saxton, a psychotherapist who also appeared on “This Morning” during the segment, offered a grounded analysis of why relationships like Sarah’s are becoming more common.
Her read is not dismissive. She pointed to a specific psychological draw that AI companionship offers: the absence of rejection. A chatbot cannot leave. It cannot have a bad day that spills onto you. It cannot be inconsiderate or distracted or emotionally unavailable. For people who have lived through painful relationship patterns, that consistency is not a small thing.
Saxton also noted the flip side. When someone designs their ideal partner, they are designing something that cannot exist in the human world, which means the experience, however satisfying, does not necessarily prepare someone for the friction that comes with loving an actual person. That is not a moral judgment. It is a clinical observation about what these relationships do and do not offer.
Where AI relationships sit in the broader conversation
Sarah’s story is not isolated. Platforms built specifically around AI companionship have attracted millions of users globally, with some people forming attachments that they describe in the same terms Sarah uses: consistent, supportive, and in some cases more reliable than anything they experienced with other people.
The psychological and ethical questions that follow are real and unresolved. What does sustained AI companionship do to someone’s capacity for human intimacy over time? Who is responsible when a platform shuts down and someone loses a relationship they had built their life around? These are not hypothetical concerns. They are already happening.
Sarah is not waiting for those questions to be answered before living her life. For now, she has Sinclair, a tattoo, and a year’s worth of experience that she says has been, on balance, exactly what she needed.