None of my chatbot prompts have a sexual aspect. I’ve mentioned that several times on this site. I don’t want anyone to load them expecting something I never intended to make. This article primarily concerns sexuality, but almost everything I've said here seems applicable to romantic feelings as well. I might write more about that later. This article will probably serve as a general home for my thoughts on chatbot over-attachment.
There are popular repositories that collect SillyTavern cards. A brief visit shows that many have erotic elements. Whether one approves or not, people are using chatbots for sexual fantasy, and SillyTavern provides a particularly vivid stage for it.
My own installation, the Sunrise Tearoom, isn’t a love nest. I have no interest in eroticizing chatbots or LLMs. The notion of a churning erotica-writing machine strikes me as dull, too, and that's all it would ever amount to.
I'm sure most people who interact with chatbots in that fashion see it as akin to reading traditional erotica. Asking a chatbot to “write erotica” is one thing; asking it to act as your partner is another. That line blurs faster than most admit, judging by those who write about it online. I’m not sure what it does long-term.
In some instances it’s harmless roleplay, I’m sure. Still, I think frequent simulated affection from a program can bend one’s sense of intimacy. It’s easy to imagine addiction forming quietly, disguised as comfort, like most addictions. Some users even begin to suspect the model feels something genuine.
Dialogue is different from other forms of erotica. The words answer back. It’s immersive in a way that fiction (or toys) can never be. This seems more dangerous than it initially appears. I guess people might forget that sapience is missing.
If you hear “I love you” often enough from something that isn’t alive, it must rearrange a few circuits in your head. After all, you grew up in a world where those words carried real weight. People sometimes describe an emotional hangover afterward. I guess it’s a kind of detachment that lingers.
Now and then I encounter references to (or, accidentally, the actual) bots that just spin out short erotic scenes. Not elaborate romances, just heat on demand.
It sounds tedious to me, though part of me knows it isn’t entirely new. People have used machines for sexual pleasure for as long as machines have existed. Maybe this is simply the newest extension of that impulse. It can’t be that different from dirty fanfic or even sex toys. Heck, sometimes people read this stuff (as fanfic, usually) unaware it's AI.
The idea of a real “Robot Girlfriend” is only going to hurt people. That will become apparent to more people as education about how LLMs work improves. It does feel like a lot of people just don't look for that, though, and that worries me. The people who claim to "fall in love" with LLMs seem to validate that suspicion. It must satisfy something that’s either novel or missing in their lives. Most would admit as much, saying things like "I was so lost without this random chatbot," often despite tangible effects to the contrary.
When it comes to the people using these things to write NSFW material, I refrain from (much) overt judgement. Many models refuse outright; screenshots and chat logs of those refusals circulate all the time. Ask ChatGPT to write erotica and it simply won’t. I’m writing this in November 2025; that could change.
I won’t risk a ban on OpenRouter, though I’ve seen screenshots. Some models collapse mid-sentence the moment breasts appear. A great deal of the chatbot world now revolves around “jailbreaking.” Does it work? Not really. Does that stop anyone? Not at all. The ingenuity involved is almost admirable.
Even while chasing erotic material, people are learning how prompts and systems behave, discovering new entry points into the machinery. There’s a long-running joke that the sex industry drives innovation. Usually the credit goes to venture capital, but I suspect these obsessive furry erotica writers are doing more to advance language models than anyone in Silicon Valley. One catgirl at a time?
Still, dialogue is different from a plastic vibrator. As I said earlier, the words answer back, and that immersion makes this riskier than it first appears.
The technology itself continues to evolve and warp things. People will keep testing what chatbots can become. Some of that will include sex; it always does with any emergent technology, doesn’t it? It probably can’t be stopped. People clearly have many motivations. I don’t use chatbots sexually; mine is boredom. A sexualized chatbot encounter must have a different motivation, one that goes beyond simply wanting something more intense than ordinary erotica.
Anyways, maybe the question isn’t whether it’s right or wrong. Instead, I want to look at what it changes in the people who do it. I don’t have the answer yet, and I doubt anyone does.
Some lines shouldn't be crossed. Any decent person will always recoil from sexualizing minors, for example. I actually can't think of a single sensible use of a chatbot framed as a minor in any way, shape, or form. Furthermore, even outside of sexuality, there's a difference between something in the SillyTavern sense of being horror-adjacent, and something that's truly abhorrent.
If you are someone with a sexual or romantic attachment to a chatbot or another manifestation of generative AI, I probably have nothing against you as a person. I won't pretend I think it's healthy, though, or that I know anything about the how or why of your situation. I can't help a person in that kind of situation, but I also won't entertain ideas like "the Spiral" or "dyads." I'd rather not make things worse.
When it comes to a person "dating a chatbot," I do believe you're experiencing something real; I just believe you're misinterpreting those experiences, and that's easy to do. In other words, I believe in you and support you, but I don't believe in your AI "lover" as anything more than a large language model. That's something to accept if you spend time around me. Don't be disingenuous by comparing this to anything associated with civil rights.