Sexbots, Plato, and Jung

Apparently today is the day for sexbot discourse.  Joy of joys.


So…why does anyone want a sexbot?  What’s the value-add in this proposition?

There’s obviously a wide spectrum of possibilities here, but I think we can usefully divide them into three conceptual buckets. 


On the one end, there’s the pure straightforward object-oriented desire for a better sex toy with cooler features.  “Give me a vibrator/fleshlight, but, like, mobile, with arms and legs and a face and stuff, because that’ll make the orgasms better somehow.”  

To the extent that anyone is thinking this, I have zero trouble saying that there’s absolutely no objection to it that holds any water.  Go and get the best tool for the task.  Have fun. 

…on the other hand, an honest assessment will compel us to admit that basically no one will be thinking this.  Sex is mostly mental and emotional for pretty much everyone, the things we want out of it are mostly about complicated deep-laid psychological stuff – and to the extent that it really is just about pushing physical pleasure-buttons, existing technology has that covered just fine.  This is kind of a strawman, and I’m mentioning it only for the sake of thoroughness. 


All the way on the other end, you get a number of variations on “I want a sexbot so that I can fool myself into believing that it’s a person with whom I can have a relationship.” 

(A few of those variations entail “…so that I can fool myself into believing that it’s a person with whom I can have a cruel/abusive relationship, one that for moral or practical reasons I can’t get with a real person.”  But only a few.  I’m not going to discuss them separately; I think we’ve had quite enough of that particular sub-discourse.)

Some people actually will think this; some already do.  In particular, if your conscious mind has become so soured on relationships (or so soured on the-opposite-sex-as-a-whole) that you believe them to be worthwhile only for the sake of fulfilling extremely simplistic psychological needs, you might be inclined to think that a non-sapient robot with a good user interface – something like a current-tech video game NPC with a meatspace body, let’s say – could fill the role of a human partner without much being lost. 

This is not a correct or healthy thing to think, and anyone whose mind is on this track is going to be painfully disappointed by the reality of having a sexbot. 

This is true for a lot of super-obvious reasons that boil down to “people are intellectually and emotionally generative, the value of being close to them mostly involves getting to interact with their complicated thoughts and feelings, the sexbots we’re talking about will not give you any of that.”  It’s also true for some slightly-less-obvious reasons.  A lot of what people want out of relationships, a lot of the thing whose absence actually drives lonely people to madness and despair, is social validation – the validation of having someone (especially someone with a high social value) think that you’re worth caring about, the validation of everyone around you thinking that you’re cool or mature or successful or whatever – and none of that can be faked, even if right now you feel like you’d be totally happy to settle for the external trappings. 

For whatever it’s worth, I also do agree with @jadagul that fooling yourself in this way is Unvirtuous, independent of any utilitarian fallout of any kind. 

So I’m happy to say that using a sexbot, for this particular kind of reason, is probably bad for you and you probably shouldn’t do it.  That in itself is not a good enough reason to make policy – we allow all sorts of bad things into society because trying to enforce a ban would be much worse – but it’s a judgment. 


But everything I’ve said thus far is kind of pointless, because the vast majority of the world’s desire-for-sexbots would in fact fall into the third bucket, which sits in between the other two. 

OK, our first Weird Philosophical Analogy: Plato’s tripartite soul.  You’ve got your semi-physiological animal appetite soul, you’ve got your seething subconscious emotional psychological soul, and you’ve got your conscious intellectual soul that contains your actual personality and goals and ideas.  In your “average” “normal” person, all three of them are united in strongly wanting sex.  But that desire means totally different things to each of them. 

The appetite soul can be satisfied with a vibrator or fleshlight.  The intellectual soul definitely needs another real person, someone who can constantly feed you new thoughts and cause you to grow, someone who can be a part of your life and contribute things, no substitutes accepted. 

[I think that, in modern parlance, a person whose appetite soul doesn’t have that kind of need is called “asexual,” and a person whose intellectual soul doesn’t have that kind of need is called “aromantic.”  But maybe that mapping doesn’t work?  Discuss.]

The “emotional soul” – which is a terrible name for it, but there isn’t a better one in modern language, which has lost the semantic distinction between nefesh/psyche/soul and ruach/pneuma/spirit, thanks, Church Doctors – is roughly akin to the subconscious mind of the Old Psychologists, although you certainly can be aware of its workings under many circumstances.  It’s the part of you that cares about feelings and social cues in an unreflective way, much as the appetite soul cares about sugar and temperature and orgasms.  It’s the part of you that cringes when you feel shame, without any consideration of whether that shame is endorsed or desirable or appropriate.  It’s the part of you that crows like a rooster when some stranger likes your post on social media. 

The emotional soul cares almost exclusively about social, cultural, and emotional things, but…it doesn’t actually care about people, not in any sense that a thinking intelligence would find meaningful.  It doesn’t understand their existence as beings with interiority; that requires abstract thinking, which is not a thing of which it’s capable.  It doesn’t care about who they are or what they want.  It cares only about what they do, in a very direct and concrete kind of way, because human actions line up with the happy-patterns and sad-patterns that it does understand.

The emotional soul has a lot of use for a sexbot. 

An easy and not-very-loaded example: when you are despairing and full of doubts, it can be very comforting to have a beautiful-person-shaped-entity giving you the basic reassurances that you would otherwise have to give yourself.  Today, in our sexbot-free world, this usually translates to “it’s nice to have a loved one comfort you in such a way” – but in fact your loved one’s existence as an independent thinking entity isn’t providing very much value-add in this particular scenario.  You already know the words in question; it’s not like you need someone else to generate them.  And, let’s be honest, your loved one is going to say those things pretty much no matter what he’s actually thinking in the moment; it may be so much a ritual courtesy that his not saying the words would be a hurtful surprise.  And yet it helps, perhaps quite a lot, because there’s a sub-rational part of you that doesn’t have declarative beliefs but knows that it likes seeing someone pretty say the nice words. 

A much-more-loaded example: many sexual fetishes.  No beliefs of any kind involved, no caring about anyone’s interiority, just some part of your mind that likes seeing someone pretty do the thing.  It’s a happy-button; maybe it has its origins in some interpersonal emotional complication, but at some point your psyche contains an independent module that’s just “button push ==> happy.”  And a sexbot can be pretty, and do the thing, just as well as a person with hopes and dreams. 

(I am pretty confident that, in a world with actual sexbots worthy of the name, a big slice of the sexbot-buying population is going to consist of couples interested in doing Group Sex Acts without any of the complications attendant on involving actual other people.) 


So OK.  Evaluation time.  What happens if this actually takes off?

A bunch of people get their emotional-soul needs met without having to rely on other human beings to do it for them.

This is potentially a very good thing. 

You can say “it will allow a bunch of lonely people who can’t find partners to satisfy more of their needs than they could otherwise,” which is true, but in fact it’s the least of it.  It could change the fundamental dynamic of human romantic relationships for those people who are capable of finding them.  It could allow them to be less driven by raw psychological need.

We’ve never actually relied on our partners for our appetite-soul needs; if you’ve got hands, you can probably find your way to an orgasm.  But we rely on our partners, extensively, for our emotional-soul needs.  We demand that they do the thing, whatever the thing is, because we need a person-shaped entity to fulfill that function or we get anxious and depressed.  We need them to play their assigned roles in our sex rituals and our comfort rituals and so on. 

If we have convincing person-shaped-entities without interiority that will just do whatever we want, then we can slot them into our rituals.  And maybe we can have a little more respect for each other as independent people, and approach each other in more of a spirit of exploratory appreciative wonder, and mutilate each other a little less in the name of creating the supportive partners we need.


A sentence you won’t hear very often these days, for good reason: I think it is helpful to think about this concept through a Jungian framework.

As Jung would have it, one of the important parts of your mind/soul/whatever is your anima if you’re a “normal” straight man, or animus if you’re a “normal” straight woman.  (There have been lots of arguments over how this works if you’re not doing the standard binary heterosexual thing, I’m not getting into it now, just…roll with it.)  The anima/animus is a sort of internal princess/prince figure, the living Grail at the end of the sacred self-development quest, containing within itself all the aspects of you that seem foreign and impossible-to-understand and not-quite-part-of-yourself.  The muse who brings inspiration, the voice of solace and comfort in the depths of depression, etc.  It’s represented as an idealized lover because it is all the things for which you reach out to the world in an attempt to feel complete.  But it’s all there inside. 

Achieving union with your anima/animus, in the Jungian scheme, is a key step of becoming a whole and happy person.  Without that internal union, you try to force other human beings into the role; it never works, and it does lots of damage to both parties in the process. 

I don’t know whether projecting aspects of your anima or animus onto a sexbot is a good way of coming to terms with it “properly.”  But I’m damn sure that I’d rather you do that, and seek out your private hieros gamos in a psychological mirror made from silicone, than dragoon an actual person into the job of making you complete. 

A lot of bad relationships – and a lot of bad parts of good relationships – are that second thing.  People feel so much desperate need for one another, because they feel so broken.  But love works a lot better when you go into it whole.