Notes and Thoughts

Discursive working and evergreen notes


Humans are leaky virtual machines

I posit that the way we understand other people is by simulating versions of them on our wetware. It's like how a computer can run many copies of other software as virtual machines.

If I'm in a conversation with someone, I am seeking to understand the meaning behind their words. One way I do that is by also trying to understand why they're saying the thing. For instance, if someone is telling me about a great vacation they just went on, part of my brain is listening to the description of the amazing views and delicious foods, painting a picture of what it was like.

Another part of my brain is flavoring the story with what it thinks the other person's intentions are in telling it. Are they trying to be friendly, do they want to impress me, make me jealous? This 'framing' of the story is done subtly, often without us noticing it's happening. We're drawing on our history with the person, our perspectives of them, and all of our desires and fears, and using all of it to imagine what they truly mean.

As part of this process, we've constructed a little version of the other person in our mental workspace.

We interact with these simulations in tons of ways - I might imagine what would happen if I started complimenting the person, or if I suddenly were to interrupt them and start talking. When I imagine doing the first thing, I feel a warmth in my chest, while if I imagine doing the second, I feel an unease in my gut. And this is just me simulating this make-believe guy right over here!

My fake model is that when we simulate this guy, we're running the simulation on the same hardware we use to simulate ourselves, the mass of neurons that make up our brain. And the two get blended together, so when I imagine what it would be like to be rude to someone, I get data on the scenario by substituting "myself" in for them, feeling what they'd feel. This is my story of where empathy comes from.
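The "leaky" part of the model above can be caricatured in a few lines of code. This is a toy sketch of my own, not anything from the note itself: the host (me) runs a guest (the imagined other) on shared state, so whatever the guest is simulated to feel leaks straight back into the host.

```python
# Toy sketch of a "leaky virtual machine": host and guest share
# one emotional substrate, so simulated feelings become real ones.
# All names here are illustrative assumptions, not a real model.

def simulate(host_feelings, action, guest_reaction):
    """Run a guest simulation on the host's own 'hardware':
    whatever I imagine the other would feel, I feel too."""
    felt = guest_reaction(action)   # what I imagine you'd feel
    host_feelings[action] = felt    # ...leaks into my own state
    return felt

me = {}  # my emotional state, the shared substrate

# The imagined other's reactions to two possible actions:
guest = lambda action: {"compliment": "warmth", "interrupt": "unease"}[action]

simulate(me, "compliment", guest)  # warmth in my chest
simulate(me, "interrupt", guest)   # unease in my gut
print(me)  # the guest's simulated reactions now live in my state
```

The leak is the whole point: there is no isolation boundary between "what they'd feel" and "what I feel", which is the code-shaped version of the empathy story.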

I think this is an important concept to have for several reasons:

  • Obstacles to orientation often result from avoiding imaginary pain: I've been blocked from taking action because the flinch from imagined pain, felt while simulating the situation and how someone would react, seemed far more real and powerful than any abstract concern like doing the right thing. Noticing and owning that empathetic pain as coming from my own thoughts helps me treat it as an object. See also every book on Stoic thought ever.

  • It points at the danger of being too loose with your simulations: I get into this in Intellectual Horror (the feeling of contemplating a mindset vastly different from your own that conflicts in some way with how you view the world), but I think running other people's mental software, if you're not careful with your virtual machine's boundaries, can be bad for you.

  • Acausal coordination becomes possible: If I'm simulating you and you're simulating me, and the simulations are accurate enough, we can coordinate and trade without ever talking to each other. See Andrew Critch's Deserving Trust.

  • Simulating other people lets you bring them into your world: It's lonely imagining myself trapped in Plato's cave, with my internal life forever cut off from others. But if I can accurately simulate a friendly avatar of another, there's someone else in here with me.
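The acausal coordination bullet above can be sketched as a tiny program. This is my own illustrative caricature (not Critch's formulation): each agent holds a model of the other and cooperates only if its simulation predicts the other will cooperate, so accurate mutual models produce coordination with no messages exchanged.

```python
# Toy sketch of acausal coordination via mutual simulation.
# The names and decision rule are illustrative assumptions.

def decide(my_model_of_you):
    """Cooperate iff my simulation of you predicts cooperation."""
    return "cooperate" if my_model_of_you() == "cooperate" else "defect"

# Each agent's (accurate) simulation of the other:
alice_models_bob = lambda: "cooperate"
bob_models_alice = lambda: "cooperate"

# No communication happens, yet the decisions line up:
alice = decide(alice_models_bob)
bob = decide(bob_models_alice)
```

The fidelity requirement does all the work here: if either model drifts from the real person, the "trade" silently falls apart.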


If I'm trying to relate to someone, I don't will myself to simulate them. Rather, I let myself be curious, with the intention of understanding what a person who would say that thing must be like. I think we do something similar with ideas - simulate and relate to them.