Humans are leaky virtual machines

I posit that the way we understand other people is by simulating versions of them on our own hardware, much like a computer can run copies of other software as virtual machines.

If I'm in a conversation with someone, I'm seeking to understand the meaning behind their words. One way I do that is by also trying to understand why they're saying the thing. For instance, if someone is telling me about a great vacation they just went on, part of my brain is listening to the description of the amazing views and the delicious foods, painting a picture for me of what it was like.

Another part of my brain is flavoring the story with what it thinks the other person's intentions are. Are they trying to be friendly, do they want to impress me, make me jealous? This 'framing' of the story is done subtly, often without us noticing it's happening. We're drawing on our history with the person, our perspectives of them, and all of our desires and fears, and using all of that to imagine what they truly mean.

As part of this process, we've constructed a little version of the other person in our mental workspace.

We interact with this little version in tons of ways - I might imagine what would happen if I started complimenting the person, or if suddenly I were to interrupt them and start talking. When I imagine doing the first thing, I feel a warmth in my chest, while if I imagine doing the second, I feel an unease in my gut. And this is just me simulating this make-believe guy right over here!

My fake model is that when we simulate this guy, we're running the simulation on the same hardware we use to simulate ourselves. And the two get blended together, so when I imagine what it would be like to be rude to someone, I get data on the scenario by substituting "myself" in for him. This is my story of where empathy comes from.

I think this is an important concept to have for several reasons:

  • Obstacles to orientation often result from avoiding imaginary pain: I've been blocked from taking actions because the flinch from the pain I felt while simulating the situation, and how someone would react, was much more real and powerful than any abstract concern like doing the right thing. "Of course I'll donate a million dollars to imaginary people!" Noticing and owning the empathy pain as coming from my thoughts, my simulation of the person, lets me interface with it. See every book on Stoic thought ever.

  • It points at the danger of being too loose with your simulations: I get into this in Intellectual Horror, but I think ideas are actually powerful things that can be bad for you if you're not careful with your virtual machine's boundaries.

  • Acausal coordination becomes possible: If I'm simulating you and you're simulating me, and the simulations are accurate enough, we can coordinate and trade without ever talking to each other. See Andrew Critch's Deserving Trust. (A toy sketch of this follows the list.)

  • Simulating other people lets you bring them into your world: It's lonely imagining myself trapped in Plato's cave, with my internal life forever cut off from others. But with this idea it's a lot more like we're bringing avatars of them in.
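
To make the acausal coordination bullet a bit more concrete, here's a toy sketch of my own (not something from Critch's post): two decision procedures that never exchange a message, each deciding by running a depth-limited simulation of the other. The names and the depth cutoff are purely illustrative assumptions.

```python
# Toy sketch of coordination via mutual simulation: each agent decides by
# running a bounded simulation of the other's decision procedure, without
# ever sending a message. Names and the depth cutoff are illustrative.

def me(my_model_of_you, depth=3):
    """Cooperate if my (bounded) simulation of you cooperates."""
    if depth == 0:
        # The simulation has to bottom out somewhere; assume cooperation here.
        return "cooperate"
    # Run you "in my head", handing you my own decision procedure as your
    # model of me, one level shallower.
    your_choice = my_model_of_you(me, depth - 1)
    return "cooperate" if your_choice == "cooperate" else "defect"

def you(your_model_of_me, depth=3):
    """You use the same rule, with your simulation of me."""
    if depth == 0:
        return "cooperate"
    my_choice = your_model_of_me(you, depth - 1)
    return "cooperate" if my_choice == "cooperate" else "defect"

# Neither call passes any message to the other; each choice comes entirely
# from simulating the other party, and the choices still line up.
print(me(you))   # cooperate
print(you(me))   # cooperate
```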

Addendum

If I'm trying to relate to someone, I don't will myself to simulate them. Rather, I let myself be curious, with the intention of understanding what the person who would say that thing must be like. I think we do similar things with ideas - simulate and relate to them.