Draft: You Are, Unfortunately, Part of the Map

Feb 9, 2025

A lot of people have a model of human rationality something like:

Let’s call this the “naive rationality” view. I don’t think it always works, particularly for beliefs that are tied up with your sense of self, with your idea of who you are.

I think normies in particular understand this at an intuitive level, and so don’t do the autistic truth-seeking thing as much. I will try to explain from the autistic/object-level frame why this is the case.

The problem

The big problem is that beliefs and desires are the same sort of stuff: priors, beliefs, expectations, whatever. Desires are implemented through beliefs: to want to eat is to predict that you have eaten, and it’s your prediction-error minimization machinery that drives you to eat. So there is no neat separation, no ‘agent using info to accomplish a set goal’ with separate ‘info about the world’ - your goals are part of your beliefs about the world.
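To make that mechanism concrete, here is a toy sketch (mine, not anything from the predictive coding literature; the function name and numbers are made-up assumptions). The agent ‘predicts’ low hunger, and acting is just one more way of shrinking the gap between prediction and observation:

```python
# Toy sketch of "a desire is a prediction plus error-minimizing action".
# All names and coefficients here are illustrative assumptions.

def act_to_minimize_error(predicted_hunger, observed_hunger, steps=10):
    """The agent predicts low hunger. Instead of updating the prediction
    to match the world, it acts (eats) to make the world match the
    prediction - the same error-minimization machinery either way."""
    trajectory = [observed_hunger]
    for _ in range(steps):
        error = observed_hunger - predicted_hunger
        if error > 0:
            # "Eat": change the world toward the prediction.
            observed_hunger -= 0.5 * error
        trajectory.append(observed_hunger)
    return trajectory

path = act_to_minimize_error(predicted_hunger=0.0, observed_hunger=1.0)
```

The point of the sketch is just that nothing in the loop distinguishes ‘goal’ from ‘belief’: the prediction plays both roles.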

If these ideas are way out there, read Scott’s post on predictive coding for the relevant context. I’ll try to make this a bit more concrete now.

Consider a case where the agent/info separation does basically work. For example:

But now consider a case where, I claim, the separation doesn’t really work. Note that this is the sort of case we viscerally care about; it matters more to our lives than buying a good laptop:

Now, is rejecting this pessimistic belief update in some sense irrational or delusional? We’re choosing self-serving beliefs, we’re rejecting true evidence to keep us happy! In some sense it is, sure. But the thing is, as predictive processing tells us, beliefs are self-fulfilling prophecies, in a deep way. So on one hand your beliefs are about modeling reality accurately, but on the other they’re about driving your behavior in a desired way (one which minimizes prediction error from interoceptive sources, e.g. makes sure you eat when hungry).

So you should ask: rational in what sense? In the ‘naive’ rationality model, with an agent and a neatly separate world model, maybe this is irrational. But that’s not what your brain is doing. The ‘delusional’ update is perfectly rational in the predictive coding frame. Are you bound to strive for the naive-rational thing? I could say that true rationality is that of Homo Economicus, who single-mindedly maximizes log wealth, and that you are irrational to become a painter. But you aren’t Homo Economicus; you have more complicated reasons for your behavior, so you don’t really care about this ‘sin’ of irrationality. Similarly, you’re not the naive-rational agent, so you shouldn’t care about this ‘sin’ of not updating to the belief that you suck.

(I say ‘naive rationality’ etc. with much sympathy; the internet autist rationalists and their friends thinking along these lines are very much my ingroup.)

Potentially this still sounds unsavory. To make this ‘delusional’ rejection of the update sound a bit more reasonable, notice that in the predictive coding frame, saying you should update your priors to believe that people suck is a quite strong claim. You are saying: “if you had a bad interaction, you should want to interact with people less”. Because remember, beliefs and wants are the same sort of stuff; changing your beliefs is how you change your actions.

An obvious objection: separating beliefs

There is a very reasonable objection here, namely that you should be able to model the world accurately and without judgement, and not let your true beliefs irrationally influence your actions. If you want people to like you, you should have an accurate belief about whether they like you, and still go out and interact with people, doing your best to make them like you, because that’s how you will best accomplish your goal.

This is not absurd and I don’t entirely disapprove of this way of thinking. But I think it’s awkward in some ways:

  1. Occam’s razor makes separating beliefs harder

  2. Why do the mental gymnastics?

It is fair to frame it like this: in rejecting the update, you are choosing to want to keep interacting with people.

You might ask about high decouplers: isn’t the ideal to be able to separate these out?

From the naive-rational frame, the case for making the update is something like ‘the evidence points that way, so you should believe it’.

Some other consequences

An example would be ‘black-pilling’ beliefs. Maybe you go through some rough shit in life and it would be ‘correct’ to update in the direction of people being terrible. But we should expect a bias against adopting this belief, because adopting it would change your behavior: you would expect people to suck, so you’d read neutral interactions as bad, people would feel that and act more coldly towards you, and so on for some more steps like this, until you bottom out in ‘hard-coded’ predictions about social belonging and having sex and not being hungry, and even lower-level hard-coded bodily homeostasis predictions.
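The feedback loop just described can be sketched as a toy simulation (my illustration, not the author’s model; every coefficient is a made-up assumption, chosen so that perception exaggerates the belief, which is what makes the loop self-fulfilling rather than self-correcting):

```python
# Toy simulation of the black-pill feedback loop. All numbers are
# illustrative assumptions, not measurements of anything.

def social_feedback(belief_people_suck, rounds=30):
    """belief_people_suck in [0, 1]. Each round, a neutral interaction
    is read through the belief, people mirror back the coldness they
    sense, and the belief updates toward that self-distorted evidence."""
    history = [belief_people_suck]
    for _ in range(rounds):
        # Perception exaggerates (slope > 1): this is the assumption
        # that makes the loop self-fulfilling instead of converging back.
        perceived_badness = min(1.0, max(0.0, 0.5 + 1.5 * (belief_people_suck - 0.5)))
        # People respond in kind, so the "evidence" matches the perception.
        evidence = perceived_badness
        # Update the belief toward the evidence.
        belief_people_suck += 0.3 * (evidence - belief_people_suck)
        history.append(belief_people_suck)
    return history

pessimist = social_feedback(0.9)  # ratchets toward 1.0
optimist = social_feedback(0.1)   # ratchets toward 0.0
```

Start slightly pessimistic and the loop locks you into ‘people are terrible’; start slightly optimistic and it locks you into the opposite. The belief isn’t a passive readout of the world; it helps produce the world it describes.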

Claim: incels etc. have been mindfucked by the memetic idea of thinking about goals and beliefs in the naive/rationalist/autistic way. Your default is correct! The aversion to self-serving beliefs, beliefs which you would adopt for their utility to you instead of their rationalist truth value, sounds reasonable but is actually dumb. A midwit-meme sort of situation.

Claim: this is also relevant to the way people discuss intelligence. If you’re an object-level autist-type thinker on this subject, you would be wise to understand why normies have an instinctual ick around this topic.

Thinking about things this way also makes it easier to understand human sexual development. How come we are attracted to fucking cute girls instead of fucking toasters? In the naive/rationalist/autistic view there should be some truth hard-coded by evolution for us to find. And maybe there is, to some extent. But this is less confusing when you understand that we’re not just “discovering” “the truth”; we are also choosing how we will behave. And it would suck to be a toaster fucker!

In fact this point could be generalized a bit, and seen as a source of bias towards conformity in our belief-forming. Or as an argument for why social norms promoting conformity developed: they’re necessary to guide ambiguous questions around ‘what kind of person am I’ in a good direction, in the absence of a good ‘objective’ source of truth.

Does this help with philosophical problems about doing things which change who we are? Like: if you’re a woman and become pregnant and have a child, that will probably change you enough that your desires/goals are notably different afterwards. Not sure, but this sounds adjacent.

All of this is to stress the wisdom of the normie approach.

Clearly some people do get black-pilled etc., so this bias isn’t infinitely powerful.

The claim about normies: they understand the value of adopting a ““not true”” belief primarily for the sake of how it affects their actions.
