[epistemic status: personal view of the rationality community.]
In this “post”, I’m going to outline two dimensions on which one could assess the rationality community and the success of the rationality project. This is hardly the only possible breakdown, but it is one that underlies a lot of my thinking about rationality community building, and about what I would do if I decided rationality community building were a strong priority.
I’m going to call those two dimensions Culture and Mental Habits. As we’ll see, these are not cleanly distinct categories, and they tend to bleed into each other. But their focuses are separate enough that one can meaningfully talk about the differences between them.
Culture
By “culture” I mean something like…
- Which good things are prioritized?
- Which actions and behaviors are socially rewarded?
- Which concepts and ideas are in common parlance?
Culture is about groups of people, what those groups share and what they value.
My perception is that, on this dimension, the Bay Area rationality community has done extraordinarily well.
Truth-seeking is seen as paramount: individuals are socially rewarded for admitting ignorance and changing their minds. Good faith and curiosity about other people’s beliefs are common.
Analytical and quantitative reasoning is highly respected, and increasingly, so is embodied intuition.
People get status for doing good scholarship (e.g. Sarah Constantin), for insightful analysis of complicated situations (e.g. Scott Alexander), or for otherwise producing good or interesting intellectual content (e.g. Eliezer).
Betting (putting your money where your mouth is) is socially encouraged. Concepts like “crux” and “rationalist taboo” are well known enough to be frequently invoked in conversation.
Compared to the backdrop of mainline American culture, where admitting that you were wrong means losing face, and where trying to figure out what’s true is secondary (if not outright suspicious, since it suggests political non-allegiance), the rationalist bubble’s culture of truth-seeking is an impressive accomplishment.
Mental Habits
For lack of a better term, I’m going to call this second dimension “mental habits” (or perhaps, to borrow Leverage’s term, “IPs”).
The thing that I care about in this category is “does a given individual reliably execute some specific cognitive move, when the situation calls for it?” or “does a given individual systematically avoid a given cognitive error?”
Some examples, to gesture at what I mean:
- Never falling prey to the planning fallacy
- Never falling prey to sunk costs
- Systematically noticing defensiveness and executing a deflinching move (or something similar)
- Systematically noticing and responding to rationalization phenomenology
- Implementing the “say oops” skill when new evidence comes to light that overthrows an important position of yours
- Systematic avoidance of the sorts of errors I outline in my Cold War Cognitive Errors investigation (this is the only version available at this time).
The element of reliability is crucial. There’s a way in which culture is about “counting up” (some people know concept X and use it sometimes), while mental habits are about “counting down” (each person rarely fails to execute the relevant mental process Y).
The reliability of mental habits (in contrast with some mental motion that you know how to do and have done once or twice) is crucial, because it puts one in a relevantly different paradigm.
For one thing, there’s a frame under which rationality is about avoiding failure modes: how to succeed in a given domain depends on the domain, but rationality is about how not to fail, generally. Under that frame, executing the correct mental motion 10% of the time is much less interesting and impressive than executing it every time (or even 90% of the time).
If the goal is to avoid the sorts of errors in my Cold War post, then it is not even remotely sufficient for individuals to be familiar with the patches: they have to reliably notice the moments of intervention and execute the patches, almost every time, in order to avoid the error in the crucial moment.
Furthermore, systematic execution of a mental TAP allows for more complicated cognitive machines: lots of complex skills depend on all of their component pieces working.
It seems to me that, along this dimension, the rationality community has done dismally.
Eliezer wrote about mental habits of this sort in the Sequences and in his other writing, but when I consider even very advanced members of my community, I think very few of them systematically notice rationalization, reliably avoid sunk costs, or consistently respond to their own defensiveness.
I see very few people around me who explicitly attempt to train 5-second or smaller rationality skills. (Anna and Matt Fallshaw are exceptions who come to mind).
Anna gave a talk at the CFAR alumni reunion this year in which she presented two low-level cognitive skills of that sort. There were about 40 people in the room watching the lecture, but I would be mildly surprised if even two of those people reliably execute the skills described, in the relevant trigger situations, six months from that talk.
But I can imagine a nearby world where the rationality community was more clearly a community of practice, and where most of the people in that room would watch that talk and then train the cognitive habit to that level of reliability.
This is not to say that fast cognitive skills of this sort are what we should be focusing on. I can see arguments that culture really is the core thing. But nevertheless, it seems to me that the rationality community is not excelling on the dimension of training its members in mental TAPs.
[Added note: Brienne’s “Tortoise Skills” is nearly archetypal of what I mean by “mental habits”.]