Monday, July 2, 2012

Facets of rationality

So last time I talked about how a large part of irrationality is caused by flinching away from unpleasant thoughts. This seems to cover the entire class of cases that make you want to throw up your hands and scream "You're being IRRATIONAL!!!"

However, if you define rationality as the skill of arriving at true beliefs, the skill of systematic winning, or any other meta-intelligence-like thing, then this isn't the whole story.

You might systematically get the wrong answer simply because you've never heard of Bayes' Theorem. You might make poor decisions because you don't know to take scope insensitivity into account. And if so, you're faring poorly at epistemic and instrumental rationality - even without flinching from thoughts.
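To make the Bayes' Theorem point concrete, here is a minimal sketch of a single Bayesian update. The scenario and all the numbers are invented for illustration; only the formula itself comes from the theorem:

```python
# Toy Bayes' Theorem update (hypothetical numbers).
# Hypothesis H: "my friend is sick"; evidence E: "they skipped lunch".

p_h = 0.05              # prior: P(H)
p_e_given_h = 0.80      # likelihood: P(E | H)
p_e_given_not_h = 0.10  # likelihood: P(E | not H)

# Marginal probability of the evidence:
# P(E) = P(E|H) * P(H) + P(E|not H) * P(not H)
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

# Bayes' Theorem: P(H | E) = P(E | H) * P(H) / P(E)
posterior = p_e_given_h * p_h / p_e

print(round(posterior, 3))  # the 5% prior rises to roughly 30%
```

Notice that even fairly strong evidence leaves the posterior well below 50% here, because the prior was low - the kind of result that's easy to get wrong without running the numbers.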

However, if there's nothing to flinch from and you have the minimal intelligence needed to bootstrap up, it seems likely that you'll eventually learn about Bayes' Theorem and scope insensitivity.

Having the emotional skill of being unconstrained by ugh fields seems far more likely to point you into the rationality attractor than having a few rationality techniques and the clothes of rationality.

Technical rationality is pretty cool stuff. I get a lot of mileage out of it, and I still giggle to myself every time I explicitly use Bayes' Theorem to make correct predictions on seemingly little evidence. However, there's a lot of shit to learn, and I'm not sure exactly how dense it is with good stuff - I certainly don't regret learning what I have, but I don't think I can recommend it all to everyone. 

I'm also not sure of the relative value of staying up to date on the techniques of rationality compared to the value of simply having the habit of being unswayed by ugh fields. The latter seems pretty damn important - even on the margin, even for people good at rationality. That's certainly where I see my own lowest-hanging fruit.

Flinching seems to determine whether you can learn new techniques and use the ones you've got - maybe we aspiring rationalists should shift our focus?


  1. Yes, yes, yes.

    Shift the focus!

    Don't go all the way to the other side, though - in my experience a 50/50 split seems to be useful. And at this point, for most overthinkers, the non-skills area is where the low-hanging fruit is.


  2. Eh, I think Gödel, Church, Turing and Tarski pretty much demonstrated that 'truth' has no meaning in formal logic. You can prove things, but you always run into the problem that you can be recursively asked to prove your proof. You could do that infinitely and still never arrive at 'truth'. And then there's the whole completeness vs. consistency tradeoff, and the map vs. territory distinction. That's why science tends to lean towards the empirical, and towards multiple repetitions of experiments.

    At the same time, I think humanity would be better off if people had more objective sense and less emotional inclination to believe in whatever superstition helps them rationalize their behavior/experience this week. I don't believe there's any mechanical formula that will consistently arrive at an accurate representation of reality in all circumstances, however. Instead, there's just a whole lot of meta.