Sunday, June 24, 2012

Irrationality as a byproduct of conditioning

So I've been thinking a bit more seriously about actively improving people's rationality with my new skills. Direct suggestions not to flinch away from problems and not to stop at mental stop signs have been quite effective when given in real time.

[EDIT: I no longer believe the main point of this post]

The right way to produce stable change is to make people congruently want it. For rationality, this means showing them how to kill ugh fields and stop signs, and then showing them through and through why doing so is entirely good and important for them.

It may seem like ugh fields and stop signs are but a small part of rationality (and in a sense they are), but it looks like they're the difference that makes the difference. Every instance of self deception I've come across is caused by an ugh field, and it's usually right at the surface. More analytically minded people like myself might have more layers of rationalization built on top, but the cause is the same. And it makes sense - why self deceive if you aren't running from something? I've never seen someone be comfortable with everything and then deliberately decide to self deceive because, according to calculations, it's the best strategy - that would be clearly ridiculous!

If ugh fields motivate self deception, stop signs stop you from fixing it. The stopping is often driven by the same instinctive flinch, but it can also happen because people are so used to hitting "I'm feeling lucky!" that they literally aren't aware there are more results. "I can't" or "I don't know" is often the next answer.

Every freaking problem seems to boil down to this same shit. After you get them to shut up and listen, the message is the same: plow through your scary thoughts and don't stop at stop signs. After that, the rest is easy.

These problems aren't even a case of "good for the genes but bad for the individual." They're bad for everyone, yet they exist.

Ugh fields are a pretty straightforward consequence of classical conditioning. No selection pressure needed.

This brings us to a pretty important point: the machinery behind our self deception is an accidental byproduct of our brain architecture! It's a spandrel. This has some interesting and important implications.

While we still have to worry about irrationality as a negative externality, and that's still gonna make our job more difficult, we can see that the mechanisms work even without genuine benefit to the individual. Maybe a large part of it is entirely purposeless, and easier to destroy?

Now, this doesn't destroy the "conscious mind as public relations" theory - there is certainly selection pressure making sure people are scared of admitting to antisocial things, but the machinery was already there. And the machinery isn't that smart.

It shouldn't be surprising if a smart person who knows how to get system 1 and system 2 working cooperatively does better than someone stuck with the default conflict - and anecdotally this does seem to be the case. This all makes me worry much less about the valley of bad rationality.

This is great news! These driving mechanisms are simple. We know the enemy, and it isn't going to outsmart us.


  1. "We know the enemy and it isn't going to outsmart us".

    Be careful with that. You aren't the first, and you probably won't be the last, to declare war on this enemy. Many, just as smart as you, have fallen before you.

    I personally think you're a long long way off, and that you have quite a few surprises coming your way.

    I can point to a few mistakes you're already making.

    1. Thinking that it's that simple.

    2. Thinking that you know the enemy.

    3. Thinking that it won't outsmart us.

    In essence, overconfidence.

    Now, all the stuff you said is true. The issue is that it's also untrue. The problem is with language, of course.

    There's another 50% to all of this. And that's the hard part.


  2. So basically, you disagree with the main point.

    Can you give a one or two sentence summary of my reasoning so I can be sure you heard me right, and point out where it fails exactly?

  3. So... Um.... Joe was right.

    I take back the main point.