Although I very very briefly touched on the first part of that statement when I talked about processing other people's ideas, I've never qualified what I mean by the second part, so let's talk about that.
People aren't, generally speaking, stupid, or at the very least, people aren't stupid in the way we usually think of as stupid. One of the easiest habits in the world to fall into is thinking of yourself as fundamentally separate from other people. There are times when the self-propagation of a belief like that is useful, but only in specific contexts and under specific restrictions (which I won't go into here).
I'm doing a poor job of it, but let me break this down a bit more. The statistical likelihood that I am, by my own pure virtue, smarter than the vast majority of people I meet is low. Not impossible, but not likely, and hard to prove in any case. This is further complicated by knowing that a good deal of other people regard themselves in the same light as I do, so if I believe by default that I am above others, I have no way of knowing whether I am right or whether I am as deluded as everyone else. In order for me to be right, I must assume that it is entirely possible for others to be completely deluded by pride, and by extension, the same possibility applies to me. Your claim of separation from the masses may very well be true, but it's almost impossible for you or me to verify, since the same faculties we use in our defense are the ones we're calling into question.
In my experience, real life rarely deals in binary. The more likely scenario is that I am above average in some areas and below average in others, that the people around me are likewise above average in some areas and below average in others, and that self-delusion and self-awareness are mixed together across different areas of my life and often run in tandem.
I realize that this is not exactly helpful.
But it does have some rather interesting implications for the way I should regard other people. Overwhelmingly, I think of myself as purely rational; my beliefs do not stem from character flaws. Yet overwhelmingly, I believe that other people's beliefs do stem from character flaws, and that the distinctions between different systems of morality, philosophy, and even mild opinion are not arbitrary or experientially driven, but actually tied to an individual's distinctive worth.
However, if this is true, and if I then notice that many people around me (incorrectly) share the same opinion, I must take the stance that a vast majority of the population is literally incapable of judging its own intelligence, and that I am an exception simply by the circular argument of me somehow being, as C.S. Lewis put it, "the complete, balanced, complex man who sees round them all." Since very few of us are hard-core relativists in practice, we're left with the problem of knowing that there must be an optimal solution to debates but not knowing how to make sure that our 'obvious' answers are actually obvious and not just the results of our own pride and willingness to oversimplify other beliefs and scenarios.
One suggestion I have is to strip away any advantage that could be a direct result of bias or conformity. Start with the viewpoint that other people have legitimate beliefs and work from there; it will be easier for you to gain the correct perspective on the situation.
And to help us all out with that I've come up with a set of heuristics we can use.
1. To truly engage any other person, you need to understand 'why' they believe the way that they do.
And again, "they're stupid" is a bad answer to that question. If you can't break down a person's psychology into actual working parts, or at the very least figure out where they're broken and why, the most likely scenario is that you don't actually know what's going on.
2. The likelihood of you holding a poorly thought-out set of ideas is directly correlated with how much your worldview depends on a significant portion of the population being intrinsically less educated, open-minded, or moral than yourself.
It's certainly not a catch-all, but what I sometimes find helpful in these situations is to try and project myself into the other person. The old maxim, "walk a mile in their shoes" is, like many other virtues, grossly easy to misunderstand and misapply, but it's still a brilliant methodology for understanding conflict.
Think of it this way: if in any conflict I can figure out what process I would need to go through to view the world through the alternate lens, I've successfully deconstructed the opposing viewpoint, or at least have a plausible theory and am much closer to doing so. The less I have to alter myself and my beliefs to switch over to the other side, the better. Analogies and comparisons are helpful here; find some beliefs of your own that are similar in structure to what you're analyzing and try switching out some variables. The trick is to get into your opponent's shoes while still maintaining the majority of your toes.
Of course, all of this is an approximation at best; if you're looking for a surefire way to make sure that your beliefs are well-informed and not biased by perspective... I don't have one. It's horribly tricky. But if all else fails one easy way to push the odds in your favor is that -
3. You are significantly more likely to have a valid understanding of another viewpoint if at some point you have been seriously tempted to adopt it.
Of course, this isn't something you should be aiming for when you evaluate other beliefs; you should only be interested in whether or not the alternate viewpoints are actually true. But it can be an accurate indicator that you've done your homework.
In a lot of ways I think of the Fundamental Attribution Error as being very closely related to all of this, but there's something ever so slightly deeper going on when we dismiss the actual views of others as being irrelevant to our own. Seeing that a large number of people hold a belief is both a valid and valuable observation, and as such it needs to be incorporated into our interpretation of that belief. We need to seriously consider others' viewpoints as having some potential to be valid until we can come up with a good reason for those people to believe what they do.
What happens more often, though, is that we tend to build straw men that are isolated from reality and then project onto them whatever characteristics are necessary to override the opposing viewpoint. If you find yourself telling other people in disagreements that their beliefs are fine because they're "not like the typical..." this should be cause for concern, because you've just observed that the belief patterns in question aren't restricted to a specific demographic. Your friends should be typical; if they're not, then you've just seen a good argument that your idea of a typical belief is wrong.
Furthermore, non-predictive models aren't models. If you can't adequately claim without reasonable exception that, for instance, all atheists are willfully ignorant, or that all people who oppose homosexuality are bigots, then your explanation for their behaviors and viewpoints is incomplete, and you need to start considering that you might not actually understand what you're arguing against.
Even more insidiously, if your claim that all atheists are willfully ignorant or that all people who oppose homosexuality are bigots doesn't allow you to make accurate predictions about their behavior in other areas of life, you may simply be arguing by definition in the first place: "Anyone who believes 'x' is obviously 'y', regardless of how they might otherwise appear. When asked why someone would believe 'x' in the first place, we can reply that this is because they are 'y'."