Tuesday, June 5, 2012

Rational thinking is not natural

"Rational thinking is not natural" may seem like quite an odd and stupid title, but let me explain. (No, it's not an attack on rational thinking. Again, please let me explain.)

Irrationality, superstition and anti-scientific sentiments seem to be on the rise (again) in recent years. The irrational has fascinated people for the entire history of humanity. There has always been (and unfortunately there probably always will be) a very significant portion of humanity that wants to believe in all kinds of superstition, bypassing all rational thinking. However, irrational thinking has experienced a huge boost recently. Not only are young-earth creationism and anti-evolutionism on the rise, but so is a general anti-science mentality among other groups (such as many nature activists).

To these people, somehow, the very idea of requiring actual observation, measurement and testing before believing something seems silly and limited. Not only is rational thinking discarded, it's actually ridiculed by many (often in a quite hypocritical manner, as they project their own attitudes onto the rational thinkers).

But why are people so inclined to believe in irrational things? Why such naïveté? Why is rational thinking so hard?

I believe that the answer lies in evolution. As counter-intuitive as it might sound at first, in the distant past there was probably a survival advantage to readily believing what you were told. If a parent told their child "don't eat this plant, it's dangerous" or "if you see this animal, run away fast", that child probably had a better chance of surviving to adulthood if he or she believed the warning without question. Children who did not believe what they were told had a higher probability of dying. Most instincts are demonstrably inherited traits (especially in the long run, when naturally selected across large populations over long stretches of time), and the instinct to believe what you are told is most probably one of them.

Is it any wonder, then, that people are so eager to believe claims like "vaccines are dangerous" and "the government is out to get you"? It's the natural instinct of a human to believe such claims, even when there is no evidence. (Naturally, "evidence" is later fabricated to explain away all objections to the claim. This purported "evidence" is, of course, carefully selected through heavy bias, cherry-picking and misinterpretation.)

Rational thinking is not something that a person does by instinct. We are intelligent, thinking beings, but we have no propensity to instinctively think in a rational manner. (Basically the only kind of "rational thinking" that humans do instinctively is the kind that increases chances of survival or well-being.)

Rational thinking is something that has to be learned. In the vast majority of cases, proper rational thinking requires decades of study and education in hard sciences and the scientific method, as well as studying all the ways that the human brain can fool itself.

Saturday, June 2, 2012

Cult mentality in the era of the internet

One of the key techniques that cults, and even many more open religions, use to keep their followers is to discourage, if not even outright forbid, communication with the "opposition".

If your intention is to keep a flock of followers who agree with your views, this technique makes sense and is pretty effective. If your followers never communicate with people holding differing views, never hear and have to think about criticism of their own views, and are never exposed to alternatives, that's an effective way to keep them believing what you want. Occasionally, even when isolated from alternative views and criticism, some followers will start thinking about it on their own and doubting the veracity of the views they have been taught. However, this is relatively rare (and usually other techniques are used to discourage, mitigate or outright destroy such doubts).

But this post is not about religion. It just served as an introduction.

What I find curious is that one could easily think this kind of thing is next to impossible on the internet, yet it still happens. After all, if you surf the internet and are interested in a certain topic that has more than one point of view, it's almost impossible not to stumble across criticism and alternative views. It's impossible to stop people from finding out about differing opinions and their arguments. One would think this would effectively destroy any kind of biased "cult mentality". However, it doesn't.

What's happening is that people limit themselves to reading and believing only what they want, and reject any criticism. You don't even need to physically restrict people from receiving outside views, nor actively discourage them from seeking them out: they will do it themselves, without any kind of prompting.

You see this all the time on the internet. Whenever there's a topic that is even the slightest bit controversial (it really doesn't matter what it is), there will always be fans and followers who will rabidly defend one side or the other, and willingly ignore and reject any criticism (regardless of how valid that criticism might be). This effect is amplified when there are charismatic "leaders": public figures who advocate a position on the subject and who are skilled, persuasive speakers or writers. Such a person can have a great influence on people who would otherwise be more open to alternatives.

These "leaders" don't even need to actively discourage their followers from exposing themselves to criticism. They can (and often do) use much subtler approaches. One extremely common technique is to ridicule the criticism and opposing views. (This technique is called "poisoning the well".) It doesn't even have to be overt ridicule. Small hints that the opposing view is ridiculous are enough. This very easily introduces a strong pre-emptive bias into the followers, who will then be eager to reject the criticism without actually reading it. (And even if they do read it, they will be predisposed to rejecting and ridiculing it. Thus they will read it through a kind of "mental filter" which concentrates on any perceived flaws, no matter how small, and rejects anything that could be even slightly valid.)

The sad thing is that this is often done not only by fringe extremists who support a destructive ideology, but unfortunately also by many people who support a truly worthy cause. The negative side effect is that they and their followers start to fight the right cause using the wrong methods, often causing more harm than good. Most typically, such well-intentioned extremists will only create more animosity and hatred between the differing camps than a calmer, more amicable approach would. This kind of extremism often drives people away, causing even more discord among the groups than there already was.

It is seldom the case that the extreme view on a subject is absolutely correct and that any opposing view is absolutely wrong. The "poisoning the well" methodology, however, often makes people believe that this is so, and hence they will unjustly reject and ridicule even opposing arguments that are actually right, as well as furiously defend views of their own camp that can be considered dubious.

The end result of this is that no actual rational discussion is possible between the different camps. Any attempt at discussion will immediately be shut down by hostility and ridicule.

It really is curious how such strong bias and "cult mentality" are possible even in the internet era, where there are no barriers to getting all the information needed to make a rational, informed decision on things.