Welcome to European Tribune. It's gone a bit quiet around here these days, but it's still going.
A part of it is probably that people who subscribe to defective pictures of the world are less able to recognise when others are off base.

That would explain in no small part why people who hold weird ideas are attracted to other people who hold weird ideas, irrespective of their consistency: Neither is capable of telling that the other is spouting horse manure, and both are opposed by the same kinds of people; that is, pretty much everybody who isn't crazy. The enemy of my enemy is my friend - especially when nobody else wants to be...

- Jake

Friends come and go. Enemies accumulate.

by JakeS (JangoSierra 'at' gmail 'dot' com) on Fri Jan 23rd, 2009 at 02:22:12 PM EST
[ Parent ]
Certainly a common thread is that they are opposed to the "modernist" scientific world and empirical methods in general, but I think there may also be emotional and other neural and quasi-logical links to the things they oppose...

notes from no w here
by Frank Schnittger (mail Frankschnittger at hot male dotty communists) on Fri Jan 23rd, 2009 at 03:11:32 PM EST
[ Parent ]
It may simply be because they start out holding one nonsense notion, which is sufficiently divorced from reality to impair their ability to evaluate the merits of claims in general. If such an individual meets with an organisation that - ah - does not stress critical thinking skills, shall we say, it's entirely possible that this person will adopt its dogma more or less wholesale. Certainly, crank magnetism appears even in more loosely associated groups (see, e.g. the cross-pollination between YEC'ers and germ theory "skeptics"), so there's no reason it shouldn't happen in a structured environment in which adoption of the entire dogma is actively encouraged.

And weird ideas don't come and go - they accumulate: Having no basis for sifting the nonsense from the sense, dismissing any part of the dogma will run the risk of alienating part of the group. And since consistency isn't terribly important to this kind of mindset, it's easier to simply insist on all the dogma that the group has happened to pick up along the decades.

That's my amateur psychology anyway...

- Jake

Friends come and go. Enemies accumulate.

by JakeS (JangoSierra 'at' gmail 'dot' com) on Fri Jan 23rd, 2009 at 04:05:25 PM EST
[ Parent ]
It's not quite that simple, unfortunately. Logic and critical thinking do not, by themselves, ensure convergence of beliefs. You can in fact have two entirely logical and rational people who, when faced with the same evidence, and after incorporating this same evidence with all due logical and critical diligence, will be further from agreeing than they were before.

Here's an example: person B goes on TV to claim that a certain dictator H has nukes. Viewers U, V initially have no opinion on the matter, but U considers B a lying windbag who can't be trusted, while V considers him an honourable person deserving much respect.

After watching the broadcast, U believes (as he did before) that B has deliberately lied, so the opposite of his claim must be more likely, i.e. H probably has no nukes. However, V respects B (as he did before) and now believes that H probably has nukes.
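
A toy Bayesian sketch of this divergence (my own illustration; the 5% and 95% honesty priors are made-up numbers, not anything from the example):

```python
# Toy model of the example above: an honest B states the truth about the
# nukes; a dishonest B states the opposite. Both viewers see the same
# claim but hold different priors on B's honesty.
def posterior_nukes(p_nukes, p_honest):
    """P(H has nukes | B claims H has nukes), under the model above."""
    p_claim_if_nukes = p_honest          # honest B says "nukes" only if true
    p_claim_if_no_nukes = 1 - p_honest   # lying B says "nukes" only if false
    num = p_claim_if_nukes * p_nukes
    return num / (num + p_claim_if_no_nukes * (1 - p_nukes))

# Same 50% prior on the nukes themselves, opposite priors on B's honesty:
print(posterior_nukes(0.5, 0.05))  # U: ~0.05 -- H probably has no nukes
print(posterior_nukes(0.5, 0.95))  # V: ~0.95 -- H probably has nukes
```

Same evidence, same update rule, and the two viewers end up at opposite ends, exactly because of the prior they brought to the broadcast.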

Logic on its own is not good enough for convergence of beliefs. You have to take into account the full set of prior beliefs of a person, and this is why it's a waste of time to talk seriously to religious people or conspiracy nutters to question their delusions, for example.

Scientists tend to believe that logic and critical thinking alone brings convergence, but that is only so because they all have highly compatible prior beliefs on everything that matters in their work, obtained from highly similar educational backgrounds.

--
$E(X_t|F_s) = X_s,\quad t > s$

by martingale on Fri Jan 23rd, 2009 at 07:52:23 PM EST
[ Parent ]
But neither of those is an example of rational thinking.

I didn't believe that H had nukes because a workable WMD program is almost cripplingly expensive, even for a functional economy, and after GW I, Country I certainly didn't have a functional economy.

Also, B clearly was a lying windbag too. But that was a suggestive data point, not proof.

Scientists believe critical thinking brings convergence because beliefs are independently checked and verified and supported by abstract reasoning. Every so often this goes horribly wrong, but the system works - within its limits - at least as well as any other kind of intellectual attempt to understand the world.

Religious people and paranoid exploding-earth nutters are swayed almost entirely by their emotions and don't do logic at all. 'I feel it' isn't much of an explanation for anything, which is why you can't argue with it rationally. Meanwhile the paranoid nutters enjoy drama and fear for the sake of it.

The broad split is between people who pay attention to facts outside themselves and people who only pay attention to their feelings. The Pope seems to be one of the latter - he'd rather defend an irrational faith with irrational acts than engage with historical reality.

by ThatBritGuy (thatbritguy (at) googlemail.com) on Fri Jan 23rd, 2009 at 08:28:23 PM EST
[ Parent ]
Rational thinking is a method of processing facts and assumptions. It can be applied to as little or as many facts as one chooses, and as little or as many assumptions as one chooses. So my examples are certainly examples of rational thinking, albeit with a highly reduced set of assumptions and facts chosen to illustrate the point.

Also, B clearly was a lying windbag too. But that was a suggestive data point, not proof.
It most certainly was a proof:
Assumption: B lies.
Fact: B claims H has nukes
Conclusion: B's claim is not true

You cannot claim that the above isn't rational just because you disagree with the limited scope of the universe of discourse in that example. In your own example, the fundamental possibility of divergence remains:

I didn't believe that H had nukes because a workable WMD program is almost crippingly expensive, even for a functional economy, and after GW I Country I certainly didn't have a functional economy.
A rational person can still disagree with you by disagreeing with some of your unstated underlying assumptions: e.g. that H had them built only partially rather than fully, or that he had them smuggled in, or that a workable program is much less expensive than you believe, etc.

Unless you happen to be a WMD scientist, all of your assumptions about H and his country were derived from interpreting media reports available to you, together with the meta-assumption that these reports were not all outright lies and misinformation. A conspiracy theorist could read the same media as you, but because he assumes that they are outright lies, he will end up with a rather different interpretation; yet both of you would be exposed to the exact same facts in writing and both of you would be rational, differing only in a highly influential assumption.

Religious people and paranoid exploding-earth nutters are swayed almost entirely by their emotions and don't do logic at all. 'I feel it' isn't much of an explanation for anything, which is why you can't argue with it rationally. Meanwhile the paranoid nutters enjoy drama and fear for the sake of it.
I'm not saying that all religious people and conspiracy people *are* behaving rationally, I'm saying that even if they were, it would be insufficient to overcome the boundaries of their world view. Thus it is meaningless to blame their differing views on a failure to think rationally or to process facts.

Two people who are entirely rational but whose prior assumptions differ can both interpret common facts logically, and end up disagreeing even more afterwards.

--
$E(X_t|F_s) = X_s,\quad t > s$

by martingale on Fri Jan 23rd, 2009 at 10:05:43 PM EST
[ Parent ]
There are some physical constraints on human behaviour. You don't have to be an expert in order to give a ballpark guesstimate. E.g.: The only countries with functioning WMD programmes are countries with a GDP more than three times that of the country in question - and if you count only the ones that didn't get help from the USA or one of its client states, call it a factor of 30 to 300 instead, depending a little on your definition of "client state."

The conspiracy theorist is deficient in critical thinking skills if he assumes that all media reports are outright lies. It violates Occam's Razor, which is a pretty basic tool for critical thinking. And lying about everything is plain stupid. You only lie about the important things, because the more you have to lie, the easier it is to slip up and build in an inconsistency that's a little too glaring.

Reasonable people arguing in good faith can, and frequently do, reach widely divergent conclusions. But there are some constraints on what kind of conclusions they can reach. And most cults like the one under discussion are clearly on the "divorced from reality" side of that line.

- Jake

Friends come and go. Enemies accumulate.

by JakeS (JangoSierra 'at' gmail 'dot' com) on Sat Jan 24th, 2009 at 03:33:09 AM EST
[ Parent ]
My answer here is essentially the same as in my other comment, so I'll keep it short. You claim that the USA didn't help this country, say, but you have only the USA's word for this, which is unreliable to some.

Occam's Razor here reduces to a prior belief that the USA generally tells the truth, versus no such belief. Think of an American, a European, a Russian, and a Chinese.

--
$E(X_t|F_s) = X_s,\quad t > s$

by martingale on Sat Jan 24th, 2009 at 05:01:51 AM EST
[ Parent ]
But at some point the ad hoc assumptions get a shade too convoluted to pass the smell test.

Sure, you can assume that the US government lied about not helping, but is now telling the truth about the existence of the nukes. But this seems to be a contradiction: The US government is full of shit when it makes a self-serving statement about not helping nasty people get nukes. But it's a model of honesty when it makes a self-serving statement about nasty people having nukes.

You could then elaborate the assumption by noting the change in management in the US in between those statements. But this can be challenged by noting that if this new, more truthful management actually was serious about the whole truth thing, they could just release the documentation proving that the previous management had aided the nasty people. Then the previous management would have egg on its face and the case would be open-and-shut.

Of course, it's possible to elaborate the ad hoc assumption further with another ad hoc modification. But there is a limit to how many ad hoc assumptions you're permitted to stack on top of each other before you've left the realm of logic and reason and entered the realm of narratives. It's not a hard limit by any means, but it is there somewhere, and conspiracy theorists usually sail right past it within the first two or three paragraphs...
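
A rough way to see why the stack collapses: if each independent ad hoc patch is individually, say, 70% plausible, the joint plausibility of the whole story decays geometrically (the 70% figure is of course pulled out of thin air):

```python
# Plausibility of a story that needs n independent ad hoc assumptions,
# each individually accepted with probability p (illustrative p = 0.7):
p = 0.7
for n in range(1, 6):
    print(n, round(p ** n, 3))
# 1 0.7
# 2 0.49
# 3 0.343
# 4 0.24
# 5 0.168
```

Three patches in, the story is already less likely than not, which is roughly where the smell test kicks in.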

- Jake

Friends come and go. Enemies accumulate.

by JakeS (JangoSierra 'at' gmail 'dot' com) on Wed Jan 28th, 2009 at 11:24:46 AM EST
[ Parent ]
martingale:
It most certainly was a proof:
Assumption: B lies.
Fact: B claims H has nukes
Conclusion: B's claim is not true

Uh, no. This only follows logically if B lies all the time. In 2003 that was looking likely, but not certain, and certainly not proof of anything.

It turned out in retrospect that he lied maybe 95% of the time. But that fact wasn't available in 2003.

A rational person can still disagree with you by disagreeing with some of your unstated underlying assumptions, such as e.g. that H had actually had them built fully rather than only partially say, or that he had them smuggled in, or that a workable program is much less expensive than you believe etc.

But there was absolutely no evidence to support any of those claims.

You're sounding like the people who said that Saddam really did have WMDs but... they were smuggled to Syria, which is why they were never found.

There's a vast uncrossable gulf between that kind of narrative logic, in which anything goes as long as it sounds vaguely plausible, and evidence-based argument, which requires a decent data set to argue implications from.

Unless you happen to be a WMD scientist, all of your assumptions about H and his country were derived from interpreting media reports available to you, together with the meta-assumption that these reports were not all outright lies and misinformation.

That and reading books and comments by the UN weapons inspectors, who might reasonably be expected to have a more accurate picture than the media.

In fact the media were spectacularly wrong and generally supportive of the party line, so there was no meta-assumption needed.

I assumed the primary sources - which were freely available to anyone - were more accurate than the media reporting.

by ThatBritGuy (thatbritguy (at) googlemail.com) on Sun Jan 25th, 2009 at 07:04:01 AM EST
[ Parent ]
Uh, no. This only follows logically if B lies all the time.
There's no "all the time" in the example, only person U who assumes that B lies on TV. Your objection has merit, of course, but only if you extend the scope of the original example hugely.

It turned out in retrospect that he lied maybe 95% of the time. But that fact wasn't available in 2003.
Actually, it was pretty obvious at the time that he lied, which is why the whole thing was such a hard sell. B's statements were repeatedly contradicted by the IAEA investigators on the ground in UN reports, his claims were not confirmed by any major powers (France, Russia, China) except for the US, his dossier was immediately shown to be lifted (and edited) from internet sources, his claims agreed with Powell's UN lies, which were also contradicted by the IAEA at the time, and of course most of his claims were contradicted by the Iraqi government at the time.

The only way anybody at the time could claim that it wasn't obvious that the UK government was lying was by weighting the USUK statements 99%, and all other statements 1%, say. That's actually quite reasonable for British people in general to do, on the grounds that they'd have to become paranoid otherwise, but non-anglophones had no such conflict of interest.

Which nicely again illustrates my point about unstated assumptions leading to divergence. Anybody who placed even 50% weight on USUK statements and 50% weight on statements from other sources essentially had to consider B a liar, simply due to the large number of contradicting claims of fact by other independent sources.
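
To put rough numbers on the weighting point (the figures are illustrative only): score each source +1 if it supports B's claim and -1 if it contradicts it, and take the weight-averaged verdict:

```python
# Weighted tally of sources: +1 supports B's claim, -1 contradicts it.
# The source list and weights are illustrative, not a serious model.
def verdict(weights, scores):
    total = sum(weights)
    return sum(w * s for w, s in zip(weights, scores)) / total

scores = [+1, -1, -1, -1, -1]   # USUK supports; IAEA, France, Russia, China contradict

print(verdict([99, 1, 1, 1, 1], scores))  # USUK weighted 99%: positive, B believed
print(verdict([1, 1, 1, 1, 1], scores))   # equal weights: negative, B judged a liar
```

The same list of statements, fed through the same arithmetic, flips the verdict depending entirely on the weights assigned to the sources beforehand.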

There's a vast uncrossable gulf between that kind of narrative logic, in which anything goes as long as it sounds vaguely plausible, and evidence-based argument, which requires a decent data set to argue implications from.
You seem to think that evidence-based argument (as opposed to calling something narrative logic?) requires a specific set of common initial assumptions (such as the famous I think therefore I am, which alone is obviously insufficient). Please list them.

--
$E(X_t|F_s) = X_s,\quad t > s$
by martingale on Sun Jan 25th, 2009 at 11:38:01 PM EST
[ Parent ]
But obviously, person B either is a lying windbag or he isn't. OK, "windbag" may be in the eye of the beholder, but whether he has a history of lying or not is a matter of public record. So one of the two viewers suffers from either a) ignorance of the public record, or b) a lack of ability to discriminate between lying windbags and honest brokers in the debate.

It's true that it is possible to start out from different assumptions and, using perfectly valid logical syntax, reach widely diverging conclusions. But for that to qualify as reasonable, the assumptions have to be not too divorced from reality.

The key distinction here is whether this divorce from reality is caused by lack of information or by lack of critical thinking skills. The former is a lot easier to cure than the latter.

- Jake

Friends come and go. Enemies accumulate.

by JakeS (JangoSierra 'at' gmail 'dot' com) on Sat Jan 24th, 2009 at 03:24:25 AM EST
[ Parent ]
but whether he has a history of lying or not is a matter of public record.
But what if the public record is not considered reliable, and is often contradictory on the topic? That is par for the course with conspiracists, which is why pointing to the facts on public record has no power to change their minds.

It's true that it is possible to start out from different assumptions and, using perfectly valid logical syntax, reach widely diverging conclusions. But for that to qualify as reasonable, the assumptions have to be not too divorced from reality.
Reality is what one can touch and see (etc.). One does not touch or see the facts reported in the public record; one only touches or sees what is _written_ (etc.) in the public record, and what is written in reports referring to the public record (etc.).

There is no divorce from reality as such in any case. A (hypothetically rational) conspiracist accepts what is written in the public record, thus accepting reality (so far, just like you or I), but does not infer (unlike you or I) that the facts referred to in the public record are generally true events.

This is not out of lack of logic (again, take a hypothetical rational conspiracist) but out of a working assumption that the record is unreliable or deliberate misinformation. Nothing in the public record contradicts the working assumption (how could it?), therefore this assumption is not revised.

--
$E(X_t|F_s) = X_s,\quad t > s$

by martingale on Sat Jan 24th, 2009 at 04:52:06 AM EST
[ Parent ]
It still falls to Occam's Razor. While not a part of formal logic per se, it certainly is a part of what I'd consider a rational and reasonable mindset.

- Jake

Friends come and go. Enemies accumulate.

by JakeS (JangoSierra 'at' gmail 'dot' com) on Sat Jan 24th, 2009 at 04:54:59 AM EST
[ Parent ]
The question I find intriguing is not the logicality or otherwise of a set of beliefs, but their functionality.  Why is it in someone's interests to believe (say) that science is all a conspiracy?  I can see why climate change deniers don't want to have to give up their SUVs, but belief systems are more difficult to explain in those terms.

The importance of wanting to belong to a tightly knit group - ideally contra-defined against a hostile world - will obviously appeal to a paranoid mindset, but it doesn't explain the content of the beliefs that the group holds dear.

notes from no w here

by Frank Schnittger (mail Frankschnittger at hot male dotty communists) on Sat Jan 24th, 2009 at 05:35:27 AM EST
[ Parent ]
sorry other belief systems

notes from no w here
by Frank Schnittger (mail Frankschnittger at hot male dotty communists) on Sat Jan 24th, 2009 at 05:37:20 AM EST
[ Parent ]
*That* is a good question. I doubt there are universal answers, and you've already said that much anyway.

For the case of denying science, I think that people sometimes get carried away. Science gives absolute answers, but only on a highly restricted set of questions. There is a discipline in not answering questions whose answer is unknown, and by extension, not asking questions whose answer is expected to be unobtainable. Many people cannot or won't accept this discipline, and prefer to complete their knowledge on the "big" questions with beliefs rather than leave some questions unanswered.

Which leaves a fascinating ancillary problem: where do the "big" questions come from and why won't they go away? I suspect that kids don't come up with these questions on their own, but rather absorb them and their "importance" from contact with adults, which leads to pressure to resolve them.

--
$E(X_t|F_s) = X_s,\quad t > s$

by martingale on Sat Jan 24th, 2009 at 06:06:57 AM EST
[ Parent ]
Frank Schnittger:
 Why is it in someones interests to believe (say) that science is all a conspiracy.

Firstly, they enjoy the drama. Worrying that the world is going to end makes life more exciting than the day job.

Secondly it 'proves' that they're not really as stupid and powerless as science makes them feel.

Also, it's very rare for hardcore CT followers to be even slightly literate in basic science. Facts and paranoia look indistinguishable to them, because they don't have the background to tell them apart.

See this thread for a depressing example.

by ThatBritGuy (thatbritguy (at) googlemail.com) on Sun Jan 25th, 2009 at 07:10:54 AM EST
[ Parent ]
My point is that the "content" - such as it is - may very well be amplified noise. Reasonable hypotheses come and go. Neuroses accumulate.

- Jake

Friends come and go. Enemies accumulate.

by JakeS (JangoSierra 'at' gmail 'dot' com) on Wed Jan 28th, 2009 at 11:27:02 AM EST
[ Parent ]
