That would explain in no small part why people who hold weird ideas are attracted to other people who hold weird ideas, irrespective of their consistency: Neither is capable of telling that the other is spouting horse manure, and both are opposed by the same kinds of people; that is, pretty much everybody who isn't crazy. The enemy of my enemy is my friend - especially when nobody else wants to be...
- Jake -- Friends come and go. Enemies accumulate.
And weird ideas don't come and go - they accumulate: Having no basis for sifting the nonsense from the sense, dismissing any part of the dogma runs the risk of alienating part of the group. And since consistency isn't terribly important to this kind of mindset, it's easier to simply insist on all the dogma that the group has happened to pick up over the decades.
That's my amateur psychology anyway...
Here's an example: person B goes on TV to claim that a certain dictator H has nukes. Viewers U and V initially have no opinion on the matter, but U considers B a lying windbag who can't be trusted, while V considers him an honourable person deserving much respect.
After watching the broadcast, U believes (as he did before) that B has deliberately lied, so the opposite of his claim must be more likely, i.e. H probably has no nukes. However, V respects B (as he did before) and now believes that H probably has nukes.
Logic on its own is not good enough for convergence of beliefs. You have to take into account the full set of prior beliefs of a person, which is why it's a waste of time to seriously argue religious people or conspiracy nutters out of their delusions, for example.
Scientists tend to believe that logic and critical thinking alone brings convergence, but that is only so because they all have highly compatible prior beliefs on everything that matters in their work, obtained from highly similar educational backgrounds. -- $E(X_t|F_s) = X_s,\quad t > s$
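For what it's worth, the U/V story above is just Bayes' rule applied with different priors about B's honesty. A minimal sketch (all numbers invented), assuming a deliberate-liar model in which a truthful B asserts the nukes only if they exist and a lying B asserts them only if they don't:

```python
# Hypothetical model of the U/V example: hypothesis N = "H has nukes".
# A truthful B claims N only when N is true; a deliberate liar claims N
# only when N is false. trust_in_b is each viewer's prior that B is truthful.

def posterior_nukes(prior_nukes: float, trust_in_b: float) -> float:
    """P(N | B claims N) by Bayes' rule under the liar model."""
    p_claim_if_nukes = trust_in_b            # truthful B reports the truth
    p_claim_if_no_nukes = 1.0 - trust_in_b   # lying B reports the opposite
    numerator = p_claim_if_nukes * prior_nukes
    return numerator / (numerator + p_claim_if_no_nukes * (1.0 - prior_nukes))

# Both viewers start with no opinion on the nukes (prior 0.5) but differ on B:
print(posterior_nukes(0.5, trust_in_b=0.1))  # U: 0.1 -> H probably has no nukes
print(posterior_nukes(0.5, trust_in_b=0.9))  # V: 0.9 -> H probably has nukes
```

Same broadcast, same logic, opposite conclusions: with a flat prior on the nukes themselves, each viewer's posterior is exactly his prior trust in B.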
I didn't believe that H had nukes because a workable WMD program is almost cripplingly expensive, even for a functional economy, and after GW I, Country I certainly didn't have a functional economy.
Also, B clearly was a lying windbag. But that was a suggestive data point, not proof.
Scientists believe critical thinking brings convergence because beliefs are independently checked and verified and supported by abstract reasoning. Every so often this goes horribly wrong, but the system works - within its limits - at least as well as any other kind of intellectual attempt to understand the world.
Religious people and paranoid exploding-earth nutters are swayed almost entirely by their emotions and don't do logic at all. 'I feel it' isn't much of an explanation for anything, which is why you can't argue with it rationally. Meanwhile the paranoid nutters enjoy drama and fear for the sake of it.
The broad split is between people who pay attention to facts outside themselves and people who only pay attention to their feelings. The Pope seems to be one of the latter - he'd rather defend an irrational faith with irrational acts than engage with historical reality.
You cannot claim that the above isn't rational just because you disagree with the limited scope of the universe of discourse in that example. In your own example, the fundamental possibility of divergence remains:
Unless you happen to be a WMD scientist, all of your assumptions about H and his country were derived from interpreting media reports available to you, together with the meta-assumption that these reports were not all outright lies and misinformation. A conspiracy theorist could read the same media as you, but because he assumes that they are outright lies, he will end up with a rather different interpretation. Yet both of you would be exposed to the exact same facts in writing, and both of you would be rational; you only differ in a highly influential assumption.
Two people who are entirely rational but whose prior assumptions differ can both interpret common facts logically, and end up disagreeing even more afterwards. -- $E(X_t|F_s) = X_s,\quad t > s$
The conspiracy theorist is deficient in critical thinking skills if he assumes that all media reports are outright lies. It violates Occam's Razor, which is a pretty basic tool for critical thinking. And lying about everything is plain stupid. You only lie about the important things, because the more you have to lie, the easier it is to slip up and build in an inconsistency that's a little too glaring.
Reasonable people arguing in good faith can, and frequently do, reach widely divergent conclusions. But there are some constraints on what kind of conclusions they can reach. And most cults like the one under discussion are clearly on the "divorced from reality" side of that line.
Occam's Razor here reduces to a prior belief that the USA generally tells the truth, versus no such belief. Think of an American, a European, a Russian, and a Chinese. -- $E(X_t|F_s) = X_s,\quad t > s$
Sure, you can assume that the US government lied about not helping, but is now telling the truth about the existence of the nukes. But this seems to be a contradiction: The US government is full of shit when it makes a self-serving statement about not helping nasty people get nukes. But it's a model of honesty when it makes a self-serving statement about nasty people having nukes.
You could then elaborate the assumption by noting the change in management in the US in between those statements. But this can be challenged by noting that if this new, more truthful management actually was serious about the whole truth thing, they could just release the documentation proving that the previous management had aided the nasty people. Then the previous management would have egg on its face and the case would be open-and-shut.
Of course, it's possible to elaborate the ad hoc assumption further with another ad hoc modification. But there is a limit to how many ad hoc assumptions you're permitted to stack on top of each other before you've left the realm of logic and reason and entered the realm of narratives. It's not a hard limit by any means, but it is there somewhere, and conspiracy theorists usually sail right past it within the first two or three paragraphs...
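One way to see where that limit comes from: each ad hoc patch is an extra assumption with probability below one, so the credibility of the whole patched story decays geometrically as patches are stacked. A toy sketch (patch probabilities invented, and generously set at 50% each):

```python
# Illustrative only: treat each ad hoc patch as an independent assumption
# that is, generously, 50% likely on its own. The probability that the
# whole patched story is true shrinks with every patch added.

patches = [
    "the old management lied, but the new one is honest",
    "the honest management nevertheless withholds the proving documents",
    "it withholds them for reasons that are themselves secret",
]
p_each = 0.5
p_story = p_each ** len(patches)
print(f"{len(patches)} ad hoc patches -> P(story) <= {p_story:.3f}")  # 0.125
```

Nothing magic about 50%; the point is only that the prior cost compounds with every patch.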
It most certainly was a proof:
Assumption: B lies.
Fact: B claims H has nukes.
Conclusion: B's claim is not true.
Uh, no. This only follows logically if B lies all the time. In 2003 that was looking likely, but not certain, and certainly not proof of anything.
It turned out in retrospect that he lied maybe 95% of the time. But that fact wasn't available in 2003.
A rational person can still disagree with you by disagreeing with some of your unstated underlying assumptions: say, that H had his nukes only partially built rather than fully, or that he had them smuggled in, or that a workable program is much less expensive than you believe, etc.
But there was absolutely no evidence to support any of those claims.
You're sounding like the people who said that Saddam really did have WMDs but... they were smuggled to Syria, which is why they were never found.
There's a vast uncrossable gulf between that kind of narrative logic, in which anything goes as long as it sounds vaguely plausible, and evidence-based argument, which requires a decent data set to argue implications from.
Unless you happen to be a WMD scientist, all of your assumptions about H and his country were derived from interpreting media reports available to you, together with the meta-assumption that these reports were not all outright lies and misinformation.
That and reading books and comments by the UN weapons inspectors, who might reasonably be expected to have a more accurate picture than the media.
In fact the media were spectacularly wrong and generally supportive of the party line, so there was no meta-assumption needed.
I assumed the primary sources - which were freely available to anyone - were more accurate than the media reporting.
Uh, no. This only follows logically if B lies all the time.
The only way anybody at the time could claim that it wasn't obvious that the UK government was lying was by weighting the USUK statements 99%, and all other statements 1%, say. That's actually quite reasonable for British people in general to do, on the grounds that they'd have to become paranoid otherwise, but non-anglophones had no such conflict of interest.
Which nicely again illustrates my point about unstated assumptions leading to divergence. Anybody who placed even 50% weight on USUK statements and 50% weight on statements from other sources essentially had to consider B a liar, simply due to the large number of contradicting claims of fact by other independent sources.
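To put a number on that: treat it as a weighted pooling of the evidence from each camp (a log-odds sketch; the likelihood ratios and source count below are invented for illustration):

```python
import math

def logit(p: float) -> float:
    return math.log(p / (1 - p))

def prob(x: float) -> float:
    return 1 / (1 + math.exp(-x))

# Invented numbers: the USUK claim alone would push the log-odds of nukes
# up by ~+4.6 (odds x99); each of 8 independent contradicting sources
# pushes them down by ~-2.2 (odds /9). The weight w scales how seriously
# the USUK camp is taken relative to everyone else.
usuk_evidence = logit(0.99)        # ~ +4.6
others_evidence = 8 * logit(0.10)  # ~ -17.6 in total

def posterior(w_usuk: float, prior: float = 0.5) -> float:
    x = logit(prior) + w_usuk * usuk_evidence + (1 - w_usuk) * others_evidence
    return prob(x)

print(posterior(0.99))  # ~0.99: only near-total trust in USUK rescues B
print(posterior(0.50))  # ~0.002: at equal weight, B reads as a liar
```

The sheer number of independent contradicting sources is what does the work here: at anything close to equal weighting, their combined evidence swamps the single official claim.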
It's true that it is possible to start out from different assumptions and, using perfectly valid logical syntax, reach widely diverging conclusions. But for that to qualify as reasonable, the assumptions have to be not too divorced from reality.
The key distinction here is whether this divorce from reality is caused by lack of information or by lack of critical thinking skills. The former is a lot easier to cure than the latter.
but whether he has a history of lying or not is a matter of public record.
There is no divorce from reality as such in any case. A (hypothetically rational) conspiracist accepts what is written in the public record, thus accepting reality (so far, just like you or I), but does not infer (unlike you or I) that the facts referred to in the public record are generally true events.
This is not out of lack of logic (again, take a hypothetical rational conspiracist) but out of a working assumption that the record is unreliable or deliberate misinformation. Nothing in the public record contradicts the working assumption (how could it?), therefore this assumption is not revised. -- $E(X_t|F_s) = X_s,\quad t > s$
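In Bayesian terms the working assumption is unfalsifiable by construction: if the record looks the same whether it is honest or planted, the likelihood ratio is 1 and the prior passes through untouched. A sketch (numbers invented):

```python
# Hypothesis: "the public record reports true events". The conspiracist's
# model says a planted record is built to look exactly like a genuine one,
# so any observation of the record is equally likely under both hypotheses.

def update(prior: float, p_obs_if_honest: float, p_obs_if_planted: float) -> float:
    num = p_obs_if_honest * prior
    return num / (num + p_obs_if_planted * (1 - prior))

# Equal likelihoods -> posterior == prior, however much record is read:
print(update(prior=0.05, p_obs_if_honest=0.8, p_obs_if_planted=0.8))  # 0.05
```

Which is exactly why nothing in the public record contradicts the working assumption: the model was chosen so that nothing could.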
The importance of wanting to belong to a tightly knit group - ideally contra-defined to a hostile world - will obviously appeal to a paranoid mindset, but it doesn't explain the content of those beliefs that the group holds dear. notes from no w here
For the case of denying science, I think that people sometimes get carried away. Science gives absolute answers, but only on a highly restricted set of questions. There is a discipline in not answering questions whose answer is unknown, and by extension, not asking questions whose answer is expected to be unobtainable. Many people cannot or won't accept this discipline, and prefer to complete their knowledge on the "big" questions with beliefs rather than leave some questions unanswered.
Which leaves a fascinating ancillary problem: where do the "big" questions come from and why won't they go away? I suspect that kids don't come up with these questions on their own, but rather absorb them and their "importance" from contact with adults, which leads to pressure to resolve them. -- $E(X_t|F_s) = X_s,\quad t > s$
Why is it in someone's interest to believe (say) that science is all a conspiracy?
Firstly, they enjoy the drama. Worrying that the world is going to end makes life more exciting than the day job.
Secondly it 'proves' that they're not really as stupid and powerless as science makes them feel.
Also, it's very rare for hardcore CT followers to be even slightly literate in basic science. Facts and paranoia look indistinguishable to them, because they don't have the background to tell them apart.
See this thread for a depressing example.