
Posted

An interesting new study reported by NPR:

 

http://www.npr.org/t...oryId=128490874

 

New research suggests that misinformed people rarely change their minds when presented with the facts -- and often become even more attached to their beliefs. The finding raises questions about a key principle of a strong democracy: that a well-informed electorate is best.

...

We'd like to believe that most of what we know is accurate and that if presented with facts to prove we're wrong, we would sheepishly accept the truth and change our views accordingly.

 

A new body of research out of the University of Michigan suggests that's not what happens, that we base our opinions on beliefs and when presented with contradictory facts, we adhere to our original belief even more strongly.

 

So perhaps Churchill's famous saying should be rewritten as "A lie gets halfway around the world before the truth gets its pants on, but the lie is wearing a rather nice minidress and showing lots of cleavage."

Posted

So perhaps Churchill's famous saying should be rewritten as "A lie gets halfway around the world before the truth gets its pants on, but the lie is wearing a rather nice minidress and showing lots of cleavage."

 

I thought that was Mark Twain. I see on Google that a similar one is attributed to Churchill, which makes me want to believe it was Twain all the more.

Posted

Several variations on the theme exist, but this seems the oldest.

 

Jonathan Swift, Thursday 9 November 1710

Few lies carry the inventor's mark; and the most prostitute enemy to truth may spread a thousand without being known for the author. Besides, as the vilest writer has his readers, so the greatest liar has his believers; and it often happens, that if a lie be believed only for an hour, it has done its work, and there is no farther occasion for it. Falsehood flies, and Truth comes limping after it; so that when men come to be undeceived, it is too late, the jest is over, and the tale has had its effect: like a man who has thought of a good repartee, when the discourse is changed, or the company parted: or, like a physician who has found out an infallible medicine, after the patient is dead.
Posted (edited)

Facts can be hard to sort out. The unwashed masses, coming home from a hard day at the smelter, may be experiencing 'decision fatigue' and falling back on whatever *simple* answer is available. Or whatever belief they already hold. Sort of what the NPR intro says...but, to me, they make it sound like people are willfully choosing (faulty) 'beliefs' over 'facts'.

 

Very interesting article though, thanks.

I'm not sure what to think of it.

 

It seems to somehow tie in to postmodern philosophy for me.

Or at least the armchair version I got from this book:

http://www.amazon.com/Reality-Isnt-What-Ready-Wear/dp/0062500171

I liked the book btw, weak chaired and arm sauced as it might be...

 

Anyways.

I'd really like to hear what else you all think about it.

If anyone is willing to risk public humiliation by actually stating an opinion of their own on here... as opposed to other people's quotes.

 

:D

Edited by bbrubaker
Posted (edited)

Well, from what I have seen, there are basically two different stances scientists take toward information outside their own field.

One is to readily acknowledge that the topic is far outside their expertise and to basically accept what an expert in the field is saying (though with the potential for subsequent revision once new information becomes available). This is essentially the same approach used to tackle a novel research question.

 

A second is to try to squeeze the information into a format that fits their own area of expertise, trimming where necessary. This allows them to evaluate it within a familiar context. This is more common among scientists who study very fundamental aspects of nature, e.g. theoretical physicists or applied mathematicians. Of course, the farther away the subject is (and the more complex the system), the more errors they make. From this group I have seen more examples of what is described in the OP (due to selective use of data).

Edited by CharonY
Posted

This discovery isn't really all that new. Not to say it isn't interesting, but social psychologists and economists have studied things like this for years. If I remember right, it comes from something like a mixture of cognitive dissonance and confirmation bias.
