Okay, last post before I go back to work. Following up on the brain vs. mind post, here's Ben Goldacre talking about an interesting 2008 article measuring how much adding bad neuroscientific references makes poor arguments sound more authoritative:

A set of experiments from the March 2008 edition of the Journal of Cognitive Neuroscience elegantly shows that people will buy into bogus explanations much more readily when they are dressed up with a few technical words from the world of neuroscience. Subjects were given descriptions of various psychological phenomena, and then randomly offered one of four explanations for them: the explanations either contained neuroscience or didn't, and they were either good explanations or bad ones (bad ones being, for example, simply circular restatements of the phenomenon itself).

Here is one of their scenarios. Experiments have shown that people are quite bad at estimating the knowledge of others: if we know the answer to a piece of trivia, we overestimate the extent to which other people will know that answer too. A "without neuroscience" explanation for this phenomenon was: "The researchers claim that this [overestimation] happens because subjects have trouble switching their point of view to consider what someone else might know, mistakenly projecting their own knowledge on to others." (This happened to be a "good" explanation.)

A "with neuroscience" explanation – and a cruddy one too – was this: "Brain scans indicate that this [overestimation] happens because of the frontal lobe brain circuitry known to be involved in self-knowledge. Subjects make more mistakes when they have to judge the knowledge of others. People are much better at judging what they themselves know." The neuroscience information is irrelevant to the logic of the explanation.

The subjects were from three groups: everyday people, neuroscience students, and neuroscience academics. All three groups judged good explanations as more satisfying than bad ones, but the subjects in the two non-expert groups judged that the explanations with logically irrelevant neurosciencey information were more satisfying than the explanations without. What's more, the bogus neuroscience information had a particularly strong effect on people's judgments of bad explanations. As quacks are well aware, adding scientific-sounding but conceptually uninformative information makes it harder to spot a dodgy explanation.

None of this will come as any surprise, but it's still nice to see it studied.

This is not to argue that neuroscientific research should be of no interest to people who want to improve their minds. It is to argue that equating the brain with the mind, and equating scientific knowledge with the far more complex reality of living your life and using your mind, leads you to assume that the science can help you more than it yet can.

For some reason I'm reminded of the line in Woody Allen's Manhattan, where someone laments, "It took me 18 years to have my first orgasm, and my gynecologist told me I'm having the wrong kind." Likewise, let's say an fMRI machine reveals you're using the "wrong" part of your brain when you do something you're really good at. Do you try to retrain your brain? Probably not.