Friday, September 10, 2004

Bad information is relative (or Congress is stupid)

The subject of this post comes from Robert Park's book Voodoo Science, a short and fun book about how bad science gets done and the effects that bad science has on all of us. You're probably familiar with the story of cold fusion, so I thought I'd try this story instead.
Joe Newman invented an "energy machine," a device that supposedly transformed its own matter into energy. In short, he was claiming that the 500-pound machine was a perpetual motion machine. The nice folks at the patent office rejected his application based on their long-standing rule of refusing perpetual motion machine applications unless a working model ran for a year at their office. Of course, perpetual motion machines can't exist thanks to the laws of thermodynamics, but let's ignore physics for now.
Newman had a very flattering piece about his work aired on CBS in 1984, which called him "the man who stumped the scientists" or something similar. Following this and his failed patent application, Joe turned to his state's senators, who then called a Congressional hearing to sort out why he couldn't have a patent for the device. Newman was doing fine until John Glenn -- yes, the former astronaut, then a Senator from Ohio -- asked him this question:
"It's a simple enough problem, you measure the input and you measure the output and you see which is larger. Would Mr. Newman agree to that? If he does, what laboratory would he like to have make the measurements?"
Newman said no, he wouldn't like the test. For any scientist, or any person of reasonable logical capacity, this would probably be enough to disprove Newman's claims. However, this was a Congressional hearing and not a panel of scientists, and the testimony continued. Fortunately, the hearings eventually revealed that the special master appointed by the patent office to review Newman's case had formerly been his patent attorney. The hearings quickly dissolved after that.
We can learn several lessons from this example. First, Congress might not know much about science, but they know a conflict of interest when they see one. In the context of our class, bad information is relative. Saying "perpetual motion machine" is enough to convince any scientist that this is bad science; everyone else was convinced only after learning about the conflict of interest.
Next, where were the scientists to rip this man to shreds? Perpetual motion machines can't exist, and any scientist should know that. Rather than berate the man and pick his invention apart for the junk it is, they just kept on working, hoping the invention would quickly be revealed as fraudulent. I guess scientists don't understand Congress -- or better yet, scientists don't understand the scientific knowledge of the average person.
Finally, the media, CBS specifically, are just as culpable as everyone else in this case. Twice they aired Newman's story, and neither time did they portray him as the fraud that he is. Regardless of how nice he is or how much he believes in his invention, you would like to think that the media would call his bluff rather than spread misunderstanding about perpetual motion. I suppose I shouldn't put that much faith in the media.
I can only speculate about this, but people want to believe in bad information. Whether it's perpetual motion, ESP, UFOs, or Saddam/al-Qaeda ties, individuals can suspend all logical and reasonable thought to jump to the incorrect conclusion. Almost no effort goes into debunking the myths people believe or disproving blatantly wrong information. What does this mean for the information we find on the Internet? Or the results of polls and other statistics? Or for Joe Newman and his 500-pound piece of garbage?


Blogger Judd said...

Congress isn't alone in struggling with the issue of 'rotten information.' The 1993 case Daubert v. Merrell Dow Pharmaceuticals set up judges as 'gatekeepers' for scientific information in an effort to combat the increasing use of 'experts for hire' who would just say what they were paid to say, turning cases into a 'your word against mine' sort of thing. (See this article from Investor's Business Daily for a nice review of the issues.)

The great thing about Daubert is that it pulls back the veil of scientific authority a bit, and recognizes that people (not just scientists) misuse information and abuse the popular notion of 'empirical' information. I'm always wary of the person who says something is a 'fact' and therefore beyond discussion and interpretation!

3:29 PM  
Blogger yardi said...

I agree with schloss' assessment that bad information is relative. This subject relates to one of the readings for this week, Sokal's "A Physicist Experiments with Cultural Studies." In that piece he describes writing a bogus scientific paper that was published in the journal Social Text. His point is that academics (particularly those on the Left) are lazy and choose to accept social and political constructions, especially if they are written in "obscure and pretentious language," even if they are completely false.
I think this brings up an interesting view of quality of information, in which assigning a level of quality to something becomes inherently subjective. For some, information may be very valuable; to others, it is worthless. Should information always have to be assigned an absolute value of "good" or "rotten"?
Sokal's paper isn't such a good example of this, nor is schloss' example, because they are both proved wrong by physics. But what if we reversed the scenario and had a social scientist write a bogus paper and had a bunch of physicists read it -- would they find it to be worthless? Probably not.
My undergrad major was in computer engineering, so I have a feeling that if Paul and Geoff assigned such a reading, I probably wouldn't catch it. Maybe I am just trying to justify my own weaknesses here. :)

11:27 AM  
Blogger Joseph Lorenzo Hall said...

One other thing to consider: there are many disciplines these days where there are only a few well-qualified people to judge the quality (or lack thereof) of something. For example, imagine a paper on quantum computing or string theory... these subjects are so complex that 95% of the time anyone is talking about them, the person doesn't know jack shit about the subject, only high-level details or features (I made up the 95% figure).

11:37 AM  
Blogger Paul Duguid said...

In response to Yardi, I am as confident as he is that if he were to post a good computer science hoax, it is as likely as not that I would be unable to detect it.

I should add that Geoff and I have not--at least to the best of our knowledge--included any papers that are hoaxes.


2:35 PM  
Blogger Geoff Nunberg said...

Speak for yourself, Paul.

My own experience of Daubert has been a little depressing. I once wrote a letter, pro bono, on behalf of a woman who found herself in a custody case in which the other side had hired a linguist to testify as to the authorship of some letters. In my view the guy was a bogus quack (on second thought, strike the "in my view"), and I said so in the letter, but when last I heard his testimony had been accepted -- presumably since he had a PhD. And I've been the target of a Daubert challenge myself, though it was never actually taken to court, presumably since I have a PhD myself.

In fact these cases often come down to a question of duelling credentials, leaving the court free to make up its own mind. This was the fate of the case I was involved in when I served as expert for the Indians who were challenging the Washington Redskins' trademark on the grounds that the Lanham Act prohibits the registration of marks that are "disparaging." We assembled a pretty impressive brief to show that "redskin" was a disparaging term, if I say so myself, but in overturning the TTAB ruling that threw out the mark, the DC district court ruled that since there were experts on the other side who claimed it was a respectful synonym for Indian, the status of the word was uncertain.

Over and above the legal problems, these disputes underscore the difficulty of evaluating expert authority in the general case. When you look at the press discussions of global warming, say, you often see "experts" lined up on each side by a press that's steeped in the ideology of "balance." (There was a study of press treatments of this subject I heard someone talking about on NPR the other day, but I haven't been able to put my finger on it.) And the problem seems to be exacerbated on the Web. Anyway, this is something to bear in mind when we come to talking about Lippmann a few weeks from now.

9:28 PM  
Blogger Catherine Newman said...

I was particularly struck by the discussion of transparency in class last week. If I remember correctly, we spoke briefly about trusting sports scores but at times completely dismissing information based on its source -- a bit of boy-who-cried-wolf syndrome. In my view, if society were better able to gauge the level of transparency associated with an information source, we might not suffer the social betrayal we discussed in class. While I can’t remember the case mentioned in class, the Tuskegee Syphilis Experiment seems like an example of that level of betrayal.

So to say that bad information is relative -- relative to what? To the truth? To who is saying it (even if the government is saying it)? To how many people are saying it? I’ll argue that it has to be a combination of factors, greater than one, for an individual to gauge their level of confidence in a piece of information.

Lastly, if basic statistics were taught in high school along with basic banking and how to pay your taxes, people wouldn’t have so much faith in those opinion polls we hear about every day.

8:05 PM  
Blogger Calvert Jones said...

I’d like to know what everyone here thinks should count as “rotten” information, because I think we would have to be in agreement on that definition in order to discuss whether the quality of such information is relatively assigned. Is bad information to be understood, by definition, as useless, constructed with the intent to deceive (consciously or subconsciously, regardless of whether it’s “for a good cause”), simply inaccurate, repellent and coercive, or a combination of all of these things? I’m curious, because Yardi’s comment (“For some, information may be very valuable; to others, it is worthless”) suggested that “rotten” information could simply be defined as information that is useless.

I don’t agree that assigning a level of quality to information is inherently subjective. Although I do think such assessments can be subjective, I would not say that bad information is relative, as Schloss suggested in the original post. The definition seems crucial here – if we consider information that is simply useless to us in this moment to be “rotten” information, then that would be a subjective judgment call. But I believe that plenty of information is just plain bad – objectively bad – regardless of whether you recognize it as such. For example, someone who claims that A and B entail C, when they do not, would, in my opinion, be spreading objectively rotten information. But suppose I was under the false impression that B=C, and decided to believe this person. Does that make bad information relative? No, I don’t think so. In this case, I would believe that the information was good, but in fact I would be wrong. The information, being logically invalid, would be objectively bad. If only life could be as simple as elementary logic. But still, just because I couldn’t see the bad logic at first does not make bad information relative; it just means that we are easily duped into thinking patently bad information is good. So the media frenzy over the perpetual motion machine would, in my mind, also count as objectively bad information.

In response to Yardi’s question, “Should information always have to be assigned an absolute value of ‘good’ or ‘rotten’?”: No, not always – we are undoubtedly arriving at many of our conclusions regarding the quality of information subjectively, but I do think that there are many cases where we can assign quality objectively, or at least close to objectively, provided we have training in logic, principles of reasoning, and a critical eye.

10:37 PM  
Blogger laura said...

Not to beat a dead horse, but I, too, am interested in the idea of the relativity of "rotten" information, especially when we're talking about hoaxes. I'm trying to figure out where the line is between "good" and "bad" hoaxes. Take The Onion, for example. It's great, because we're all in on the joke... but does it become a "bad" hoax when someone uses it as an authoritative source, like the reporter for the Beijing Evening News did? Wired News ran a story about that incident, and about other people who believe The Onion, and why (with links to further stories about hoaxes).

(Is this issue compounded because the reporter plagiarized The Onion? Layers upon layers of rotten information!)

One of my favorite places is the Museum of Jurassic Technology in Los Angeles.

The entire place is a hoax... or mostly so. Some exhibits contain "true" information, and many of the exhibits are on such obscure (and boring) topics that the end effect is that the observer often doesn't care whether or not any of it is true. For many people, though, who are unaware that the museum is a hoax, the place is a nice afternoon diversion through a series of natural-history exhibits. Is this bad and misleading? Is it not as bad, since the museum's visitors aren't likely to make use of the information in any way that would then mislead further people? Or is it worse, because some people are in on the hoax and some aren't, leading to a sort of class system of information haves and have-nots? One can make all sorts of arguments about art and assumptions of authenticity in museums and relativity of information and skepticism, but some would say that in the end, the museum is full of lies and intentionally deceives people. Where is this line?

6:30 PM  