Wednesday, December 08, 2004

Lowering Information Quality Expectations

This morning I was talking with a friend in the SIMS lounge, discussing the ideas behind the final paper that I've been writing with Dave. At one point, he stopped me and asked if I could help him solve a quality-of-information problem. He is one of the contributors to a group re-blogging site. Apparently, many of the visitors to the site are upset when they find out that the contributors do not verify the information posted there--they simply find interesting news, ideas, or projects and post excerpts of other authors' work, with attribution and a link to the original text.

Perhaps the site looks too professional, or perhaps external content is not clearly labeled--but I read the site, and it seems clear enough to me. It may be that readers trust the contributors to post only 'quality' information and to filter out 'rotten' information. The issue is not that people cannot tell the difference between the two--if it were, they wouldn't be complaining--it is that the information gains credibility simply by being posted on the site.

How do we lower the expectations of readers on such a site? The contributors still want to be able to post information that may be based on rumors, just to show people what is out there, but they do not want to give the impression that everything is of the same 'quality.' Ideas?


Blogger Jono said...

That's an interesting 'reverse' quality problem. I think people regularly experience a similar problem when they read a professional-looking homepage--say, when prospective students are looking at research lab homepages. Very often, in my experience, these pages are simply not updated regularly and therefore reflect old views or outdated information. One of the problems is that you, as the web page owner, do not know what effect your page is having, so you do not know whether to act on it.

For example, maybe many students don't apply to a university lab because of out-of-date information on its webpage, when actually the lab is very dynamic--its members just don't pay much attention to their web presence.

Some sort of feedback would be great--the very fact that he heard from these users is already a good start--but a simple notice that the posted content is 'not verified' may also help (though it may turn away those users...).

3:56 PM  
Blogger Paul Duguid said...

This comment has been removed by a blog administrator.

6:58 PM  
Blogger Joseph Lorenzo Hall said...

One way of monitoring whether a page is being read is traffic analysis... how does someone not experienced with the intricacies of this subject pull it off? Well, unbeknownst to many, I put a traffic-analysis tool on this blog shortly after it launched. See the little planet off to the right? Click on it... it will give you summary and detailed statistics about this blog's readership.

1:13 PM  
