Bibliometrics is a fairly young scientific discipline, defined as the “application of mathematical and statistical methods to books and other media of communication”. It is most often applied to scholarly publications in order to investigate communication in science. In this context, the most important component of a scientific work is its citations: the works it cites, and how often it is cited by other scientific publications.
The usual presumption is that the more citations a work receives, the more impact it has, and impact in turn is most often interpreted as high performance and high quality.
But let's look at some numbers. In polywater research, the average number of citations per journal publication for the years 1964–1972 is 8.44; in pulsar research, it is 8.50 for 1968–1969. Judged by citation behaviour alone, these fields seem comparable in quality over the given periods.
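The weakness of such a comparison can be made concrete with a small sketch. All citation counts below are invented for illustration (only the averages 8.44 and 8.50 come from the actual literature analyses): two sets of publications can have nearly identical means while their underlying distributions look very different.

```python
# Sketch: two hypothetical fields with (almost) the same average number of
# citations per publication, but very different citation distributions.
# The counts are invented purely for illustration.

def average_citations(citation_counts):
    """Mean number of citations per publication."""
    return sum(citation_counts) / len(citation_counts)

# Invented counts: a skewed distribution (a few heavily cited papers)
# versus a fairly uniform one.
field_a = [1, 3, 30, 2, 5, 2, 12, 4, 8, 17]
field_b = [6, 9, 10, 7, 8, 9, 11, 8, 8, 9]

print(average_citations(field_a))  # 8.4
print(average_citations(field_b))  # 8.5
```

The averages are nearly equal even though most papers in the first set are barely cited at all, which is one reason a single per-publication mean says little about the quality of a field.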
The claim that something like polywater (meaning polymerized water) exists turned out to be completely wrong, though, and many scientists had suspected all along that the results were due to contamination and outrageous claims. In the case of polywater, the number of citations per publication therefore gives a very distorted picture of the quality of the science that was conducted.
The problem this case makes clear is that we don't really know what a “citation” is. In what way can the number of citations reflect the impact and quality of science? Citing somebody does not happen in a vacuum; it is always a social act, something scientists learn from their supervisors, and something that depends on the “scientific culture” they work in.
Given how much the distribution of funding in science depends on citation counts, this is, in my opinion, an extremely urgent question to answer.
More information on polywater can be found on Wikipedia:
Analysis of the polywater literature:
— Leonie Mueck