Impact of Social Sciences – ‘Google Scholar is a serious alternative to Web of Science’
An interesting piece, challenging some of the assumptions about Google Scholar (some of those doubts I share). Of course, choosing among the alternative ways to count publications and citations is a bit like deciding how to be measured for your coffin.
Many bibliometricians and university administrators remain wary of Google Scholar citation data, preferring “the gold standard” of Web of Science instead. Anne-Wil Harzing, who developed the Publish or Perish software that uses Google Scholar data, here sets out to challenge some of the misconceptions about this data source and explain why it offers a serious alternative to Web of Science. In addition to its flaws having been overstated, Google Scholar’s coverage of high-quality publications is more comprehensive in many areas, including in the social sciences and humanities, books and book chapters, conference proceedings and non-English language publications.
The over-reliance on citations as a measure of scholarly impact seems to be a separate issue. Citations do tell us something. And for humanities and social science folks, especially those in fields where book publishing (monographs and edited collections) is a major part of the environment, the fact that Google Scholar also captures citations in book chapters and books is incredibly significant.
For most science disciplines and economics, this makes no difference. But not capturing citations in monographs and edited books makes citation data almost invalid (and certainly unreliable) as an indicator for many humanities and social science scholars.
Yes, this is important, and measures that don’t take this into account are certainly flawed. Google Scholar is not without its problems either: for me it includes citations to books I’ve co-translated or translations I’ve edited, which inflates my figures. More generally, I have a problem with how these numbers are used at all. However, if they are going to be used, and that seems inevitable, then it is doubtless important to get them as accurate as possible.
I have a lot of problems with how they are used too. I suspect one can use the inaccuracies of the data as one argument against their dominance, especially in disciplines where they are particularly problematic. And fight hard against the “but it’s the best data we have” response. Bad data is never better than qualitative assessment.
Of course the other major problem with how these numbers are used is timelines. The evaluation processes often happen at a point where it is unreasonable to expect any significant citation activity. After all, if a paper comes out today, it is going to take time for people to find out it exists + find time to read it + ponder the implications for their work + do the work it has influenced + write the paper + get the paper through peer review and revisions + get the paper out.
There may be shorter versions of some of those steps, but the most profound impact comes when a paper makes people rethink their research, and those outputs are going to be several years down the road. I suspect even the Google Scholar data, with the little citations-per-year graphs it produces, is good enough to make an argument of this type. I know, for example, that the peak of citations of my papers came 10 years after the last thing I ever published in a scholarly journal.
For more experienced scholars such as yourself, some of the timelines might be reduced because a number of people are already familiar with your work and actively seek out new publications. For early-career scholars, I suspect the timelines are longer.
All very good points – thanks! Putting preprints up online can help cut out the long delay at the beginning of the timeline.
I had a bit of an exchange with Anne-Wil: “The point about the ISI is certainly true. Some stellar work in the social sciences is not covered. It is, therefore, a poor measure. Pretty much everybody I know uses Google Scholar – particularly in the hope of finding a non-firewalled copy of the paper or chapter they need. Because the social sciences lag so far behind in many aspects of journal publication (selecting ethical outlets; prioritizing open access; avoiding APCs when they do publish OA), I have produced a listing of cheap or free and reputable journals in 6 fields of social science, with a more generic social science list as well. A lot is at stake if we do not act to reduce pressures on library budgets by ‘taking back’ publishing. https://simonbatterbury.wordpress.com/2015/10/25/list-of-open-access-journals/ ”