Journal Impact Factors - a good free tool

Recently, I've taken on more consulting work outside my own immediate area. A free impact factor tool has been incredibly handy. Here's why.

Getting to grips with a new system (well, on some level at least) is a bit exciting and not a little empowering, too - like the first time you really understood crystallisation as a kid (remember those copper sulphate crystals in the jar?).

The problem is that journals, in my book, always fall into four categories:

  1. Top level ones like Science, Nature, PLoS and PNAS,
  2. Reviews and stuff that are usually a good place to start,
  3. Key articles in specialist journals, and
  4. Crapola which you don't need to bother with (to start with, at least).

The trouble is, while 1, 2, and 4 pretty much find themselves, working out which journals to look in for the specialist stuff when you start in a new field is pretty hard. For instance, the Journal of General Virology, Journal of Virology, and Virology all deal, obviously, with viruses and their biology... but which is the most authoritative?

The Impact Factor

If you've trained as a scientist, you're probably sagely muttering 'impact factor'. If you've ever worked as one, you're probably screaming it.

So what is this 'impact factor'? Sounds like something to do with ballistics. Basically it's a measure of the amount of influence a given published scientific article has on other articles. Since an article's authors reference (or 'cite') other articles from all kinds of journals and books for background information and to support their own assertions, it follows that an article considered to be important to professionals in a field will be cited more frequently than an irrelevant one.

So good articles are cited more frequently. That helps us find them (citation databases will tell you how many times a given article has been cited). And it's a fairly simple matter to aggregate the mean number of citations per article in a particular journal and express that as a ratio or percentile relative to others in its field. (You can also apply the same process to deciding whether or not to hire a particular scientist, if you're an institution or funding body - a nasty and growing trend which explains the screaming mentioned above...)
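To make that 'mean citations per article' idea concrete, here's a toy sketch of the classic two-year formula used by Journal Citation Reports: citations received in a given year to articles the journal published in the previous two years, divided by the number of citable items it published in those two years. All the figures below are invented for illustration.

```python
def impact_factor(citations_to_prev_two_years: int, items_prev_two_years: int) -> float:
    """Two-year impact factor: citations this year to the journal's
    articles from the previous two years, divided by the number of
    citable items it published in those two years."""
    if items_prev_two_years == 0:
        raise ValueError("journal published no citable items in the window")
    return citations_to_prev_two_years / items_prev_two_years

# e.g. a made-up journal: 1200 citations in 2010 to its 2008-2009
# articles, across 400 citable items published in 2008-2009
print(impact_factor(1200, 400))  # 3.0
```

The division is all there is to it - the arguments are entirely over what counts as a 'citable item' and which citations get included in the numerator.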

Use your judgement

There is a whole set of complex arguments about the best way to do that, and I won't go into them here - not least because, in my opinion, at the end of the day you should always use your own good professional judgement when evaluating an article's importance. No impact factor can fully do that for you. Ask yourself:

  • Do I actually know enough about the area yet to work out in general what the hell this article means, let alone if it's any good?
  • If the citations seem particularly high (or low) for this journal/authors/general quality of paper, am I missing something?

If the answer to the first question is 'no' you'd better go off and read a few more reviews...


Anyway, why am I so keen on this tool at the moment? Well, a few reasons really:

  1. It's free,
  2. It has good coverage, and
  3. It has a great search interface that's simple to use.

There are a few other useful things about their interface and data filtering, but for me those are the three main reasons. The 'free' thing's great, obviously. But I really like the coverage they have and search interface because it quickly lets me find my way into a subject - when you start typing a journal or discipline into the box it autocompletes for you really smoothly. Ace huh?

Applying this to our virology journals, we find that their impact factors differ quite widely:

  • J. Gen. Virol. - 3.092
  • J. Virol. - 5.332
  • Virology - 3.765
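As a quick sanity check, you can rank those three figures (as quoted above) with a couple of lines of Python:

```python
# Impact factors for the three virology journals, as listed in the post
impact_factors = {
    "J. Gen. Virol.": 3.092,
    "J. Virol.": 5.332,
    "Virology": 3.765,
}

# Sort journals from highest to lowest impact factor
ranked = sorted(impact_factors.items(), key=lambda kv: kv[1], reverse=True)
for journal, factor in ranked:
    print(f"{journal}: {factor}")
```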

So the Journal of Virology is the winner! Cool. Now I'm off to bone up on vif gene inactivation...
