This page written circa 8 November, 2009.
I was asked to review a paper about two months ago by Hindawi Publishing. They have an EE journal that works in reverse of the usual arrangement: you pay to have your paper published, but the journal is freely accessible. No idea how they came to know about me, but there they were. I struggled through this paper and was not overly impressed: a paper on CMOS design from a foreign-trained academic in New Zealand, purely theory and simulation, no fab, no measurements. Well, I guess you do what you can.
After more effort than I could really afford, I found a fundamental theory error. I returned my review, saying I would like other reviewers to confirm my opinion. I secretly suspected that nobody would have put enough effort into the review to have checked the proofs in the appendix, but I was also not certain that I was correct. About the reviewers I was right, but I had underestimated the integrity of the journal. The editor was great: he forwarded my review to the other two reviewers. A reviewer out of Western Australia promptly responded that I was right, and that this dashed much of the worth of the paper. More email traffic settled the matter, and I felt that all conduct had been fair, wise and timely. I have skipped over the maths in appendices myself before.
In the meantime, I had been writing up some work done by one of my students last year, work that has been carried on by another honours student this year and finished in September. It is a very nice piece of work on the design of a "gas gauge" for electric vehicles. Not revolutionary, but original, practical and well documented. I submitted the manuscript to an IEEE journal in which I have published before.
Fasten your seatbelts, this is where it gets exciting.
Within 24 hours I received an email reply from the relevant editor of this journal saying that it appeared my manuscript "does not have enough current journal references". A suspiciously quick response, I thought, but maybe they were right. We had done a literature survey, and a good page of the manuscript is a thorough review of what is out there and what the shortcomings of that work might be. However, people use an array of acronyms and descriptions, and it is possible we missed something.
I searched everywhere I could think of: Google, Xplore, the library database search tools. I was lucky in that I had a couple of days I could spend working alone at home, something that does not happen often. After a few days of scratching around and reading papers, I turned up only one paper that held any close relevance, but several journal publications and one conference publication in total that I could reasonably tie into our review without it looking like I was citing gratuitously. If the original response was based purely on counting citations (you have to type these numbers in during the submission process), another four journal ones ought to ring all the bells.
With hindsight I should have noticed that at least one of the new citations had a stupendously large number of references of barely-engineering relevance that tended to be cited in clusters. In any case, I resubmitted the paper, to a different section of the same journal, since it was not clear whether this was "metrology" or "sustainable systems". Well darn it if I don't get another rejection within 24 hours. A little work with a file-compare utility confirms that the rejection letters are absolutely identical, differing only in the date and a few digits of the submission number! You can read the letter if you like; it has the journal and the editor's name to satisfy your curiosity.
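For the curious, the comparison was nothing fancy. A minimal sketch of the file-compare step, with hypothetical filenames and letter text invented here for illustration:

```shell
# Two stand-in rejection letters (invented wording), identical except
# for the date line and the submission-number line.
printf 'Dear Author,\nDate: 12 October 2009\nSubmission: 09-1234\nYour manuscript does not have enough current journal references.\n' > letter1.txt
printf 'Dear Author,\nDate: 30 October 2009\nSubmission: 09-1302\nYour manuscript does not have enough current journal references.\n' > letter2.txt

# diff reports only the lines that differ; everything else is byte-identical.
diff letter1.txt letter2.txt
```

When the only output is the date and submission-number lines, the letters are form letters: nobody typed a fresh reply.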
Now in case you are thinking that maybe I am submitting something a bit dodgy, please feel free to check out the manuscript yourself. This is the version after the second wave of bibliographic expansion.
These guys are basically saying "we need you to cite more of our papers in order for our impact factor to stay huge, and when you manage that we'll publish your manuscript". They go into some detail on their website to explain how the citation of papers increases their impact factor, and the apparent worth of what they publish of yours to the men who hand out the money. They are dead right. This is brilliant! For so long it has been lawyers and doctors screwing the system, and we engineers have been left out, miserably maintaining our integrity instead of applying our intellects to juicing governments more completely. At last rigorous, quantifiable work has led to a solution. The rejections are machine generated for sure, so these guys don't even have to count references themselves. Breathtakingly elegant.
Is it a coincidence that the dean at UoW has recently started an amusingly tactless campaign to explain to all academic staff that they are expected to publish a certain amount (which they don't quantify), and that if one's output is low one is not performing satisfactorily? I think not! He's on board.
As I type this, a wicked idea dawns on me. I have been wasting my time reading these papers before I select and cite them; I only need to cite them. Publish-or-perish does not scan the text of articles, as far as I can see. How dumb could I have been?
The revised version seems to have now passed the entry requirements: 24 hours and no rejection.