26 February 2012

Buffing

There's nothing like buffing your CV, although - as Zepinic, Wilce and Papows discovered - there may be some embarrassment when people eventually ask inconvenient questions.

Reuters draws on two Obstetrics & Gynecology articles (alas not readily available at UC) in reporting that studies of applications to training programs in obstetrics in the US indicate that up to 30% of applicants took credit for research publications that could not be found.

Unsurprisingly -
"Our hope is that these are honest mistakes and not willful attempts to mislead," said Dr. Michael Frumovitz, a professor at the University of Texas MD Anderson Cancer Center in Houston, and lead author of one of the studies.

In a field where precision is important, "even if it's an honest mistake it's very troubling," he said.

Earlier studies have found that other specialties within medicine suffer from the same problem.

Anywhere from one to 30 percent of applications to training programs in radiology, emergency medicine, orthopedics and others include references to published research that can't be located by reviewers.

Frumovitz's team drew on all 258 applications to an MD Anderson gynecologic oncology fellowship program from 2004 to 2008, involving physicians who had completed medical school and residency training. Forty-four of the 148 doctors who indicated that they had published research findings included a reference to a publication that the team could not find.

A similar University of Washington study headed by Anne-Marie Amies Oelschlager noted that 357 of 937 applicants to a residency program in obstetrics and gynecology stated that they had at least one research study published or about to appear in a peer-reviewed publication.
When Amies Oelschlager's group went looking for those publications, 156 of the 1,000 publications listed could not be found.

They looked online, searched publication databases and even contacted the journals for verification.

Even among the publications that were confirmed, the researchers found inaccuracies.

The biggest error was that 62 applicants had listed a publication as "peer-reviewed" when it wasn't.
Oelschlager is reported as commenting that
The best you can assume is that these applicants didn't look up what peer review meant or they don't understand it. None of that is flattering and you worry whether they really understand the tenets of authorship, research, what is peer review and what is not.

Applicants might be deliberately padding their resumes to try and get a spot, and it's concerning. The whole thing about being a physician is that you are expected to be honest.
I wonder about buffed CVs for law schools and private/public practice.

'Unverifiable and Erroneous Publications Reported by Obstetrics and Gynecology Residency Applicants' by Simmons, Kim, Zins, Chiang & Oelschlager in 119(3) Obstetrics & Gynecology (2012) 498–503 aimed to "estimate the rate of erroneous and unverifiable publications in applications for an obstetrics and gynecology residency and to determine whether there were associated characteristics that could assist in predicting which applicants are more likely to erroneously cite their publications".

The authors indicate that -
This was a review of the Electronic Residency Application Service applications submitted to the University of Washington obstetrics and gynecology residency for the 2008 and 2009 matches. Publications reported to be peer-reviewed articles and abstracts were searched by querying PubMed, Google, and journal archives (first tier), topic-specific databases (second tier), and by e-mailing journal editors (third tier). Errors were categorized as minor, major, and unverified.

Five-hundred forty-six (58%) of 937 applicants listed a total of 2,251 publication entries. Three-hundred fifty-three applicants (37.7%) listed 1,000 peer-reviewed journal articles and abstracts, of which 751 were reported as published and 249 as submitted or accepted. Seven-hundred seventy (77.0%) publications were found by a first-tier search, 51 (5.1%) were found by a second-tier search, 23 (2.3%) were found by a third-tier search, and 156 (15.6%) were unverified. Of the 353 applicants listing peer-reviewed articles or abstracts, 25.5% (90 of 353) committed major errors, 12.5% (44 of 353) committed minor errors, and 24.1% (85 of 353) had articles or abstracts that were unverified.

Most applicants reported their publications accurately or with minor errors; however, a concerning number of applicants had major errors in their citations or reported articles that could not be found, despite extensive searching. Reported major and unverified publication errors are common and should cause concern for our specialty, medical schools, and our entire medical profession.