Yesterday, I blogged about new scholarship by Rob Willey, Melanie Knapp, and Ashley Matthews at George Mason University Law Library that explores how and why women are frequently underrepresented in law scholarly impact rankings and suggests alternative metrics to mitigate the imbalance.
Toward the end of the paper, the authors consider the merits of a ranking based on SSRN downloads rather than or in addition to law journal citation rankings. They correctly note that SSRN downloads:
- capture interest from readers beyond just those who might cite an article
- capture interest immediately, rather than waiting for a citing article to be published
- capture interest in interdisciplinary scholarship, not just law journals, thereby helping to mitigate database bias
- offer authors more control over the inclusion of their scholarship
Despite these advantages, a ranking based on SSRN downloads still presents roadblocks that would prevent it from becoming a truly representative metric of law faculty scholarly impact.
Copyright issues can prevent law scholars from posting their work to SSRN. Although law reviews usually allow posting of pre-prints on SSRN, journal publishers in non-law disciplines may not. Many book publishers are also reluctant to allow posting of content to SSRN. While an author might be able to post a table of contents or perhaps a chapter, that’s a limited indication of impact. And because posting requires an author to obtain copyright permission from the publisher, this may advantage law schools where librarians or other staff perform this function.
As many law librarians discovered with HeinOnline, there’s also the issue of author identification. Not every law scholar who publishes scholarship posts their work to SSRN. Again, law schools that have librarians to assist with posting may be advantaged. And faculty at the same institution who do post to SSRN may not all list their affiliations in the same way. For example, someone at our law school might list their affiliation as the University of Wisconsin Law School, someone else as the University of Wisconsin-Madison, and still others in different ways. This may be inadvertent or for good reason, such as a dual appointment with another department. It means that some faculty could be missed in an institutional ranking if their affiliation is listed differently. Again, schools with librarians to untangle these issues may be advantaged.
Although a ranking based on SSRN downloads may resolve some issues, there is simply no way to compile a truly representative metric of law faculty scholarly impact. However, if rankings like Sisk’s and others persist, which seems inevitable, then some combination of citations and downloads may be an improvement. Schools can use these rankings to tell their impact stories as they like.
However, I would strongly discourage U.S. News from viewing the inclusion of SSRN download data as a way to resurrect its proposed scholarly impact ranking. The stakes of a U.S. News-created metric that is not a true representation of scholarly impact are simply too high to accept. At present, every database (Hein, Westlaw, Lexis, Web of Science, Google Scholar) and every scholarly repository (SSRN, bePress) presents major concerns that would create unfair exclusions. As the authors illustrate, there may be ways to mitigate the damage for some authors and schools, but not for all.
I salute my law librarian colleagues Willey, Knapp, and Matthews for their continued scholarship on law scholarly impact rankings. As they note:
It’s important that we do everything we can to improve them by giving all groups a chance to rank well. Opening the door to currently excluded groups may also encourage members of those groups to contribute more scholarship, adding valuable contributions that we’re currently missing. We’ve been using similar ranking methods for years, waiting for society to change. We can stop waiting and address the problem now simply by adjusting our ranking systems.