Editor’s Note: Today’s post is by Sarah Andrus, a Publisher of Science and Medicine journals at Oxford University Press.
Publishers are often accused of being too slow to innovate, but it’s hard to blame them when the discrepancy between what their core audience says it wants and what it actually wants in practice is so perplexing. One might say that publishing culture echoes the conservatism of academic culture, and very rarely does the former significantly influence the latter. It may seem paradoxical that the research community, which includes scientists working on some of the most cutting-edge problems of today, would be resistant to change. Generally speaking, though, the resistance is not to change itself, but to anything seen as gratuitous disruption to the publishing and communication workflow, which to many scientists is merely a necessary corollary to the core research and discovery activities that really matter.
Consider the research article. The Internet has undoubtedly revolutionized how we access and publish scholarly information, but it has had surprisingly little impact on the underlying form in which we consume it. The formal structure of the research article has resisted any fundamental change for centuries, despite a number of ambitious attempts to boost adoption of technologies that take real advantage of the dynamic capabilities of the digital era. It’s hard to disagree with some of the futurists’ arguments here: With all the incredible tools available to recreate the modern article as an interactive, living entity, why is the PDF — a digital facsimile of print — still the dominant format? Why is it necessary for researchers to spend so much time reading and writing lengthy, austere papers when a far more streamlined, visual approach could relay all of the relevant background and findings in a much more impactful way? What might be possible if it were much easier to publish and discover negative results, data sets, code, and other research outputs that have no inherent need to be tied to a formal article?
If the PDF is such a clunky, obsolete relic of the print era, then why won’t it die? And does it need to die in order for exciting new models to gain any significant traction?
Furthermore, what can we agree on as the fundamental purpose of the research article, and where does the “form follows function” approach lead us in terms of practical innovation?
A fairly recent article in The Atlantic asked, “What would you get if you designed the scientific paper from scratch today?” The clear premise is that the traditional scientific article is hopelessly obsolete, given that today’s research is increasingly dependent on computational methods and data visualizations that are difficult or impossible to convey in a static article. The proposed solution is the widespread adoption of “computational notebooks” as the new standard for sharing research outputs, using sophisticated software like Wolfram’s Mathematica and the open-source Jupyter (formerly known as IPython) to create dynamic representations of complex models that come alive on the screen. Not surprisingly, the early adopters of these notebooks are mainly computer scientists, mathematicians, physicists, and others whose research involves large amounts of data to model, but proponents argue that arts and humanities scholars can equally benefit from the freedom of expression that is made possible. It’s an alluring idea, and it is tempting to conclude that it is so obviously the future of publishing that we may be surprised this future has not yet arrived. And yet the author concedes that “It’ll be some time before computational notebooks replace PDFs in scientific journals, because that would mean changing the incentive structure of science itself.” That stubborn incentive structure again — this is why we can’t have nice things!
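To make the notebook concept concrete, here is a minimal, hypothetical sketch of what a single cell of a computational notebook might contain (the dataset and figures are invented for illustration, not drawn from any real study). The point is that the narrative, the analysis code, and its live output travel together in one document, so a reader can change the inputs and re-run the analysis rather than taking a static table on faith:

```python
# Illustrative notebook cell (hypothetical example): the prose above
# the cell would explain the experiment; the code below carries the
# actual analysis, and its output renders inline in the notebook.

import statistics

# Toy dataset standing in for an experiment's repeated measurements.
measurements = [4.8, 5.1, 5.0, 4.9, 5.3, 5.2, 4.7, 5.0]

# Summary statistics computed live from the data, not pasted in by hand.
mean = statistics.mean(measurements)
stdev = statistics.stdev(measurements)

# In a real notebook this could just as easily be an interactive plot;
# here a plain summary stands in for the inline output.
print(f"mean = {mean:.2f}, stdev = {stdev:.2f}")
```

Because the numbers are recomputed on every run, editing one value in `measurements` immediately updates the reported result — the property that makes notebooks attractive as a publication format, and also what makes them hard to flatten into a static PDF.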
A lesser-known startup called Claimspace, founded by a former Twitter engineer, adopts a similar philosophy to smart notebooks but bears even less resemblance to the traditional research article. With the highly ambitious mission “to maintain a unified map of human understanding at its best, in real time,” Claimspace invites anyone to contribute to threads of knowledge on any topic, using basic lines of code to create a logical line of reasoning based on (and linking to) fully attributed and peer-reviewed facts and studies. In its openness and inclusiveness of authorship it may more closely resemble Wikipedia than a journal, but it is unique in its simple presentation of knowledge as a logical progression of verified facts, eschewing the narrative/discussion aspect altogether. This is not to suggest that a detailed discussion of findings is not often critical to understanding; but one might imagine such a “logic map” as a layer on top of a more detailed article (whatever form that may take), allowing readers to easily explore a field and the individual findings and assumptions upon which it is structured.
Returning to the traditional publishing world, a famous example of a much-hyped “new model of journal publishing” was Elsevier’s Article of the Future, which debuted in 2009 and was met with a lackluster response; users were largely unimpressed by what amounted to a jazzed-up new look with very little real innovation attached. Kent Anderson observed in an earlier post on this blog that “The problem with the premise…was that it focused on how an article written for print could be presented online, rather than taking the essence of the communication itself and shifting it from a print environment to a networked, digital environment.” So while companies outside the scholarly publishing industry are coming up with “article” models that are not tied to the legacy of print or any kind of traditional journal-based format (and therefore, in the current environment, have little hope of rapid, widespread uptake), Elsevier — a large “traditional” publisher that is perfectly capable of innovation — stopped short of truly rethinking the journal article. Perhaps this was because they knew that a complete overhaul would likely delight some tech-savvy enthusiasts but alienate a large core audience of researchers who may be very comfortable with technology and progress, yet have little or no incentive to change their publication workflow when the current system, for all its imperfections, “just works.”
There are probably many concepts for new publishing models out there, at various stages of ideation and development — some undoubtedly more realistic than others — but a common characteristic seems to be a real sense of mission to transform forever how people share and interact with the outputs of knowledge and discovery, never again to return to the inferior ways of the past and present. It is worth reflecting, however, on how this approach (“I’ve created something demonstrably superior to the current model, and therefore the only rational outcome is for everyone to immediately discard the old and embrace the new”) has fizzled time and again, across industries and technologies. Computational notebooks are a worthy attempt at reinventing the form of scholarly communication to be more faithful to its function — assuming that function is to share ideas, methodologies, results, and relevant supplementary data at an appropriate level of detail and complexity for a given field — but the technology is possibly impaired by its own cleverness. Whether you’re Wolfram or Elsevier, to announce that some flashy new format is “the future” is to almost certainly guarantee disappointment, irrespective of its real merits.
Brilliant new ideas for scholarly communication should absolutely have a chance to shake things up and bring publishing closer to the core research and discovery workflow. But it feels counterproductive to view the traditional model as if its continued existence precluded the successful introduction of something new.
There is no evangelizing here. I would simply argue that it should be in the mutual best interest of researchers and publishers to be open to new forms of output that do not needlessly restrict the types of information and visualization that can be shared, keeping barriers to adoption (ease of use, access to tools and programs, time commitment) as low as possible. Behavioral differences between disciplines aside, researchers will always have diverse preferences on how to read and write scientific literature, and many will always opt for the simplest solution no matter what. The real argument, then, is not about when and how traditional formats like the PDF will be replaced; it’s about accepting that the familiar (and perhaps boring) research article still has its purpose, while at the same time thinking ambitiously and creatively about how the humble document can be supplemented with the modern features and functions that the digital environment offers.
I firmly believe — in fact I really hope — that the traditional, somewhat restrictive article publication model must at some point give way to something that is more intuitively suited to the way research is really done. Likely there will not just be one new model, but several, each tailored to the needs of its core scholarly audience. But if we as publishers assume that such a shift is dependent on a universal rejection of the traditional model, we are likely to be blinded to the development that is most likely to occur: that someone or some company (and it almost certainly won’t be one of the major publishers) will create something that not only challenges the core assumption of what a journal or an article should look like, but feels like a natural enough part of the research workflow to make extensive adoption possible; and that this new something will exist in the same world as the PDF article for some time before a full replacement seems remotely imminent. The best thing that publishers can do, rather than treating every innovation of this type with excessive skepticism and unspoken fear of losing control of the medium, is to prepare strategically for a scenario in which the format of the journal article is completely different, with a focus on establishing the future role of the publisher with respect to peer review, editorial management, ethics oversight, archiving and protecting records, and many other activities that are no less important (and perhaps more so) in the context of less centralized, highly networked modes of publication.
Kent is no doubt right when he writes in his Scholarly Kitchen post on the Article of the Future: “The thing that should scare Elsevier and every other traditional publisher is that they are not the ones doing those experiments [on the possibilities of the Article 2.0]. These experiments are being created elsewhere.” But perhaps rather than fearing outside innovation, we can embrace new models that benefit researchers and adapt our role accordingly without compromising our sustainability. Much easier said than done, but in an industry where change comes slowly there’s no excuse for being caught unprepared.