In 1665, the Royal Society launched the first scientific journal, Philosophical Transactions, which helped establish the career of Isaac Newton, who published no fewer than seventeen papers there. For those of us in academia who have spent the greater part of our lives writing articles in the hope that they will appear in journals such as that one, it can provoke an existential crisis to realize that, after 350 years, it is all coming to an end, though we can all agree it has been a good run. The usefulness of the scientific paper is on the wane, partly a slow suicide by informational obesity.

Over the last several months, I have been reading David Weinberger’s overwhelmingly well-written book “Too Big to Know.” Now, the book isn’t that long, just 70,000 words, and I read at a decent pace, but after a chapter or two I often had to put it down. It was too emotionally challenging. The book’s full tagline includes “Rethinking Knowledge Now That the Facts Aren’t the Facts, Experts Are Everywhere and the Smartest Person in the Room is the Room.” A philosopher by training and a fellow of Harvard’s Center for Internet and Society, David has remarkable insight into the internet.

While the book covers many themes, the one closest to my heart is the production of knowledge, a task I’m ostensibly paid to do and, at times, actually achieve. I’m rewarded with a salary and a modicum of recognition for the science I serve, but increasingly I wonder about the efficacy and impact of what I do, of what we all do. We produce so many articles that the glut makes every effort, previous and following, less valuable in itself. On top of our shoulders, evolution has negotiated a remarkable tradeoff between energy requirements and our ability to understand. Using just 20 watts of power, our brains have built civilization to this point. People could read enough, learn enough, to become experts, to build towers, design vehicles and understand a field sufficiently to provide insight and advancement. This moment is rapidly passing. Our production of new knowledge, such as that contained in the scientific paper, is so vast and interconnected that no one can hope to obtain, much less absorb, anything but an increasingly arbitrary fraction of what is relevant to their inquiry. Most papers we produce become lost, like Roy Batty’s tears in the rain. Many of us resort to excessive specialization, becoming complete experts on minuscule topics that, set against the whole, amount to next to nothing.

Another consequence is that advancement requires the efforts of a media campaign, where a single insight or improvement over past practice takes scores of papers and intellectual allies to take hold. To become influential, a paper must rise through a swamped marketplace of ideas, and most inevitably sink. As scientific fields produce ever more articles, the number that any one of us can read remains static and our reading lists largely overlap. My own field of IO psychology, for example, appears to be stuck in the 1980s on many topics, waiting for the old guard to pass away so that the new can finally reach the top of the canopy. Meta-analysis will serve its purpose here.

We are increasingly running up against the limits of what 20 watts of chemical energy can do with gray matter, and the only way to move forward is to acknowledge our own intellectual limitations. Unable to cope with the information tsunami, with some estimates indicating that 90% of all data was created in the last two years, we need ways to summarize and condense what has been done. Models and simulations are one path, as updates can absorb and retain almost limitless complexity and the insights of entire fields. Another is automated or AI-infused algorithms that crunch “Big Data” to identify patterns and equations, though often at a level of inscrutability where we can see that something holds but not why. And, of course, meta-analysis.

Reading PDFs alone is not the path to unlocking and solidifying what we have learned. With meta-analysis, we escape the science theater in which our production of knowledge is technically meaningful to society but primarily serves the instrumental goal of raising an institutional ranking or securing tenure and promotion. The tail wags the dog as we conduct the appropriate research to get into an A-level journal rather than doing A-level research and then seeking an appropriate journal. For every meta-analysis we conduct, a hundred or more previous papers become redundant, but the information within them becomes tractable and we regain our original mandate: the societal contribution that was the source of most of our initial inspiration.
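
To make concrete how a meta-analysis turns a pile of individual findings into something tractable, here is a minimal sketch, using made-up correlations and sample sizes rather than real data, of fixed-effect, inverse-variance pooling via the Fisher z-transform. This is only an illustration of the basic arithmetic, one of several standard approaches, not a description of how HubMeta or any particular tool works.

```python
import math

# Hypothetical per-study inputs: Pearson correlations and sample sizes
# (illustrative numbers, not real data).
studies = [(0.21, 120), (0.35, 80), (0.18, 250), (0.40, 60)]

# Fisher z-transform each correlation; its sampling variance is 1 / (n - 3).
zs = [(0.5 * math.log((1 + r) / (1 - r)), 1.0 / (n - 3)) for r, n in studies]

# Fixed-effect (inverse-variance) pooling: weight each study by 1 / variance.
weights = [1.0 / v for _, v in zs]
z_bar = sum(w * z for (z, _), w in zip(zs, weights)) / sum(weights)

# Back-transform the pooled z to a correlation.
r_bar = (math.exp(2 * z_bar) - 1) / (math.exp(2 * z_bar) + 1)
print(f"Pooled correlation across {len(studies)} studies: {r_bar:.3f}")
```

A hundred scattered correlations reduce to one weighted estimate, which is the sense in which the underlying information becomes tractable even as the individual papers become redundant.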

In short, this is what HubMeta, along with other efforts like RoboReviewer, is seeking to do. We want to make meta-analyses faster, more incremental, more accurate and ultimately more influential. Part of this is improving the archaic software and processes that have served as our foundation for far too long. The other part is networking together larger teams to conduct them. With both, we can tackle the hellish backlog, because really every paper should already be coded into a database. And yes, your careers will flourish because of this too. The bigger your projects, the better the rewards, so think big.
