In 1945, Vannevar Bush, the man who led the nation’s scientific efforts during World War II, delivered a proposal to President Truman for funding scientific research in the post-war world. Titled Science, The Endless Frontier, it led to the formation of the NSF, NIH, DARPA and other agencies.
The payoff has been nothing short of astounding. The NSF has funded innovations such as barcode scanners and next-generation materials. NIH backed the Human Genome Project, as well as research that has led to many of our most important cures. And DARPA funded the research that created the Internet.
It’s an impressive record, but the future doesn’t look nearly as bright. Part of the problem is funding, which has fallen off in recent years. Yet the practice of science itself also needs updating. Much has changed in the 70 years since 1945. To honor Bush’s legacy—and maintain our technological leadership—we need to adapt his model to modern times.
The Bush Model
For most of history, scientists were men of means. Although a few of humble origins, like Gauss and Faraday, managed to slip through, it was mainly the wealthy who had the resources and leisure time to pursue serious inquiry. In the 20th century, the circle expanded to a small group of university researchers, but it remained an exclusive club.
World War II changed all that. As head of the OSRD, Vannevar Bush led government into the science business, funding enormous research projects that led to the development of proximity fuzes, radar and, most famously, the atomic bomb. After the war ended, the question arose of what form, if any, scientific funding should take in peacetime.
Bush noted that most research performed in industry and government was of an applied, rather than a theoretical, nature. He also argued that without vigorous funding for basic research to expand the frontiers of knowledge, advances in technical applications would be limited, endangering our national security, health and economic well-being.
So the architecture he envisioned would fund research at outside institutions, rather than within government or industry. Grants would be given out on a multi-year, rather than an annual basis, to provide stability, and research would be published widely to ensure dissemination of knowledge.
Bush’s architecture served us well and transformed the US into a scientific and technological superpower. However, to maintain supremacy, we need to innovate how we pursue discovery.
Fostering Broader And Deeper Collaboration
One assumption inherent in Bush’s proposal was that institutions would be at the center of scientific life. Scientists from disparate labs could read each other’s papers and meet at an occasional conference, but for the most part, they would be dependent on the network of researchers within their organization and those close by.
Sometimes the interplay between institutions had major, even historic, impacts, such as John von Neumann’s sponsorship of Alan Turing, but for the most part the work you did was a function of where you did it. The proximity of Watson, Crick, Rosalind Franklin and Maurice Wilkins, for example, played a major role in the discovery of the structure of DNA.
Yet today, digital technology is changing not only the speed and ease with which we communicate, but the very nature of how we are able to collaborate. When I spoke to Jonathan Adams, Chief Scientist at Digital Science, which develops and invests in software that makes science more efficient, he noted that a generational shift is underway:
When you talk to people like me, we’re established scientists who are still stuck in the old system of institutions and conferences. But the younger scientists are using technology to access networks and they do so on an ongoing, rather than a punctuated basis. Today, you don’t have to go to a conference or write a paper to exchange ideas.
Evidence would seem to bear this out. The prestigious journal Nature recently noted that the average scientific paper has four times as many authors as it did in the 1950s, when Bush’s career was at its height. Moreover, it has become common for co-authors to work at far-flung institutions. Scientific practice needs to adapt to this new reality.
There has been some progress in this area. The Internet, in fact, was created for the explicit purpose of scientific collaboration. Yet the way scientists report and share their findings remains much the same as it was a century ago.
Moving From Publications To Platforms For Discovery
One especially ripe area for innovation is publishing. Typically, a researcher with a new discovery waits six months to a year for the peer review process to run its course before the work can be published. Even then, many of the results are questionable at best. Nature recently reported that the overwhelming majority of studies can’t be replicated.
Many are calling this a “replication crisis.” Duncan Watts, a Principal Researcher at Microsoft Research, says, “Journals are heavily biased toward novel findings. There’s no incentive for a scientist to replicate another scientist’s work. It’s hard to get such work funded and it’s hard to publish it. It’s almost unheard of to build a career on checking other people’s work.”
A recent episode shows just how important this issue is. In 2010, Carmen Reinhart and Kenneth Rogoff published a working paper warning that US debt was approaching a critical level. As it turned out, they had made a simple Excel error, but if the issue hadn’t been so politically potent, the mistake might never have been found.
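The error was mechanical rather than conceptual: a spreadsheet range that silently omitted rows from an average. A minimal sketch, using made-up numbers rather than the actual dataset, shows how easily such an omission skews a summary statistic without raising any alarm:

```python
# Hypothetical growth rates by country (illustrative values only,
# not the Reinhart-Rogoff data).
growth_by_country = {
    "A": 2.2, "B": 4.1, "C": 1.5, "D": 3.8, "E": -0.1,
}

def mean(values):
    """Plain arithmetic mean, like a spreadsheet AVERAGE() formula."""
    return sum(values) / len(values)

all_rows = list(growth_by_country.values())
truncated = all_rows[:3]  # a mis-dragged cell range silently drops the last rows

full_avg = mean(all_rows)   # what the analysis intended to compute
bad_avg = mean(truncated)   # what the broken range actually computes

print(round(full_avg, 2), round(bad_avg, 2))
```

The spreadsheet reports a plausible-looking number either way, which is exactly why such errors survive until someone re-runs the calculation from the raw data.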
Clearly, we need to go beyond publishing papers with charts and tables and move to platforms that incorporate open data as well as negative results. As Jonathan Adams of Digital Science puts it, “We need to move to a rolling process that doesn’t start and end with publication and includes negative, as well as positive, findings.”
We’re starting to see some innovation in this area. arXiv, a project at Cornell University, allows scientists to self-publish with post-moderation. Macmillan, a top publisher, recently announced an interesting new initiative, and tools like the IPython Notebook are specifically designed to foster better collaboration. Yet there is still much work to be done.
The Politics Of Scientific Investment
When Bush wrote his famous proposal for public financing of scientific research, we had just defeated Hitler and his allies with help from scientific miracles like radar and the atomic bomb. That created enormous support for the public funding of science to enhance the private sector as well as national defense. Times, unfortunately, have clearly changed.
Today, there is a veritable war on science, with politicians quick to score cheap political points by accusing researchers of waste or simply declaring that science is irrelevant to everyday concerns. Ed Lazowska, who co-chaired the President’s Information Technology Advisory Committee under George W. Bush, sees this as a fundamental misunderstanding of how innovation takes place.
“There’s a big misunderstanding of today’s industry R&D: most of it is ‘D’ rather than ‘R,’” he told me, pointing to Google X as an example. Although Google has taken on major business risks with projects like autonomous cars, the underlying basic research was funded by the federal government through agencies like the NSF and DARPA, so the technological risk is relatively low.
He also points to the principle of appropriability as a foundation for thinking about scientific funding. Enterprises understandably have a bias for investments from which they will benefit directly. That’s why public funding is the most viable source of support for basic research, which leads to applications that are not only broad, but often unforeseeable.
Lazowska notes further that market leaders may have an interest in investing in scientific inquiry that benefits their industry broadly and highlights Microsoft and IBM as two firms that invest in basic research and publish openly. Yet they are the exception, not the rule.
Promoting Exploration And Discovery
When Vannevar Bush created the OSRD, the agency which oversaw wartime research, the funding mechanism he created was innovative, even revolutionary. Rather than have government bureaucrats directly supervise scientists, it issued grants to research institutions for specific projects. It was a system based on responsibility, not subservience.
In his proposal for a new agency (which came to be the NSF), he was so adamant that scientists themselves should approve funding that the dispute held up the legislation for several years. In the interim, the military took over much of the funding for scientific research.
Today, the military dominates the US government’s research budget, accounting for slightly more than half of the total; the NSF, by comparison, accounts for only 4%. That has led to exactly the situation Bush feared: public scientific funding focused excessively on developing applications rather than on expanding frontiers.
The situation represents a serious threat to our national well being. As Lazowska stresses, “Market leaders are likely to derive the most benefit from expanding the frontiers of knowledge.” The US, being the world’s undisputed technological leader, has the most to gain from new discoveries and the most to lose from a neglect of basic scientific research.
In many ways, the issue of scientific funding mirrors the current controversy over auditing the Federal Reserve Bank: in both cases, the proper role of politicians is to reflect the public will, not to meddle in the technical work of specialists.
How To Move Forward
Science is a necessarily conservative enterprise. The scientific establishment, while slow and plodding, also serves a purpose—to protect research from corruption by a cadre of propagandists, kooks and corporate rent-seekers who would exploit its integrity to serve their own purposes.
So while we do need to transform many of the practices that got us to where we are now—such as how scientists collaborate and make their discoveries known to the world—we still need to stay true to Bush’s vision: the funding of basic scientific research to provide the “seed corn” from which exciting new technologies grow.
As Steve Strogatz puts it, “When you do something transformative, it usually comes out of left field.” So we need to look beyond the mere applications of science—the end products that make modern life possible—and learn to value the wonder of discovery once again. It is by expanding frontiers that we better our everyday lives.
Most of all, we need to accept that we all have a stake in the public funding of science. It is, after all, government funding that made the iPhone possible, has led to miracle cures and blockbuster drugs and decoded the human genome. We’d all be poorer without it.
This article originally appeared at DigitalTonto.
Follow Greg on Twitter @Digitaltonto