When I reflect on the progress of biomedical science in the past century, I think of scientists as generations of hunter-gatherers. The first two decades of the 20th century were dominated by the microbe hunters. They found the microbes responsible for the dreaded scourges of tuberculosis, cholera, diphtheria. But there remained terrible diseases for which no microbe could be found: scurvy, pellagra, rickets. These diseases proved to be due to the absence of trace substances in the diet, called vitamins. And so in the 1920s and 1930s, the microbe hunters were succeeded by a generation of vitamin hunters.
By the 1940s, most of the vitamins had been discovered and nutritional science was in its twilight. The big question was: "What do the vitamins do?" In my own career, I bridged the transition from vitamins to enzymes. In the 1940s and 1950s, we enzyme hunters occupied centre stage, showing how the vitamins attached to enzymes enabled them to perform the vital metabolic functions essential for growth and reproduction.
Enzyme hunters have been replaced by gene hunters, the genetic engineers and biotechnologists. Biotechnology, with its fabulous contributions to medicine, agriculture and basic science, has become a multibillion-dollar industry. The current age of gene hunting is undeniably the most golden in the history of biomedical science. The term "revolutionary" is often overused; not here. The effects of this advance on medicine, agriculture, industry, and basic science have not been exaggerated.
Yet, just as revolutionary, and almost unnoticed, is a development that lacks a name, has no obvious applications, but will surely lead to equally remarkable and unanticipated practical applications: I refer to the coalescence, the confluence and the merging of the numerous basic biologic and medical sciences into a single, unified discipline that has emerged because it is expressed in a single universal language, the molecular language of chemistry.
Much of life can already be understood in rational terms if expressed in the language of chemistry -- an international language, a language without dialects, a language for all of time, and a language that explains where we came from, what we are, and where the physical world will allow us to go. Chemical and molecular language has great esthetic beauty and links the physical sciences to the biological and medical sciences.
In biomedical science today, we have a phenomenal capacity to acquire and integrate unprecedented quantities of sophisticated data. But the most crucial element in progress will always rest with the individual scientist. Equally crucial is the scientist's institutional environment: up-to-date laboratory resources, and kindred scientists nearby with whom to share ideas, techniques and reagents. It is essential for the institution, the community and the nation at large to assure scientists of a long-term commitment to foster first-class science. That means we must protect basic research in view of the growing connections between academia and industry.
With regard to medical research, the best plan over many decades has been no plan at all. The breakthroughs of recombinant DNA and genetic engineering, based on the discoveries of enzymes that make, break and seal DNA, were made in academic laboratories built and supported almost entirely by funds from the public sector. Support by industry and philanthropy can be helpful and catalytic, but cannot be sustained and substantial enough.
For 30 years, my research on the biosynthesis of the building blocks of nucleic acids and their assembly in DNA replication, along with the training of more than 100 young scientists, was funded with many millions of dollars from the U.S. National Institutes of Health without any promise or expectation that this research would lead to marketable products or procedures. No industrial organization had, or would ever have, the resources or disposition to invest in such long-range, apparently impractical programs. We carried out these studies to satisfy a need to understand the basic processes in cellular function. Yet to my great pleasure, such studies of the replication, repair and rearrangements of DNA have had many practical benefits.
The enzyme pathways of assembling DNA from its building blocks have provided the targets for the design of most drugs used today in the chemotherapy of cancer, AIDS, herpes and autoimmune diseases. These studies are also crucial to understanding the repair of DNA, so important in the aging process, and for understanding mutations and the origin of some cancers.
It may seem unreasonable and impractical, call it counterintuitive, that we can solve an urgent problem such as a disease by pursuing apparently unrelated questions in basic biology, chemistry, or physics. Yet, the pursuit of understanding the basic facts of nature has proven throughout the history of medical science to be the most practical, the most cost-effective route to successful drugs and devices.
Investigations that seemed totally irrelevant to any practical objective have yielded most of the major discoveries of medicine: X-rays, MRI, penicillin, polio vaccine. In the biochemistry department at Stanford, where recombinant DNA was discovered in 1972, we never anticipated the awesome biotechnologies of automated genome sequencing or computer-based bioinformatics. The discoveries on which these technologies were developed came from the pursuit of basic questions in physics, chemistry and biology, unrelated at the outset to a specific medical or practical problem.
Fax machines were invented 50 years ago, but it took a deteriorated postal service, among other factors, to make them the necessities they are today. I was surprised to learn that in the patent application for lasers in 1949, the only use the inventors claimed for the laser was erasing errors in typing. There are similar examples from agriculture and industry. They contradict the common saying: "Necessity is the mother of invention." In fact, time and again, invention is the mother of our necessities.
There are many positive features of academic scientists and engineers participating as consultants and even as partners in biotech ventures. Companies provide a conduit for the vast knowledge generated in academia. The growing industry in turn provides more jobs to employ academic graduates and produce devices at lower prices.
But companies are not in business to do research and acquire knowledge for its own sake. Rather, they are in research to turn a profit. They possess neither the mandate nor the tradition to advance scholarship. Biotechnology companies must, instead, prove their profitability in the ebb and flow of financial markets and focus on short-term goals. Litigation in biotechnology has itself become a significant industry.
We cannot let the money-changers dominate our temples of science. Rather, let us understand the nature of the creative process and provide for its support. No matter how counterintuitive it may seem, basic research remains the lifeline of practical advances in medicine; pioneering inventions are the source of industrial strength. The future is not predicted; it is invented.
Nobel laureate Dr. Arthur Kornberg is professor emeritus at Stanford University's school of medicine. This commentary is excerpted from his presentation at the Gairdner Awards, which recognize medical scientists whose work has significantly improved the quality of human life. Established in 1957, Gairdners have been awarded to 256 scientists, of whom 54 (including Dr. Kornberg) have also won the Nobel Prize.
The views expressed are those of the author and not necessarily those of CAUT.