People generally view progress in science and technology with awe. But as I discussed in my last column, much more progress could be made if research itself were guided by scientific principles. This week, I will build on that idea by looking at how policymakers can allocate resources more effectively if they resist the expectation gap that naturally widens with each new discovery.
Every scientific insight opens the floodgates to another, larger set of questions to explore. Over time, the number of unanswered questions increasingly outpaces the resources available to study them. But that doesn’t stop scientists from trying, as the promise of new horizons opens up before them. Making a conscious effort to set aside the expectation that we can unravel all of the world’s mysteries will lead to more sensible research decisions and funnel resources toward the most relevant questions.
Astronomically Large Numbers Are Difficult to Grasp
About 40 years ago, a rather popular documentary illustrated the scale of the cosmos. It began with a couple picnicking in the park before slowly zooming out to reveal the entire visible universe. It then steadily zoomed back in, down to the level of subatomic quarks. Earthlings who watched the film were struck by the mind-warping emptiness of space, which was nothing like their day-to-day experiences. If a similar documentary were made about the size of numbers, I imagine it would give us some feel for how otherworldly very large numbers are to the human brain.
The languages of hunter-gatherers typically include words for “one,” “two” and “three,” with a single word for “many” covering all higher numbers. The agricultural revolution dramatically expanded this vocabulary as farmers found ways to keep track of the size of their crops and the number of animals in their herds. People began using the phalanges of their fingers as counting aids, a practice that became the basis of the sexagesimal system (a numerical system with a base of 60). This system, passed down from the ancient Sumerians, is still used today to measure time and angles. Meanwhile, the earliest hieroglyphs documented a method of counting that used whole fingers rather than individual phalanges, allowing for a numerical system with a base of 10 that eventually developed into today’s decimal notation and, much later, the metric system.
Over time, these agrarian cultures developed an understanding of numbers that was sophisticated enough to forge famous legends based on mathematics. The story of the man who invented chess comes to mind. An ancient king asked the inventor what reward would be fitting for such a marvelous invention. After some prodding, the inventor said he wanted one grain of wheat for the first square of the chessboard, two for the next square, and so on, doubling the number of grains for each square until all of the squares were filled. The king readily accepted such a seemingly low price, thus displaying man’s innate lack of understanding of the very large numbers this geometric progression would produce by the 64th square. (The grains of wheat would pile as high as Mount Everest, equaling global annual wheat production a thousand times over.)
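For readers who want to check the legend’s arithmetic, here is a minimal sketch. The per-grain weight of about 0.05 grams and the figure of roughly 750 million tonnes of wheat harvested worldwide each year are rough assumptions used only for illustration:

    # Total grains promised on a 64-square board, doubling from one grain on the first square.
    total_grains = sum(2**square for square in range(64))  # equals 2**64 - 1

    # Rough, illustrative assumptions: ~0.05 g per wheat grain and
    # ~750 million tonnes of wheat harvested worldwide per year.
    grams_per_grain = 0.05
    world_harvest_tonnes = 750e6

    total_tonnes = total_grains * grams_per_grain / 1e6  # grams to tonnes
    print(f"Grains promised: {total_grains:.2e}")         # ~1.84e19 grains
    print(f"Total weight: {total_tonnes:.2e} tonnes")     # ~9.2e11 tonnes
    print(f"Years of world harvest: {total_tonnes / world_harvest_tonnes:.0f}")  # ~1,200

Even with generous rounding, the promised reward works out to roughly a thousand years of today’s global harvest, which is exactly the point the legend is meant to drive home.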
Despite these age-old lessons, large numbers continue to catch today’s key opinion leaders and decision-makers just as off guard as this legendary king. Let me give a few examples.
Mechanical Physics
A simple example is the revolving door. Germany issued the world’s first patent for a revolving door in 1881. Since then, the mechanical structure of revolving doors has remained relatively unchanged, and only a few variants are available on the market. But a few years ago, a Dutch designer used a combinatorial method — a mathematical approach that systematically enumerates the possible combinations of design choices in search of an optimal one — to create a new design that would allow people pushing shopping carts or using wheelchairs to pass straight through. By combining the different options for just seven of the defining elements that make up a revolving door, the designer generated more than 10,000 possible designs. Of those, he turned more than 200 into two-dimensional and three-dimensional animated models. From there, 30 of the designs proved to be patentable inventions.
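To get a feel for how quickly such a design space grows, consider a toy version of the exercise below. The element names and the number of options per element are invented for illustration; the account above says only that seven defining elements were varied and that more than 10,000 designs resulted:

    from itertools import product

    # Hypothetical defining elements of a revolving door with made-up options;
    # only the multiplication principle matters, not the specific values.
    options = {
        "number_of_wings": ["2", "3", "4", "6"],
        "drum_geometry":   ["round", "oval", "square", "open"],
        "entry_layout":    ["single", "double", "offset", "pass-through"],
        "drive_mechanism": ["manual", "motorized", "hybrid", "free-wheeling"],
        "wing_material":   ["glass", "metal", "composite", "mixed"],
        "collapse_mode":   ["fixed", "book-fold", "pivot-out", "retract"],
        "speed_control":   ["constant", "sensor-based", "user-set"],
    }

    designs = list(product(*options.values()))
    print(len(designs))  # 4 * 4 * 4 * 4 * 4 * 4 * 3 = 12,288 candidate designs

A handful of choices per element is already enough to cross the 10,000 mark, and every additional element multiplies the total again.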
This case raises a number of interesting points. First, the limited amount of manpower allocated for the study made it necessary to reduce the thousands of possibilities under consideration to just a few hundred that could be effectively analyzed. But it was not the systematic application of scientific methods designed for the problem at hand that steered this drastic pruning; instead, it was the designer’s own insight and experience. Some call this “the art of design,” but it is difficult not to note the lack of evidence behind it.
Another question to consider is: What if the study had expanded its scope to include options for more than just seven of the revolving door’s elements, potentially including the choice of materials, the design of the hinges, the roller bearings and flaps, and so on? The resources required to analyze what originally seemed to be a relatively simple device would be mind-boggling. Wouldn’t it then be useful to learn, understand and either validate or invalidate the process the designer used to cut his initial 10,000 options down to 200 before trying to narrow down an even larger set of possibilities?
As you can imagine, the problem only grows with more complex machines such as aircraft. From the Wright brothers’ early wood-and-fabric gliders to last year’s hollow, arrow-shaped WU-14 hypersonic glide vehicle, aviation has covered a broad range of applications and has reached a high level of industrial maturity. Aircraft designers between the two world wars and after the surge of composite materials in the 1970s generated the most exotic-looking configurations. And yet modern commercial jets — the mainstay of the aviation industry — all seem to have converged on roughly the same configuration, despite numerous areas for improvement. Why?
The development of the Boeing 777 — the first of the modern commercial jet models to be fully assembled in virtual space prior to construction — tells a tale that is remarkably similar to that of the revolving door. The three-dimensional database used to build the model contained more than 3 million components. Combining even a portion of those components with different options for materials, development costs and risk and maintenance considerations would create a staggering number of potential designs that only a handful of corporations would have the resources to sift through.
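A back-of-the-envelope calculation shows why. The numbers below (100 components under review, three candidate materials each) are arbitrary assumptions rather than figures from the 777 program:

    # Even a tiny slice of a 3-million-part model explodes combinatorially.
    # Assume, purely for illustration, that only 100 components are up for
    # review and each has just 3 candidate materials.
    components_under_review = 100
    options_per_component = 3

    design_space = options_per_component ** components_under_review
    print(f"{design_space:.2e}")  # about 5.2e47 distinct configurations

Exhaustive evaluation is out of the question at that scale; as with the revolving door, the field has to be pruned drastically before anything is ever modeled.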
Helicopters pose a similar problem, though their designers face the added challenge of serving a market that is only a fraction of the size of the broader aviation market, which increases the competitive pressure. The few companies that can manage the additional intricacies of rotary-wing flight often patent several variants as quickly as possible to block competitors from developing them first. Meanwhile, open-source activists try to get ahead of corporations by making any hint of innovation public in the hope of establishing sufficient prior art, thus obstructing any path to a patent and effectively curtailing any prospect for commercial success. Both tactics cut off avenues for possible improvement in helicopter design.
The Human Genome
Similar flaws exist in genomic research. Thirty years ago, the Holy Grail of genetics — a complete map of the human genome — seemed within reach, and scientists heralded it as the next giant leap for mankind. Some 20 years and $3 billion later, two teams, one public and one private, declared the mission accomplished. And yet the expected revolution in diagnosis, prevention and treatment of most human diseases has not materialized.
The reasoning used to motivate the project — and its funding — was that DNA forms the biological blueprint of life, and that being able to “read” it would unlock the mysteries of how living beings work. Even within this simplistic model, the sheer amount of data produced by sequencing DNA pushed the limits of bioinformatics and stretched resources to the brink. The rate at which scientists are sequencing genomes is more than doubling each year, and the data waiting to be processed is overrunning even the remarkable increases in computing performance.
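A rough projection makes the mismatch concrete. The starting volumes and growth rates below are illustrative assumptions: sequence data doubling every year, in line with the trend described above, against computing performance doubling roughly every two years:

    # Illustrative only: data that doubles every year versus computing capacity
    # that doubles every two years, both starting from the same baseline.
    data = 1.0      # arbitrary starting units of sequence data
    compute = 1.0   # arbitrary starting units of processing capacity

    for year in range(1, 11):
        data *= 2            # assumed: sequencing output doubles each year
        compute *= 2 ** 0.5  # assumed: computing capacity doubles every two years
        print(f"year {year:2d}: unprocessed backlog grows {data / compute:5.1f}x")

Under these assumptions the gap widens by a factor of about 32 within a decade, no matter how impressive the gains in raw computing power are.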
Scientists have now acknowledged that even though we know the human genetic code, we don’t really comprehend how it works. Plenty of bits and pieces of the puzzle have become clear, but we are only now beginning to realize the immense amount of work ahead of us in our quest for a deeper understanding of the complex interactions between DNA and the cellular layers that make up a living being. Again, this represents the problem of big numbers: Scientists would need to explore the thousands of different proteins that can be produced as gene transcripts are spliced and reassembled in different combinations, as well as the myriad factors that can affect genetic expression, including the conditions of the womb, stress at birth and the development of individual microbiomes on the skin and in the gastrointestinal tract. Combine these variants with the alternatives that arise in experimentation and the testing of new therapies, and we reach an astronomical number of options that are too unwieldy to test one by one.
An explanation for each human disease and disorder exists somewhere within this vast array of data. We now know more about some diseases with simpler mechanisms, such as Huntington’s chorea, but a silver-bullet cure for common chronic illnesses such as diabetes, cardiovascular disease and most cancers remains elusive. At the time of the Human Genome Project’s inception, enough was known to anticipate the staggering number of possibilities that scientists would encounter and to adopt more modest and realistic goals accordingly. Had policymakers done so, perhaps they would have devised an approach that was more productive than a high-profile race between public and private institutions. Given all that we don’t know and likely will not know for some time, it is discouraging to see occasional articles today triumphantly announce the discovery of “the gene” for obesity or Alzheimer’s when a cure likely remains further from our reach than we realize.
Finding a Boulder of the Right Size
Decision-makers dealing with all manner of societal problems are naturally floored by the complexity of most of the issues before them, whether they are aware of it or not. Science policymaking is no exception to this rule, but the nature of science itself would seem to make it more amenable to objective and scientific assessment. By being aware of our limits in grasping the immense numbers that inevitably crop up in science policy decisions, we should be able to at least partially protect ourselves from choices based on unrealistic expectations, which lead to waste and become hard to defend. The image of the Mount Everest-sized pile of grain on the chessboard offers some inspiration: We must choose a boulder that is sized just right, allowing us to push it to realistic heights.
“Rolling a Boulder Up Mount Everest” is republished with permission of Stratfor.