Innovation’s Feet of Clay

Recent studies show that technological advancement is becoming harder to achieve

Constant technological improvement is the fixed idea of the modern world, spoken of as a self-evident truth or an axiom, in what has become the closest thing to religious revelation that many people get to experience. Not only is technology improving but, in many ways, it is also accelerating, leading to concepts such as Ray Kurzweil’s Singularity “within our lifetimes”, when computers become smarter than people and technological development becomes unmoored from human inspiration or effort. This might well be the case and, in truth, ideas to the contrary have been proven wrong ever since the apocryphal episode in which Charles Duell, Commissioner of the US Patent Office, supposedly suggested its closure in 1899 on the grounds that everything had already been invented. However, evidence is mounting that the rate of true scientific and technological advancement has only been maintained through the allocation of ever greater resources, both financial and human. 

Celebrity economist Tyler Cowen expressed misgivings about the “illusion of permanence” of technological development (as a subset of economic growth) in his pessimistic book “The Great Stagnation: How America Ate All the Low-Hanging Fruit of Modern History, Got Sick, and Will (Eventually) Feel Better”. Through its very title, he also hinted at a possible source of the anticipated slowdown – the theory that the marginal cost of technological development keeps increasing once the various “low-hanging fruit” have been plucked by past generations. 

Regardless of the eventual truth of the matter, it is undeniable that, in comparison with past ages, the gentleman scientist and inventor changing the world from his laboratory or workshop has been replaced by large and well-funded collectives of scientists who spend more and more of their lives trying to stay up to date in their fields or to specialize down narrow paths. Ultimately, the persistence of the idea of uninterrupted technological development as a force of nature, independent of external circumstances short of annihilation, is based not only on the experience of recent decades, but also on an innate pessimism regarding human nature. This pessimism views technology, with its potential scale and its cold rationality, as a more likely benefactor of mankind than fractious political forces, social and economic systems prone to abuse or dysfunction, or some innate striving of man veering uncomfortably close to the “will to power”. It is also the latest adaptation of the centuries-old “Whig theory of history”, whereby, despite occasional stumbles, the course of humanity is always heading towards progress, justice, freedom and development. 

 

Background growth 

When people look to China in admiration for its astounding rate of growth, they tend to forget that it is catch-up growth, fueled not only by the existence of ready-made capital markets and export markets, but also by over a hundred years of accumulated technology and managerial techniques, accessible to it as never before in history (let us remember that the secrets of silk and porcelain had to be smuggled out of China). For much of its history, the US, the most recent engine of growth and innovation, has grown at a steady yearly pace. Over that period, the number of researchers and the funding allocated to R&D have also increased, suggesting a continuous decline in the productivity with which new ideas are produced. The technological distance between 1900 and 1960 in the West is a lot larger than that between 1960 and 2010 or 2016. Incremental innovation has taken place and new fields have bloomed, like personal computing, but the basics of the car, or of a kitchen, have remained the same and would be recognizable to a man from the urban US of the 1960s in a way that the world of the 1960s would not have been to a man from 1900. 

The US and all technological “prime movers” must also accept the cost of evolutionary dead ends, such as the “War of the Currents” between Edison and Tesla, which saw two different types of electricity infrastructure being built. They must also absorb the cost of not having amortized the investment in the wide dissemination of previous technologies by the time a new technology renders them obsolete and demands new infrastructure. Where the principal economic actors are not forced to swallow the losses, or do not expect to profit as much from the new development, they will delay its introduction as long as possible, until they have fully amortized their prior investments or even until the profitability of their often captive markets has fallen. This is how the US, the country that created the Internet as we know it today and is responsible for much of its advancement, has only modest download speeds and high costs, attributable to the persistence of prior infrastructure based on copper wiring. Countries that came late to the revolution could skip directly to fiber optics, helped by population densities higher and distances shorter than those of the US. Only now, with disruptive business models like Google Fiber and the involvement of state authorities, have the telecommunications dinosaurs improved their products. 

As China moves up the value chain and reaches the technological frontier, it will find its growth rate slowing as it has to bear more of the costs of new developments. This is why industrial espionage is a reality even, and especially, between developed countries like the US, Israel and France, which do not lack native idea-production centers. 

Less for more 

Charles Jones of Stanford argued for a dynamic of diminishing returns, obfuscated by concurrent increases in resource allocation, in a paper titled “R&D-Based Models of Economic Growth”[1]. The paper’s abstract states that: 

This paper argues that the “scale effects” prediction of many recent R&D based models of growth is inconsistent with the time-series evidence from industrialized economies… Although growth in the extended model is generated endogenously through R&D, the long-run growth rate depends only on parameters that are usually taken to be exogenous.  

Recently, he returned to the subject in “Are Ideas Getting Harder to Find?”, written alongside Bloom, Van Reenen and Webb[2]. That paper is the principal subject of this article; its abstract reads: 

In many growth models, economic growth arises from people creating ideas, and the long-run growth rate is the product of two terms: the effective number of researchers and the research productivity of these people. We present a wide range of evidence from various industries, products, and firms showing that research effort is rising substantially while research productivity is declining sharply. A good example is Moore’s Law. The number of researchers required today to achieve the famous doubling every two years of the density of computer chips is more than 25 times larger than the number required in the early 1970s. Across a broad range of cases and levels of disaggregation, we find that ideas — and in particular the exponential growth they imply — are getting harder and harder to find. Exponential growth results from the large increases in research effort that offset its declining productivity. 

Economic growth is thus a function of the resources invested in R&D (simplified, possibly fatally for the validity of the paper’s argument, as the number of researchers dedicated to the activity) and of the “idea total factor productivity” of those inputs, which is declining. Summing up the overall trend for the US over almost a century, the authors write: 

“Since the 1930s, research effort has risen by a factor of 23 — an average growth rate of 4.3 percent per year. Idea TFP has fallen by an even larger amount — by a factor of 48 (or at an average growth rate of -5.3 percent per year).” 
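The two quoted factors can be checked against the paper’s simple accounting, in which the growth rate of ideas is the product of idea TFP and research effort. A minimal back-of-the-envelope sketch in Python (the roughly 73-year window, from the mid-1930s to the late 2000s, is an assumption made here for illustration, not a figure from the paper):

import math

# Paper's accounting (simplified): growth rate of ideas = idea TFP x research effort.
# The quote gives cumulative factors; back out the implied annual rates.
years = 73                 # assumed window: mid-1930s to late 2000s
effort_factor = 23         # research effort rose 23-fold
tfp_factor = 48            # idea TFP fell 48-fold

effort_growth = math.log(effort_factor) / years    # ~ +4.3% per year
tfp_growth = -math.log(tfp_factor) / years         # ~ -5.3% per year

print(f"research effort: {effort_growth:+.1%} per year")
print(f"idea TFP:        {tfp_growth:+.1%} per year")
# Since growth is the product of the two terms, the growth rate of ideas
# itself must have fallen by a factor of 48/23, i.e. roughly half.
print(f"implied fall in the growth rate of new ideas: {tfp_factor / effort_factor:.1f}x")

In other words, a 23-fold rise in effort against a 48-fold fall in productivity means the growth rate of new ideas itself has roughly halved over the period.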

For example, the famous Moore’s Law is predicated on a doubling of the number of transistors on a silicon chip every two years, leading to ever higher computational capabilities. Incidentally, this advancement has required silicon wafers of ever higher purity and production quality, which, in a Trumpian twist of fate, has led to a Japanese duopoly, Shin-Etsu and Sumco, keeping the entire global electronics industry running. Monsanto of the United States and Wacker of Germany were once valid contenders in this market of geopolitical significance, but withdrew from it when faced with the increasing costs of maintaining a technological edge. Bloom, Jones, Van Reenen and Webb (2016) show that the great players in the industry have had to allocate more and more researchers to R&D to keep Moore’s Law operational. Their data show an increase of around 25 times since 1970 in the number of researchers assigned to the field, used as a proxy for actual R&D spending, which also involves costs for facilities, materials, experimentation and testing.
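Under the paper’s definition, research productivity here is simply the (constant) Moore’s Law growth rate divided by headcount, so an unchanged doubling time bought with 25 times the researchers means a 25-fold fall in idea TFP. A minimal sketch of that arithmetic (the normalization of effort to the early-1970s level is illustrative):

import math

# Moore's Law: chip density doubles every two years, a constant output growth rate.
output_growth = math.log(2) / 2            # ~34.7% per year, unchanged since the 1970s

effort_1970s, effort_today = 1.0, 25.0     # researchers, normalized to the early 1970s

# Idea TFP (research productivity) = output growth rate / research effort.
tfp_1970s = output_growth / effort_1970s
tfp_today = output_growth / effort_today

print(f"idea TFP fell by a factor of {tfp_1970s / tfp_today:.0f}")   # 25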

For better or worse, Moore’s Law has been kept working as planned. Other areas have faced declining returns in spite of higher investment. With world population set to increase tremendously (Africa alone is projected to reach 4.4 billion people in 2100, compared to 1.3 billion today and 0.25 billion in 1950), the decline in agricultural yield growth rates is a worrying trend, since it presages either food insecurity or more extensive use of land (clearing forests and habitats) and of resources like water and chemical fertilizers. 

The area of medicine also shows substantial issues with total factor productivity, visible in the number of new drugs (new molecular entities – NMEs) approved by the Food and Drug Administration (FDA) of the United States. While all countries have their own regulator, FDA approval is the general gold standard for new medicine because, in another Trumpian twist of fate, the American market is the most profitable drug market in the world. Given the exorbitant costs of creating a new drug, FDA approval is the single largest roadblock standing between a new drug and commercial success – the firm recouping its investment and reaping handsome profits in the future as part of its intellectual property portfolio. The United States is also the principal spender on pharmaceutical R&D, both in nominal and in per capita terms, edging out the European Union, whose national authorities, running national health systems as bulk buyers, negotiate steep discounts for drugs, making the American market even more important. The jagged year-to-year pattern of approvals reflects the FDA’s role as final decision maker: the rollout of new drugs is smoothed not by project completion dates, but by the standards and the whims of the regulators.

The cost burden associated with research and regulatory compliance has grown so large that a new business model has emerged, in which firms are created and funded by investors to research a particular drug and win FDA approval, with the specific goal of being bought by a large pharmaceutical company that will bring the product to market and protect its IP. The ones that fail go bankrupt, and the cost of the failed enterprise is borne by venture capitalists and other investors, while the large pharma companies are protected from such shocks (whose repetition would destroy them) and get to pay (a great deal) only for the winners. 

Scannell, Blanckley, Boldon and Warrington discussed some of these economic issues in “Diagnosing the decline in pharmaceutical R&D efficiency”[3], whose abstract states:

 The past 60 years have seen huge advances in many of the scientific, technological and managerial factors that should tend to raise the efficiency of commercial drug research and development (R&D). Yet the number of new drugs approved per billion US dollars spent on R&D has halved roughly every 9 years since 1950, falling around 80-fold in inflation-adjusted terms. There have been many proposed solutions to the problem of declining R&D efficiency. However, their apparent lack of impact so far and the contrast between improving inputs and declining output in terms of the number of new drugs make it sensible to ask whether the underlying problems have been correctly diagnosed. Here, we discuss four factors that we consider to be primary causes, which we call the ‘better than the Beatles’ problem; the ‘cautious regulator’ problem; the ‘throw money at it’ tendency; and the ‘basic research–brute force’ bias. Our aim is to provoke a more systematic analysis of the causes of the decline in R&D efficiency.
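The two figures in the abstract are mutually consistent: a halving every 9 years compounds to roughly an 80-fold decline over the period studied. A quick check (the 57-year span, from 1950 to the late 2000s, is an assumption made here):

# Eroom's Law: drugs approved per billion (inflation-adjusted) US dollars of R&D
# halve roughly every 9 years; compounded over ~57 years this gives ~80-fold.
halving_period = 9          # years
span = 57                   # assumed: 1950 to ~2007
decline = 2 ** (span / halving_period)
print(f"~{decline:.0f}-fold decline in R&D efficiency")   # ~81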

The paper by Bloom et al. also charts the evolution of basic factors related to the effectiveness of cancer research: 5-year survival rates and the number of years of life saved per thousand individuals are the desired outputs, while publications and clinical trials are used as inputs. The authors comment that: 

For figure 10 – “The 5-year mortality rate conditional on being diagnosed with either type of cancer shows an S-shaped decline since 1975. This translates into a hump-shaped “Years of life saved per 1000 people” […] as the mortality rate first rises and then slows. For example, for all cancers, the years of life saved series peaks around 1990 at more than 100 years of life saved per 1000 people before declining to around 60 years in the 2000s.” 

For figure 11 – “Total publications for all cancers increased by a factor of 3.5 between 1975 and 2006 (the years for which we’ll be able to compute idea TFP), while publications restricted to clinical trials increased by a factor of 17.1 during this same period. A similar pattern is seen for breast cancer research.” 

For figure 12 – “The hump-shape present in the years-of-life-saved measure carries over here. Idea TFP rises until the mid-1980s and then falls. Overall, between 1975 and 2006, idea TFP for all cancers declines by a factor of 1.2 using all publications and a factor of 5.9 using clinical trials. The declines for breast cancer are even larger…” 

The authors conclude that “just to sustain constant growth in GDP per person, the U.S. must double the amount of research effort searching for new ideas every 13 years to offset the increased difficulty of finding new ideas.” 
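That 13-year doubling time is the mirror image of the roughly 5.3 percent annual decline in idea TFP quoted earlier: to hold growth constant, effort must rise at the same rate at which productivity decays. A one-line consistency check:

import math

tfp_decay = 0.053                          # aggregate idea TFP falls ~5.3% per year
doubling_time = math.log(2) / tfp_decay    # effort growth needed to offset the decay
print(f"research effort must double every ~{doubling_time:.0f} years")   # ~13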

The findings of the paper also argue against the standard assumptions in many academic growth models, which include constant idea total factor productivity. Finally, “ideas are different from all other goods in that they do not get depleted when used by more and more people.” 

Along with the provocative premise of the paper, this final idea is the most important, as it argues not just in favor of a qualitative leap in ideas as a source of higher living standards and greater per capita growth, but also in favor of a quantitative leap, whereby mechanisms are used to ensure the widespread dissemination of the benefits of new ideas while keeping in mind the necessity of rewarding investment in advancing the knowledge frontiers. As science fiction author William Gibson once wrote: 

“The future is here; it’s just not evenly distributed” 

Keeping in mind that there are still vast populations in the world without the benefit of constant electricity or indoor plumbing – developments going back more than a hundred years in the West, at least in urban areas – there is room for vast improvement in living standards, so long as it is achieved sustainably. Simply handing over the required infrastructure, as has happened in the past with decolonization, may only result in its destruction. A set of tools is required by which the beneficiaries are incentivized to use, maintain and enhance that infrastructure responsibly, whether through micro-lending, education, assured markets and so on. Otherwise, as many policies today are likely to do, we may not only fail to ensure a convergence of welfare, but also end up killing the goose that lays the golden eggs. 

Drawing the conclusions 

The paper argues its point – diminishing returns to scientific investment – well: “For example, between 1985 and 2006, declining idea TFP means that the number of years of life saved per 100,000 people in the population by each publication of a clinical trial declined from more than 8 years to just over one year. For breast cancer, the changes are even starker: from around 16 years per clinical trial in the mid-1980s to less than one year by 2006.” 
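Converted to annual rates, the quoted declines are steep (a rough translation, assuming the 1985–2006 window given in the quote):

import math

span = 2006 - 1985                          # 21 years
all_cancers = math.log(8 / 1) / span        # ~10% decline per year in idea TFP
breast_cancer = math.log(16 / 1) / span     # ~13% decline per year
print(f"all cancers: -{all_cancers:.0%} per year; breast cancer: -{breast_cancer:.0%} per year")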

It does not purport to give an absolute explanation, and its own findings suggest heterogeneity of the production function of new ideas – “Next, however, notice that the changes were not monotonic if we go back to 1975. Between 1975 and the mid-1980s, idea TFP for these two medical research categories increased quite substantially. The production function for new ideas is obviously complicated and heterogeneous. These cases suggest that it may get easier to find new ideas at first before getting harder, at least in some areas.” 

“Eureka” moments are still a valid and necessary tool for advancement, but a series of substantial preconditions must be met – pre-investment in the required fixed capital, trained researchers at the frontier of knowledge (meaning that researchers advancing the cutting edge are older and older compared to previous times) and a willingness to fund research to its uncertain conclusion. It is very possible that the decline measured by the authors corresponds to a model of “peaks and valleys”. Great advances that completely change the paradigm would be the peaks, while the slopes of the valleys represent the decline in idea total factor productivity as the “low-hanging fruit” gets picked and wringing more performance out of a mature idea becomes harder. The decline in idea TFP for cancer research, for instance, might be predicated mostly on finding better uses of the same underlying concepts for treating cancer. However, once the entire field shifts, with the development of accessible gene-editing techniques, the dynamic might be reversed, as an avalanche of new ideas pours forth, made possible by the new underlying technology. The advent of the microchip was a transformative phenomenon for the world of computing, making everything else possible. But speculative research is being done on using other materials instead of silicon as the underlying medium for chip manufacturing. This might lead to a reset of Moore’s Law and the beginning of a new cycle. 

 

One should not neglect the value of lateral development, which is what has effectively been going on in Silicon Valley for the past decade, outside of some important corporate labs. Finding new applications for existing “ideas” (technologies) can create significant wealth and utility for consumers. The development of better satellites might have slowed, or might require ever greater investment in key areas such as onboard atomic clocks, sensors or the underlying models, but the revolution in the use of GPS and Earth observation is only now getting underway, with applications ranging from the “Internet of Things” to precision agriculture. Lateral development also takes an “idea”, in the sense promoted by the authors, and turns it into another idea, bypassing a significant number of steps. This is sometimes related to the cross-fertilization of domains, or simply to improving efficiencies. In medicine, the drug Sildenafil (better known by the brand name Viagra) was used to treat cardiovascular problems until its potential for treating erectile dysfunction was noticed. The drug Finasteride (better known by the brand name Propecia) was used to treat prostate afflictions, including cancer, but had the side effect of stopping and reversing hair loss. 

National considerations 

These findings should be analyzed for their relevance to present and future Romanian research. The tenor of the paper makes it clear that certain forms of fundamental research, if they are to be done effectively, are beyond the sustainable reach of a country with Romania’s resources. Our active engagement in modern research, to the extent that it produces noticeable advancements that translate into wealth, should probably be confined to niches where resources can be concentrated until the result is achieved. These niches may be identified according to the existing skillsets and inspiration of the researcher base, or to the actual requirements of Romanian companies seeking to be competitive abroad. Romanian research has had its “eureka” moments and these, once they have taken place, are worth pursuing to the utmost. But, for the most part, regardless of national pride, it might behoove Romania to focus on incremental improvements and new applications that aid it in the process of catch-up growth. At the same time, there is an argument to be made for research not just with a definite result in mind, but also for the engagement of a host of brilliant minds, which might have notable side results. It could even become a matter of policy to conduct research in order to prevent the emigration of high-IQ people with research potential, slowing the diminishment of the general faculties of the population under the dysgenic selection performed by “brain drains”. The American space and weapons research programs (as well as parts of academia), when not engaged in practical pursuits of immediate value, have been humorously and facetiously described as a jobs program for people with very high IQs and low social skills, designed to retain them in the middle class and in stable families. 

There is an element of exaggeration, but also a kernel of truth, to this claim – one that a country in Romania’s position should consider, especially since the paper at hand leads to another conclusion: in order to keep growing on the basis of a deteriorating “idea TFP”, the inevitable shortages of research personnel must be solved first by cannibalizing other high-IQ fields within the same discipline, second by cannibalizing other high-IQ sectors of the economy through better competition for manpower, and third by competing with other countries for their high-IQ populations – a competition that Romania is destined to lose, absent significant policy shifts. One can express doubts about the rhetoric and historical interpretation of the exploitation of colonies for resources, but it seems more and more likely that strip-mining the developing world of its brain matter is the 21st century (and the worst) form of exploitation. Gradual increases in wealth and complexity compound the issue, as the greatest sources of high-IQ specialists are those places which, starting from an adequately gifted population, are advanced enough to identify and train them, but unable to use their capacities to the fullest. Ideally, a model should be pursued whereby the global flow of ideas, not researchers, is maximized, which adds another positive dimension to increased market size and wealth in developing countries leading to an increase in the number of researchers. If Bloom et al. are correct, then, absent serendipitous developments, future growth hinges on being able to exploit the world’s full brain power and not to waste minds on account of malnutrition, disease, lack of education or the impact of degenerate social models and incentive structures[4].

Finally, one would be remiss not to bring the issue of ethics to the fore. An important ingredient might be missing from our current scientific endeavors, compounding the discrepancy between resources invested and results. As physicist Richard Feynman pointed out in his “Cargo Cult Science” speech[5], science requires a significant level of personal integrity for the scientist to remain true to the principles of the scientific method. One wonders whether that level of integrity is still the norm in many scientific fields. For instance, the social sciences are undergoing a replication crisis[6], wherein the reported results of prior experiments cannot be reproduced by subsequent studies. Many previously announced findings, especially in politically sensitive fields, are being invalidated in this way, after they have, perhaps, done what they were supposed to do. Maybe this is why the stuffy old Victorians and their immediate ideological descendants were the best practitioners of the scientific method. The downward trend in the productivity of resources invested in research may well be correlated, if not causally linked, with a downward trend in research ethics, as induced by the “publish or perish” paradigm or the mechanics of financing and hierarchy in academia and governmental structures.

 

[1] Charles I. Jones, “R&D-Based Models of Economic Growth”, http://web.stanford.edu/~chadj/JonesJPE95.pdf

[2] Nicholas Bloom, Charles I. Jones, John Van Reenen and Michael Webb, “Are Ideas Getting Harder to Find?”, http://www-leland.stanford.edu/~chadj/IdeaPF.pdf

[3] Jack W. Scannell, Alex Blanckley, Helen Boldon and Brian Warrington, “Diagnosing the decline in pharmaceutical R&D efficiency”, http://www.nature.com/nrd/journal/v11/n3/full/nrd3681.html

[4] Like the already well-documented trend of people talented in math and physics going into high finance to engage in zero-sum competitions over who best designs impenetrable and complex financial instruments.

[5] Richard Feynman, “Cargo Cult Science”, http://calteches.library.caltech.edu/51/2/CargoCult.htm

[6] Steve Sailer, “The Replication Crisis and the Repetition Crisis”, http://takimag.com/article/the_replication_crisis_and_the_repetition_crisis_steve_sailer/print#axzz4Uk96YSX0

 