From R&d to r&D

Posted by Peter Cochrane on December 1, 2003

Only five years ago investment in R&D was considered a healthy essential for the longevity of industry and society. No matter where you went in the world you would find huge investments realising a steady flow of new and exciting technologies.

The preceding decades had witnessed microwave radio, microelectronics, optical fibre, lasers, cellular radio, the PC, the internet, and almost all the seminal technologies and applications on which we now rely. So what happened?

Just mention research investment today and it is like revealing you have a dose of the plague. The focus is now 100 per cent development - picking winners is the only game in town, and launching products assured of success and a good return on investment is the primary gamble. So how did we get here, and can we be assured of a bright future without fundamental research?

It might seem paradoxical that the vast majority of R&D organisations associated with IT have been drastically reduced or closed down when the last year has seen more new mobile phone offerings than all the products delivered by the entire electronics industry throughout the 1950s. The telecoms and IT sector is turbo-charged to satisfy an insatiable customer base by turning out more products year on year, at the same price or less, with ever more facilities, better performance, greater service offerings… and rapidly narrowing profit margins.

The point is that the IT and telecoms sector has matured and its products have become commoditised. Moreover, while we have accumulated vast knowledge and the ability to create more and more products year on year, the extrapolation is mostly linear: more of what we already have, faster, smaller and cheaper, but without any fundamentally new innovations. In 1984, for example, it was possible to install optical fibre to the home at a lower cost than copper; by 1987 coherent optical systems were working in the field; and by 1990 optical wireless systems were almost the equal of the established microwave base. Yet most of these innovations have still not reached the market.

So if we have technologies 'good to go' in the lab, why can't we get them straight to the marketplace? Two fundamental time delays seem to be in the way. The first concerns the individual and the purchase of white and brown goods: the average time it takes for a product to break down, and the average time between failure and replacement. Early adopters spend huge amounts of money up front, whilst laggards wait to buy at the lowest prices. Roughly speaking, the spread across the entire consumer base results in the magic number of ~10 years. To be specific: it took 12 years for the mobile phone to eclipse the fixed-line telephone; eight years for colour TV to eclipse black and white; 11 years for the digital camera to overtake the analogue; and nine years for the commercial internet to become established as a dominant force. The washing machine, freezer and microwave, on the other hand, seem to be on a ~5-year change cycle.
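As a quick sanity check, the "magic number" is simply the arithmetic mean of the four adoption times quoted above - a minimal sketch (the labels are mine; the figures are from the text):

```python
# Years quoted for each new technology to eclipse its predecessor.
adoption_years = {
    "mobile phone vs fixed-line telephone": 12,
    "colour TV vs black and white": 8,
    "digital vs analogue camera": 11,
    "commercial internet as dominant force": 9,
}

# The spread across the consumer base averages out to ~10 years.
mean_years = sum(adoption_years.values()) / len(adoption_years)
print(mean_years)  # 10.0
```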

The second time delay is much longer and concerns the commercial sector, with its multi-billion-dollar investments in infrastructures of networks, switches, routers, and storage systems. These tend to have longer time constants of ~20 years. When capital outlay is measured in billions of dollars and has to be written down and replaced on the back of telephone calls and services, design, operational, engineering, and financial considerations become dominant in the delay equation.

About five years ago it became apparent that the R&D community had got well ahead of what the market could absorb. Moreover, in the case of telecoms, some revelations were unpalatable to the copper mindsets of an industry with a 100-year corporate history. It wasn't just the arrival of the PC wiping out the mainframe, or the internet becoming a threat to the phone companies; it was far more brutal. It was the sudden realisation that networks and networking companies had a lesser role in the future, and that the number of people employed would have to fall by ~70 per cent to maintain profitability. Even worse, it became apparent that the services and facility companies living at the edge of networks were set to prosper, and would most likely widen their margins.

In short, telecommunications and IT were on the same track to marginalisation already travelled by the food and travel industries. This was not a happy prospect for organisations with long and profitable histories, and in such an environment R&D tends to go to the wall - fast.

At the same time a new social paradigm was becoming evident. If people could install and maintain for themselves a washing machine, tumble drier, microwave, hi-fi, radio, TV, and PC, why couldn't something much simpler, like a phone line and networking, also be merged into the home and office infrastructure at a much lower cost?

Today wired and wireless home and office networking has arrived, along with the predominance of the mobile phone, and all under the control and say-so of the customer. Network centricity is being wound down and more is being done at the network edge as society becomes more versed in, and adapted to, technology. So how much R&D do you actually require for such a world, which seems to have moved from a predominantly hardware bias to a software and services orientation?

The answer has to be - not a lot. Certainly the sort of money spent on R&D in the 1970s and 1980s to support revolutionary technologies for telecommunications and IT is unlikely to be required again for some time. The world has moved on, and the centre of gravity and focus now lies at the interface of biology, machines and people.

This new transition, powered by the ever-growing ability of our computers and networks, will be bigger than the Industrial and IT revolutions that went before. It is a move from dumb to smart materials; a move from forging, casting, stamping out, drilling and milling, etching and construction, to a world of programmable materials and things. Decoding the genome was only possible because we had the raw computing power; decoding proteins will be the same, but even bigger and more heroic. All the insights we have gained from biological and bulk material processes lead us towards nanotechnologies and programmable materials and devices. As a result, much more R&D in future seems likely to be realised economically across start-ups, universities and much smaller laboratory regimes than hitherto.

A principal, and largely unrecognised, reason for the dramatic change in the R&D landscape is therefore the power of the technology itself. I well recall that my PhD took me a full three years of very hard work using a slide rule. Today I could do it in a mere three months with my PC and net connection. This seems to me to be a key element of the change we now face. In many areas we no longer require legions of human beings to do our R&D; we can do it with far fewer people empowered by legions of computers and networks.

So what of the future? As an ex-R&D manager and director I do not feel downhearted by the current situation - quite the reverse: I feel excited and elated. In some respects the old R&D laboratories failed to see what was coming and failed to change ahead of time. Their closure was brought about by a combination of growing management and market ignorance, which could not see the power that R&D had brought them in the past and could not imagine what the future might be. Given that failure of realisation, we will no doubt see a continued culling of R&D across successive sectors. But on the upside, new companies exploiting rafts of new technology have sprung, and will continue to spring, from the cullings. Some of these technologies were visible 15-20 years ago, but the vast majority have yet to be discovered.

In conclusion, bulk research will be replaced by distributed research - perhaps until the time comes again when we need massively focused resources. That time is not yet.

Peter Cochrane was head of research at BT until 2000 when he left to join ConceptLabs, a company he had co-founded in 1998 with a group from the Apple Advanced Technology Group in Silicon Valley.