‘Huge costs in development mean that chip designers must get it absolutely right the very first time’
The romantic notion that small is beautiful in manufacturing is passé, especially in the rapidly evolving world of semiconductors. The sheer economics of chip design and manufacturing, exemplified by huge and rapidly increasing capital costs, has shut the field to small players seeking to compete with the likes of Samsung, Intel or the Taiwan Semiconductor Manufacturing Company, the top three global semiconductor companies.
The compacting of chip sizes, driven by the inexorable stretching of Moore’s Law, the decades-old observation that the number of transistors on a microprocessor would double every two years, has had a dramatic impact on the industry.
Shrinking chip sizes
Not only have individual chips shrunk dramatically in size, they now increasingly house entire systems, in what is referred to as a system on a chip (SoC). The number of transistors on a device — defined as transistor count in industry parlance — has increased from 2,300 on the Intel 4004 in 1971 to 5 billion on the 62-core Intel Xeon Phi processor that was released last year.
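A back-of-the-envelope check, using only the figures quoted above (the 1971 Intel 4004 and the 2012 Xeon Phi), shows how closely this growth tracks the two-year doubling cadence — a minimal sketch, not an industry calculation:

```python
import math

# Transistor counts and dates as cited in the article
t_1971 = 2_300            # Intel 4004, 1971
t_2012 = 5_000_000_000    # 62-core Intel Xeon Phi, 2012
years = 2012 - 1971

# How many doublings separate the two chips, and how often one occurred
doublings = math.log2(t_2012 / t_1971)    # roughly 21 doublings
years_per_doubling = years / doublings    # close to Moore's two-year cadence

print(f"{doublings:.1f} doublings, one every {years_per_doubling:.2f} years")
```

The implied doubling period comes out at just under two years, consistent with the trajectory the article describes.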
Meanwhile, developments in material science and new photolithography techniques, which are used to etch complex integrated circuits onto the super-thin surfaces of chips, have enabled a dramatic reduction in the size of individual circuit features. All this has, of course, been done while concurrently expanding the capability of the chip. Feature sizes, which were 10 microns in 1972, are now down to 22 nanometres (a nanometre being 1,000 times smaller than a micron).
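To get a feel for what that shrink implies, transistor density scales roughly with the inverse square of the feature size. The following is a first-order sketch assuming simple planar scaling, using only the two node sizes quoted above:

```python
# Feature sizes cited in the article, expressed in nanometres
node_1972_nm = 10_000   # 10 microns in 1972
node_now_nm = 22        # 22 nm today

# Linear shrink in feature size, and the (approximate) gain in how many
# devices fit in the same area, which grows with the square of the shrink
linear_shrink = node_1972_nm / node_now_nm
area_density_gain = linear_shrink ** 2

print(f"Features ~{linear_shrink:.0f}x finer; "
      f"~{area_density_gain:,.0f}x more devices per unit area")
```

The result — features roughly 450 times finer, and on the order of 200,000 times more devices per unit area — illustrates why capability could expand so sharply even as chips stayed small.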
The fabrication process has become so sophisticated and so expensive that only a handful of major semiconductor companies in the world are able to access the capital needed to establish facilities to design and manufacture chips, says Ronald D. Black, president and CEO of Rambus, an American company with a market capitalisation of over $1 billion that started as a memory solutions provider but later ventured into other areas such as LED lighting.
‘Rise in risk’
“The scale of manufacturing is only one aspect of the problem. An even more serious issue is the staggering rise in the risk involved,” observes Dr. Black. Moreover, non-wafer costs, especially those related to photomasks, which define the steps in the lithographic process of fabrication, have risen sharply. “But, above all, huge costs in development mean that chip designers must get it absolutely right the very first time. The heavy costs involved simply allow no margin of error. Mistakes will not cost a couple of hundred dollars but tens of millions of dollars, and may even bankrupt a company.”
While the rising cost of development has resulted in a form of extreme concentration in the industry, companies seek to reinforce their clout by using their hold over intellectual property to ring-fence their status and thwart competition. The average number of IP blocks on an SoC, for instance, has increased from around 15 a decade ago to about 90 in 2012, says Dr. Black.
In every decade since the 1960s, a cluster of technologies — consumer as well as industry — has driven the global semiconductor industry (see graphic). Looking to the future, he asks whether the Internet of Things (IoT) will drive growth in the next decade. The explosive growth of Big Data, driven by the plummeting cost of storage, while being a prime driver, also poses challenges, he warns.
However, he warns that IoT also has to come to terms with the “incredibly complex ecosystems” that generate Big Data. Delivery of services in an IoT schema will be challenging because their design will hinge on foolproof connectivity and “hardened” applications that perform in a mission-critical mode all the time (you would not want to reboot a system in your car, for instance, he jokes). In all this, cost is not merely a critical element; in many cases it will decide whether a design can be used at all (for instance, you could not rig a $20 modem to a 10-cent sensor and believe that it would sell, he observes).
The rather pessimistic outlook for a semiconductor startup painted by Dr. Black was not taken very kindly when he addressed leaders of Indian companies at a meeting organised by the Indian Electronics and Semiconductor Association (IESA) recently. He pointed out that not very long ago, a company that made an investment of $50 million could expect to generate revenues soon. Now, the minimum level of investment would be of the order of $250 million. This, he said, was the prime reason why venture capital funding for semiconductor startups had dried up.
In contrast to software, where Indian companies can use labour cost arbitrage as a driver of growth, in the semiconductor business labour costs are too small a share of total costs to provide an advantage to Indian companies. Even having a low-cost semiconductor design team is only a small advantage because it is much more important “to get things right the first time, every time,” he says.
Dr. Black has some interesting thoughts based on a comparison of the industry in India and China. Although China is also at the lower end of the curve in both semiconductor design and software development, its “strong linkages” with manufacturing, based on its “State-run capitalism”, lay the basis for “significant progress” within a short period of time.
The key to success in semiconductor designing and manufacturing in India lies in the development of a viable ecosystem, he says. But the crucial element is “a strong local demand”, an obvious reference to India’s heavy dependence on imports. “It would be a mistake,” he warns, “to just use India as a low-cost outsourcing centre.”