
Wednesday, 12/18/2002 5:48:21 PM
Interesting... The Digital Dilemma
George Gilder, Gilder Technology Report, 12.18.02, 4:56 PM ET

NEW YORK - Why are we bringing up a generation of kids who don't know physics but know everything about Windows? Why are there entire nations, such as India, whose economies are increasingly devoted to this and other totemistic excesses of software? Why has software become the medium through which we deal with the physical world?

We fly airplanes with software; our bombs hunt our enemies with software; we run switches with software whose annual upgrades are the single largest operating cost of running a network. Across the global economy, we ritualistically do in software functions that could be far better accomplished with application-specific hardware; the all-optical network is perhaps the supreme example.

The science of application-specific hardware has atrophied in part because every young information scientist is taught that the physical layer doesn't matter to the universal computer. But since the challenges the world gives us are messy, the decision to use a generalized machine to solve them necessarily entails a parallel and ponderous effort to represent the specificity of the world in the machine's terms--the software. Software is proverbially the bottleneck of the information economy--because under the Turing model that's where all the work is done.

And what work it is to represent to that universal computer all the problems of the world, natural and man-made alike, using a language that itself moves ever farther away from any physical primitive, rising above machine language to assembly language to ordinary programming language and thence to the hyper-programming language. Each level, no matter how great and complex the tasks it addresses, masks complexities that must ultimately be resolved on the chip by exploiting the tremendous processing clock-rates to accomplish hugely complex procedures.

The digital crisis is so pervasive we have begun to assume it as part of the background. Six hundred thousand bugs in Windows XP from Microsoft (Nasdaq: MSFT) is a crisis. Winnowing them down to two hundred thousand bugs is a crisis that has gone chronic, to be coped with rather than resolved. Windows XP Home has some paradoxical bugs that can't be eliminated without transforming the program. Mega-software has reached some kind of wall, one manifestation of the crisis.

Another manifestation can be seen in the Pentium as it moves up toward 60 gigahertz, which Intel (Nasdaq: INTC) now proposes as a feasible goal. Dynamic power increases linearly with clock-rate and quadratically with voltage. Voltage has declined to the point where further cuts increase leakage current faster than they relieve switching power. So there is a real question of whether we can continue to increase the clock-rates that mega-software increasingly demands.
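The scaling claim can be made concrete with the standard dynamic-power relation for CMOS, P ≈ αCV²f. The sketch below uses illustrative round numbers of my own, not measured chip data:

```python
# Back-of-envelope dynamic CMOS switching power: P ~ alpha * C * V^2 * f.
# Capacitance, voltage, and frequency values below are illustrative only.

def dynamic_power(capacitance_f, voltage_v, freq_hz, activity=0.1):
    """Dynamic switching power of a CMOS chip, in watts."""
    return activity * capacitance_f * voltage_v ** 2 * freq_hz

base = dynamic_power(capacitance_f=1e-9, voltage_v=1.2, freq_hz=3e9)

# Doubling the clock doubles power (linear in f)...
assert abs(dynamic_power(1e-9, 1.2, 6e9) / base - 2.0) < 1e-9

# ...while doubling the voltage quadruples it (quadratic in V).
assert abs(dynamic_power(1e-9, 2.4, 3e9) / base - 4.0) < 1e-9
```

This is why designers cut voltage rather than frequency to contain power; the bind Gilder describes is that below a certain voltage, leakage (which this simple model omits) dominates the savings.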

Sacrificing Efficiencies
As hierarchical design, the very process that shielded us from the growing complexity on the surface of the chip, ascends multiple levels of abstraction, it becomes impossible to test all the resulting designs in all their possible combinations. So you must incorporate built-in self-test, devoting more and more of the processor to testing itself, and even then you don't test it adequately.

The tests become increasingly tests of interfaces. Since those cannot be fully assured as the chip gets bigger and bigger, you include a lot of redundant cells. The structures for incorporating the redundant cells become themselves increasingly complex. As this process advances, the device becomes increasingly suboptimal. At some point it becomes inferior to using a set of separate chips of a manageable size and modularity--reversing the essential teleology of the integrated circuit. But that doesn't solve the problem; it merely shifts the complications and conflicts to the bus.

At current speeds and densities, the universal clock doesn't work anymore, so you have to have separate clock pulses all across the chip, sacrificing many of the fundamental efficiencies of the digital system. Asynchronous designs are a partial and valuable solution. But in isolation every one of these problems can appear solvable. Taken together they entail a set of fundamentally irresolvable conflicts that suggests the whole digital endeavor is reaching an impasse. The clock problem, the power problem, the leakage problem, the interface problem, the pad-limited problem, the failure of memory technology to keep pace with processor clock-rates, so that most of the clock cycles are wait-states--all of these together represent a technology in climacteric.

Moore's Law In Crisis
My colleagues, Dynamic Silicon editors Nick Tredennick and Brion Shimamoto, have been wandering around the office with graphs showing not that Moore's Law is reaching the end of its run, but that its continuation may be irrelevant. They point out that the last four generations of chip geometries--0.25 microns, 0.18 microns, 0.13 microns, and now 0.09 microns (90 nanometers)--account for only 20% of chips made by the major foundries such as TSMC. The adoption curves for each new cycle of Moore's Law used to be nearly vertical--as soon as we could squeeze more circuits on a chip, everybody wanted the capacity. Today the adoption curves for new technologies are nearly horizontal, even though theoretically the marginal cost to make a 90-nanometer function on a 300-millimeter wafer is less than 20% of the cost of a 130-nanometer chip made on a 200-millimeter wafer.
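The cost figure can be roughly checked with geometry alone. The sketch below assumes an ideal linear shrink, equal wafer cost, and perfect yield--simplifications of mine, not the article's:

```python
# Back-of-envelope: how many more functions fit per wafer after moving
# from a 130 nm process on 200 mm wafers to 90 nm on 300 mm wafers.
# Assumes ideal linear shrink and ignores edge loss, yield, wafer cost.

def functions_per_wafer(wafer_mm, node_nm, ref_wafer_mm=200, ref_node_nm=130):
    area_gain = (wafer_mm / ref_wafer_mm) ** 2    # usable wafer area scales as r^2
    density_gain = (ref_node_nm / node_nm) ** 2   # functions per area under linear shrink
    return area_gain * density_gain

gain = functions_per_wafer(300, 90)   # about 4.69x more functions per wafer
cost_ratio = 1 / gain                 # about 0.21 if wafer cost were equal
print(round(gain, 2), round(cost_ratio, 2))
```

On these idealized assumptions the marginal cost per function comes out near 21% of the older generation's, broadly consistent with the "less than 20%" figure; in practice 300-millimeter wafers cost more to process, which the sketch ignores.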

We spent quite a bit of time in the office recently trying to explain this through the Clayton Christensen overshoot theory (the personal computer already over-serves its real market), through mismatch theory (memories cannot keep up with the processor cycle times), and design complexity (design tools have once again fallen behind the complexities of single-chip electronics). In any case, it seems that the bounty of Moore's Law, which for so long appeared to drive the information industry, is increasingly shunned. Whatever the explanation, the phenomenon tends to confirm the existence of a crisis of digitization.

Excerpted from the November 2002 edition of Gilder Technology Report
