
Re: chipguy post# 71344

Sunday, 11/16/2008 11:59:14 PM

Post# of 151805

http://www.nytimes.com/2008/11/17/technology/companies/17chip.html?_r=1&oref=slogin&pagewanted=print

November 17, 2008

Burned Once, Intel Prepares New Chip Fortified by Constant Tests

By JOHN MARKOFF

HILLSBORO, Ore. — Rows and rows of computers in Intel’s labs here relentlessly torture-tested the company’s new microprocessor for months on end.

But on a recent tour of the labs, John Barton, an Intel vice president in charge of product testing, acknowledged that he was still feeling anxious about the possibility of a last-minute, show-stopping flaw.

After all, even the slightest design error in the chip could end up being a billion-dollar mistake.

“I’m not sleeping well yet,” Mr. Barton said.

Intel’s Core i7 microprocessor, code-named Nehalem, which goes on sale Monday, has already received glowing technical reviews. But it is impossible for Mr. Barton to predict exactly how the chip will function in thousands of computers running tens of thousands of programs.

The design and testing of an advanced microprocessor chip is among the most complex of all human endeavors. To ensure that its products are as error-free as possible, Intel, based in Santa Clara, Calif., now spends a half-billion dollars annually in its factories around the world, testing the chips for more than a year before selling them.

There is good reason for the caution. In 1994, the giant chip maker was humbled by a tiny error in the floating point calculation unit of its Pentium chips. The flaw, which led to an embarrassing recall, prompted a wrenching cultural shift at the company, which had minimized the testing requirements of the Pentium.
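
The arithmetic at the heart of that recall is easy to revisit. Below is a small Python sketch of the division most commonly cited in accounts of the 1994 FDIV bug; the operands and the flawed result are the widely reported figures, not numbers taken from this article.

# The division most often cited for the 1994 Pentium FDIV bug.
# A correct FPU returns roughly 1.333820449; affected Pentiums
# reportedly returned roughly 1.333739069, wrong in the fourth
# decimal place.
numerator = 4_195_835
denominator = 3_145_727

correct = numerator / denominator
reported_flawed = 1.333739068902  # value commonly reported for flawed chips

print(f"correct result : {correct:.12f}")
print(f"flawed Pentium : {reported_flawed:.12f}")
print(f"relative error : {abs(correct - reported_flawed) / correct:.2e}")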

A series of bugs last year in the Barcelona microprocessor from Intel’s main competitor, Advanced Micro Devices, was equally devastating.

A.M.D., based in Sunnyvale, Calif., had been making steady progress, offering new processor technologies long before Intel and handily winning the power-efficiency war. But the quality problems that slammed A.M.D. cost the company revenue for several quarters and landed it in a hole from which it has yet to dig out.

If Nehalem is a hit for Intel, it will represent vindication for Andrew Grove, the company’s former chief, who acknowledged that he had been blindsided by the Pentium problems and then set out to reform the company.

The Pentium bug badly damaged Intel’s brand with consumers. The company quickly became a laughingstock as jokes made the rounds of the Internet: Q: Know how the Republicans can cut taxes and pay the deficit at the same time? A: Their spreadsheet runs on a Pentium computer.

After initially appearing to stonewall, Intel reversed course and issued an apology while setting aside $475 million to pay for the recall.

The company put Mr. Grove’s celebrated remark about the situation on key chains: “Bad companies are destroyed by crisis. Good companies survive them. Great companies are improved by them.”

Those words weigh heavily on the shoulders of Mr. Barton and his colleagues — as does the pressure from Intel’s customers around the world whose very survival is based on the ability to create new products with the company’s chips at their heart. Nehalem is initially aimed at desktop computers, but the company hopes it will eventually be found in everything from powerful servers to laptops.

“Our business model is now predicated on saying to the consumer, ‘You will get a new set of functionality by a particular date,’ ” Mr. Barton said. “We did get a new dimension of business pressure that says we can’t take our merry time turning it out whenever we feel like it.”

The pressure for a successful product is especially intense now as the overall technology industry faces a serious slump. Intel’s chief executive, Paul S. Otellini, said last month that the company was getting “mixed signals” from customers about future spending. Intel’s stock fell 7.7 percent on Friday to $13.32, a six-year low, in a broad market drop.

With Nehalem, Intel’s designers took the company’s previous generation of chips and added a host of features, each of which adds complexity and raises the possibility of unpredictable interactions.

“Now we are hitting systemic complexity,” said Aart de Geus, chief executive of Synopsys, a Silicon Valley developer of chip design tools. “Things that came from different angles that used to be independent have become interdependent.”

Trying to define the complexity that Mr. Barton and his team face is itself a challenge. Even in the late 1970s, chips were being designed that were as complicated as the street map of a large city.

Mr. Barton’s love affair with the world of electronics began as a child, when he took apart a walkie-talkie his father had given him and counted its transistors: a total of seven. The change in his lifetime, he said, has been “mind-boggling.”

Going from the Intel 8088 — the processor used in the IBM PC 27 years ago — to the Nehalem involves a jump from 29,000 transistors to 731 million, on a silicon chip roughly the same size.

Mr. Barton equates the two by comparing a city the size of Ithaca, N.Y., to the continent of Europe. “Ithaca is quite complex in its own right if you think of all that goes on,” he said. “If we scale up the population to 730 million, we come to Europe as about the right size. Now take Europe and shrink it until it all fits in about the same land mass as Ithaca.”
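
The arithmetic behind Mr. Barton's analogy is easy to check. Here is a minimal Python sketch; the Ithaca population of roughly 30,000 is an assumed round figure, not one given in the article.

# Back-of-envelope check of the transistor-count comparison.
transistors_8088 = 29_000          # Intel 8088, used in the original IBM PC
transistors_nehalem = 731_000_000  # Core i7 "Nehalem"

print(f"transistor growth factor: {transistors_nehalem / transistors_8088:,.0f}x")

# Barton's analogy: scale a city of roughly Ithaca's size up to Europe's population.
ithaca_population = 30_000         # assumed round figure, not from the article
europe_population = 730_000_000    # roughly the figure Barton uses
print(f"population scale factor : {europe_population / ithaca_population:,.0f}x")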

Even given a lifetime, it would be impossible to test more than the smallest fraction of the total possible “states” that the Nehalem chip can be programmed in, which are easily more plentiful than all the atoms in the universe.
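
That claim is a matter of simple exponentials: the number of distinct settings of even a few hundred bits of on-chip state already exceeds the commonly used estimate of about 10^80 atoms in the observable universe. A rough Python sketch:

# How many bits of state are needed before the number of possible
# configurations (2**bits) exceeds ~1e80, a common estimate of the
# number of atoms in the observable universe?
ATOMS_IN_UNIVERSE = 10 ** 80

bits = 1
while 2 ** bits <= ATOMS_IN_UNIVERSE:
    bits += 1

print(f"{bits} bits suffice: 2**{bits} is about {2.0 ** bits:.2e}")
# A chip with hundreds of millions of transistors holds vastly more
# state bits than this, so exhaustive testing is out of the question.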

Modern designers combat complexity by turning to modular design techniques, making it possible to simplify drastically what needs to be tested.

“Instead of testing for every possible case, you break up the problem into smaller pieces,” said G. Dan Hutcheson, chief executive of VLSI Research, a semiconductor consulting firm.
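
A toy example of the payoff, assuming two hypothetical 8-bit modules with independent inputs; the widths are chosen only to make the collapse in test count visible.

# Exhaustively testing a combined 16-bit input space versus testing
# two independent 8-bit modules separately.
combined_cases = 2 ** 16            # every input to the integrated block
modular_cases = 2 ** 8 + 2 ** 8     # every input to each module on its own

print(f"integrated block: {combined_cases:,} test cases")  # 65,536
print(f"two modules     : {modular_cases:,} test cases")   # 512

The saving compounds with every additional module, which is why verification teams lean so heavily on clean, well-specified interfaces between blocks.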

After the Pentium flaw, Intel also fundamentally rethought the way it designed its processors, trying to increase the chance that its chips would be error-free even before testing. During the late 1990s it turned to a group of mathematical theoreticians in the computer science field who had developed advanced techniques for evaluating hardware and software, known as formal methods.

“For several years Intel hired everyone in the world in formal methods,” said Pat Lincoln, director of SRI International’s Computer Science Laboratory.
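
Formal methods aim to prove a block equivalent to its specification rather than to sample test cases. Intel's actual tools are symbolic and proprietary; the Python sketch below is only a toy stand-in that exhaustively checks a 4-bit ripple-carry adder against ordinary integer addition, with all names invented for illustration.

# Toy "verification" of a 4-bit ripple-carry adder: the state space is
# tiny, so every input can be checked against the specification
# (ordinary integer addition). Real formal methods reason symbolically,
# so the same guarantee holds for spaces far too large to enumerate.

def full_adder(a: int, b: int, carry_in: int) -> tuple[int, int]:
    """One-bit full adder: returns (sum_bit, carry_out)."""
    total = a + b + carry_in
    return total & 1, total >> 1

def ripple_carry_add(x: int, y: int, width: int = 4) -> int:
    """Add two width-bit integers bit by bit, dropping the final carry."""
    carry, result = 0, 0
    for i in range(width):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

# Exhaustive check against the spec for every 4-bit operand pair.
for x in range(16):
    for y in range(16):
        assert ripple_carry_add(x, y) == (x + y) % 16, (x, y)
print("4-bit adder matches the specification on all 256 input pairs")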

The Intel designers have also done something else to help defend against the errors that will inevitably sneak into the chip. Nehalem contains a significant amount of software that can be changed even after the microprocessor leaves the factory. That gives the designers a huge safety net.
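
Conceptually, that safety net is an indirection layer: fixed logic dispatches through an updatable table, so behavior can be corrected in the field. The sketch below illustrates only the idea; it is not a description of Intel's actual microcode-update mechanism, and every name in it is hypothetical.

# Conceptual sketch of field-updatable behavior: the "hardware"
# dispatches through a table that a later update can overwrite, so a
# flawed operation can be corrected after the part has shipped.

def buggy_increment(x: int) -> int:
    return x + 2      # hypothetical flaw that escaped testing

def fixed_increment(x: int) -> int:
    return x + 1      # corrected behavior delivered as an update

patch_table = {"increment": buggy_increment}   # state as shipped

def execute(op: str, value: int) -> int:
    return patch_table[op](value)

print(execute("increment", 41))                # 43 -- wrong, as shipped
patch_table["increment"] = fixed_increment     # apply the field update
print(execute("increment", 41))                # 42 -- corrected in the field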

It is one that Mr. Barton and his team are hoping they will not have to use.
