wbmw

08/11/04 1:53 PM

#41937 RE: blauboad #41936

Re: wouldn't leakage increase the harder the processor is worked? You seemed to imply that it was a constant.

Leakage comes from current flowing through transistors when they are switched off. So no, it does not increase the harder a processor is worked.

It does, however, vary between different die in the same process. In fact, you can have two die right next to one another on a wafer with different leakage values. And on a process where average leakage is much higher, you will see a larger variance between parts: to take an extreme example, my Prescott may run 20W cooler than your Prescott. As far as the spec sheet goes, however, Intel has to list the worst-case values, and so does AMD. It might be one Prescott in a million that actually dissipates 115W under load.

Given that, you'll find that the higher-binning (higher-frequency) chips are also the lowest-leakage chips. Intel or AMD can trade low leakage for higher frequency and end up with similar power characteristics. That's also why you no longer see different TDP values at different speeds: the P = CV^2f formula is no longer accurate now that leakage is a larger contributor.
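To make that concrete, here is a minimal sketch in Python. All the numbers (C_EFF, VDD, FREQ, and the leakage spread) are assumptions chosen for illustration, not real Prescott specs; the point is only that two identical designs share the same P = CV^2f dynamic power but can land at very different totals once leakage is added.

```python
# Dynamic power follows P = C*V^2*f; leakage adds on top and varies per die,
# which is why the formula alone no longer predicts TDP.
# All constants below are assumed values for illustration only.

def dynamic_power(c_eff, vdd, freq):
    """Classic switching-power formula: P = C * V^2 * f."""
    return c_eff * vdd**2 * freq

C_EFF = 15e-9   # effective switched capacitance in farads (assumed)
VDD = 1.4       # core voltage in volts (assumed)
FREQ = 3.2e9    # clock frequency in hertz (assumed)

p_dyn = dynamic_power(C_EFF, VDD, FREQ)  # identical for both dice

# Two dice from the same wafer can leak very differently (assumed spread):
leak_low, leak_high = 15.0, 35.0  # watts

print(f"dynamic power:           {p_dyn:.1f} W")
print(f"low-leakage die total:   {p_dyn + leak_low:.1f} W")
print(f"high-leakage die total:  {p_dyn + leak_high:.1f} W")
```

With these made-up numbers, both dice switch about 94 W dynamically, but the totals come out roughly 20 W apart, which is why the datasheet has to quote the worst case.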

Anyway, most of the above is tangential to your question, but hopefully it helps.
chipguy

08/11/04 5:02 PM

#41952 RE: blauboad #41936

Understood, but wouldn't leakage increase the harder the processor is worked?

Only indirectly: most leakage components increase exponentially with junction temperature. If an MPU dissipates more power with the same cooling system, its operating temperature will be higher, so leakage will be higher. But if the cooling system's Theta[JA] is improved proportionately with the higher power, the higher power does not mean higher leakage.