If you’ve ever wondered what temperature your processor should be running at, consider the following: computer processors are primarily made of semiconductors, so there’s a fairly narrow temperature range at which they work well. Unlike metal conductors, they require some heat to work correctly—but just like metal conductors, too much heat can destroy them. You need to find just the right balance.
How Semiconductors Use Heat
All home and office electronics use simple metal wires (usually copper) to transmit electricity because many common metals transmit electricity very reliably and some metals (such as copper) are fairly cheap. But your computer processor does more than just transmit electricity; it also performs very basic math using just zeros and ones. To perform that math, your computer processor has to turn electrical connections on and off.
Because metal transmits electricity so reliably, your computer would have a very difficult time turning metal electrical connections on and off. But semiconductors are less reliable conductors than metal; that's why they're called semiconductors: they only conduct electricity under certain circumstances.
If your computer can control the circumstances in which a semiconductor works, it can turn that semiconductor connection on and off quite easily. Different semiconductors require different circumstances, but the majority of semiconductors in your computer processor are activated or deactivated by heat. The colder they get, the less electricity they transmit; the hotter they get, the more electricity they transmit.
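The on/off behavior described above can be sketched as a toy model. This is purely illustrative: the function name is invented, and the 150C threshold is borrowed from the example the article uses later.

```python
# Toy model of the article's heat-controlled semiconductor switch.
# The 150 C threshold matches the example used later in the article;
# real transistor behavior is considerably more complicated.
ON_THRESHOLD_C = 150.0

def conducts(temperature_c: float) -> bool:
    """Per the article's model: hot enough -> conducts, cooler -> doesn't."""
    return temperature_c >= ON_THRESHOLD_C

print(conducts(151.0))  # True: the switch is "on"
print(conducts(149.0))  # False: the switch is "off"
```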
A 1 GHz computer processor is designed to perform 1 billion operations each second by managing the heat of the semiconductors in your computer. When the processor wants to turn a switch on, it heats the switch with a bit of spare electricity until it starts conducting. When the processor wants to turn a switch off, it stops sending spare electricity so that the switch cools down and turns off.
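The timing implied by that clock speed is simple arithmetic: at 1 GHz, each on/off cycle has to complete within one clock period.

```python
# At 1 GHz, each operation (one heat-up/cool-down cycle in the
# article's model) must finish within one clock period.
clock_hz = 1_000_000_000   # 1 GHz
cycle_seconds = 1 / clock_hz

print(cycle_seconds)  # 1e-09, i.e. one nanosecond per cycle
```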
How Semiconductors Break When It’s Too Hot Or Too Cold
A typical semiconductor in your computer processor may turn on at 150 degrees Celsius (C), which is roughly 300 degrees Fahrenheit (F). If the temperature of the insulator around the semiconductor goes above 150C (300F), the semiconductor can never cool back down to its off temperature, and your computer processor will completely stop working.
In real life, most computers include temperature sensors and a little bit of code which tells them to immediately shut down without warning if the computer processor gets above a particular temperature, such as 135C (275F). If this code didn't work, some semiconductors in your computer processor would get hotter than others first, and you would start to see weird problems which would probably end in a Windows error (blue screen of death).
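A minimal sketch of that protective logic might look like the following. The sensor-reading function is hypothetical (stubbed with a fixed value here), and the 135C limit is just the article's example; real systems do this in firmware or hardware, not in an application loop.

```python
SHUTDOWN_LIMIT_C = 135.0  # example limit from the article

def read_cpu_temp_c() -> float:
    """Hypothetical sensor read. A real machine would query a hardware
    thermal sensor; here we stub it with a fixed, safe value."""
    return 62.0

def must_shut_down() -> bool:
    """Return True if the processor has crossed the emergency limit."""
    return read_cpu_temp_c() > SHUTDOWN_LIMIT_C

print(must_shut_down())  # False at a comfortable 62 C
```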
What about your computer processor getting too cold? If the typical semiconductor doesn't turn on until it reaches 150C (300F), how does it turn on when you boot your computer? (I assume your computer isn't sitting in a room far above the boiling temperature of water.) The answer is simple: your computer has a tiny built-in hardware chip which initializes your computer processor by heating part of it to operational temperature and then running its boot-up code.
Once your computer processor starts working, it can keep itself warm enough to operate by using spare electricity from your computer power supply.
How Cold Temperatures Can Benefit Your Computer Processor
As mentioned earlier, a 1 GHz computer processor performs 1 billion operations a second, but that's not all it's capable of doing. Intel or AMD or whoever built your computer processor chose that 1 GHz speed based on the temperature of the insulator they thought your computer would use. The insulator for most home computers is air, but some people also use water-cooled computer processors and more exotic cooling chemicals (such as Freon).
The cooler the insulator around your computer processor, the faster a semiconductor can go from on (hot) to off (cold). Whoever built your 1 GHz computer processor assumed that it would take less than 1 billionth of a second for that semiconductor to cool down. But if you use extra cooling, maybe you can get that semiconductor to cool down to its off position in half the time—and that means you can double the speed of your computer processor, a process called overclocking.
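The arithmetic behind that claim is easy to check under the article's simplification that clock speed is limited only by cool-down time: halve the cool-down, double the frequency.

```python
# The article's simplification: maximum clock speed is limited by how
# long a semiconductor takes to cool back to its "off" state.
base_hz = 1_000_000_000            # a 1 GHz processor
cooldown_budget_s = 1 / base_hz    # about one nanosecond per cycle

# With better cooling, suppose the cool-down takes half as long:
overclocked_hz = base_hz * 2

print(overclocked_hz / 1e9, "GHz")  # 2.0 GHz
```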
Almost no consumer computer processors automatically overclock themselves, and this article doesn’t cover overclocking. Just understand that your computer processor will work great no matter how cold you make it.
So How Hot Can My Computer Get?
The hotter your computer gets, the more time it takes for the semiconductors to cool off. You may wonder how anything can cool down at all in a mere 1 billionth of a second (or less, in the case of processors running at 2 GHz, 3 GHz, or faster). The answer is that the semiconductors in your computer are extraordinarily small—so small that they're measured in nanometers. A five nanometer semiconductor wire is 5 millionths of a millimeter, or about 0.2 millionths of an inch, thick. It's that thin so that it can cool down quickly.
In addition to their small size, semiconductors are made from advanced materials so that they have a very sensitive transition point. That means they may conduct electricity (turn on) at exactly 150C and stop conducting electricity (turn off) at 149.99C, so they don't need to cool down very much to turn off. Combining small size with sensitive transition points is what lets the semiconductors in your computer processor perform a billion or more operations a second.
Unfortunately, your computer can't measure the temperature of each individual semiconductor directly—that would be too much work. Instead, your computer processor has a thermometer in one or two places which measures its general temperature. You can access this information in your BIOS boot-up screen or using applications for Windows, Mac OS X, and Linux, and it should say that your computer processor is running at less than 100C (212F). (Remember that it doesn't matter how cold your computer gets—cold is ok for any normal computer processor.)
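On Linux, one common place the kernel exposes this reading is the thermal sysfs interface, which reports temperatures in millidegrees Celsius. The exact path and zone numbering vary by machine, and the file may not exist at all, so treat this as a sketch rather than a universal recipe.

```python
from pathlib import Path

def parse_millidegrees(raw: str) -> float:
    """Linux thermal sysfs reports e.g. '42000\n' for 42.0 C."""
    return int(raw.strip()) / 1000.0

# Zone numbering and availability vary by machine.
zone = Path("/sys/class/thermal/thermal_zone0/temp")
if zone.exists():
    print("Processor zone temperature:", parse_millidegrees(zone.read_text()), "C")
else:
    print("No thermal zone found at", zone)
```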
And while you’re looking at the temperature, consider this: the thermometer in your computer (and any digital device) is itself a semiconductor. The computer runs a small amount of electricity into it; the hotter it gets, the more electricity gets through, so your computer just monitors how much electricity gets through to determine the temperature—all to help you see what temperature your computer should be running at.