Question of the Week

Would turning off standby devices increase my heating bills?

Sun, 29th Jan 2012

From the show: Are any viruses good for you?


Jakob Hansen asked:



I've been pondering the following: we are often told that we can gain large energy savings by shutting off all those transformers and stand-by devices, such as televisions, computers, printers etc., in our homes that are using energy. Instead we should turn off the devices at the master switch or pull the plugs out of their sockets. However, when a device, e.g. a transformer, is using energy, that energy must be used for something, and it seems to me that in the end all that energy will be converted into heat.


Now, if my home is heated solely by electricity, electric radiators etc., would I still have any savings by turning off all those stand-by devices? My point is that if the energy used by those devices is eventually turned into heat anyway, it must all contribute to heating my house. Does it make any difference whether that heating comes from devices designed for heating or if it comes from "second hand" energy?


Thanks for a great show,



We posed this question to Cambridge University's David MacKay...

This myth is true for a few people, but only during the winter, and it's false for most.  If your house is heated by electricity through ordinary bar fires or blower heaters then yes, it's much the same as heating the house with any electric heater or with electricity-wasting appliances.  But if you are in this situation, you should change the way you heat your house.  Electricity is high grade energy and heat is low grade energy.  It's a waste to turn electricity into heat.

To be precise, if you make only one unit of heat from a unit of electricity, then that's a waste.  Heat pumps, so-called air source heat pumps or ground source heat pumps, can do much better, delivering 3 or 4 units of heat for every unit of electricity consumed.  They work like a back to front refrigerator, pumping heat into your house from the outside air.

For the rest of us, whose homes are heated by fossil fuels or by biofuels, it's a good idea to avoid using electrical gadgets as a heat source for your home, at least for as long as our increases in electricity demand are served by fossil fuels.  It's better to burn the fossil fuel at home.  The point is, if you use electricity from an ordinary fossil power station, more than half of the energy from the fossil fuel sadly goes up the cooling tower.  Of the energy that gets turned into electricity, about 8% is lost in the transmission system.  If, however, you burn that fossil fuel in your home, then more of the energy goes directly into making hot air for you.
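MacKay's fuel-to-heat comparison can be sketched with some rough numbers. The ~8% transmission loss and the heat pump's 3-4 units of heat per unit of electricity come from his answer; the 40% power-station efficiency and 90% home-boiler efficiency are illustrative assumptions, not figures from the text.

```python
# Rough comparison of heat delivered per unit of fuel energy.
# Assumed: 40% power-station efficiency, 90% home boiler.
# From the text: ~8% transmission loss, heat pump COP of 3-4.

FUEL = 1.0                     # one unit of fuel energy at the power station

station_efficiency = 0.40      # assumption: over half goes up the cooling tower
transmission_loss = 0.08       # from the answer: ~8% lost in the grid
electricity = FUEL * station_efficiency * (1 - transmission_loss)

resistive_heat = electricity * 1.0   # 1 unit of heat per unit of electricity
heat_pump_heat = electricity * 3.0   # COP of 3 (the answer says 3 or 4)
boiler_heat = FUEL * 0.90            # assumption: burn the fuel at home instead

print(f"resistive: {resistive_heat:.2f}")   # ~0.37 units of heat
print(f"heat pump: {heat_pump_heat:.2f}")   # ~1.10 units of heat
print(f"home boiler: {boiler_heat:.2f}")    # 0.90 units of heat
```

On these assumptions, resistive electric heating delivers far less heat per unit of fuel than burning the fuel at home, while a heat pump can beat both.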






If you're using electricity to heat the room, and the room is kept at a constant temperature, then it makes no difference; electrical heat is electrical heat.

But if the room is allowed to cool sometimes but the device is still generating heat, then obviously that's likely to be a waste.

There is a trend towards insulating houses very well. The waste heat generated by your normal appliances, cooking, washing etc. then contributes quite a bit towards heating your house. If passive solar heating is used as well, and materials with high heat capacity help even out the daily temperature variations, you may not even need a heating system; maybe just some additional portable heaters when the weather is exceptionally cold. wolfekeeper, Mon, 23rd Jan 2012

As I understand it, electric devices all heat with essentially the same efficiency, except as wolfekeeper mentions that you may choose to let the house cool down somewhat at night, or while you are at work.  If the outside temperature reaches the inside temperature of the house during the day, your heating system would not be needed, but your electronic devices would still be consuming power.  Incandescent light bulbs will also generate heat, but if they are in the ceiling, they may not distribute that heat downward to where you want it.

If you have a heat pump, then it is more efficient than other electric appliances, provided it has access to a heat exchange source at a moderate temperature, and thus your electronic devices would be less efficient at heating.

Obviously excess heat is not needed, and often not wanted during the summer. CliffordK, Mon, 23rd Jan 2012

There's an easy way to find out. If the devices that are in standby mode are appreciably warmer when in standby than when unplugged, they are producing a bit of heat. If you can't tell the difference, it's negligible. Even if there is a bit of heat, don't expect to see a major drop in your electric bill if you unplug them all the time. Compare their heat output with a 40 watt light bulb - it should be much less.

Transformers don't draw power if they are not supplying power to a load - at least, not in theory. In practice they are not 100% efficient, so they do get warm on "no-load". If they are really hot, they are either defective, or not very good transformers.

However, you won't find a lot of traditional large transformers in power supplies these days. They have been largely replaced by small high frequency versions. Geezer, Mon, 23rd Jan 2012

Keep in mind, if an item is warmer than ambient temperature, but cooler than body temperature, it may still feel cool to the touch, but could still be radiating heat into the environment.

Metals with high heat conduction often feel colder than they actually are. CliffordK, Mon, 23rd Jan 2012

- I agree with Geezer mostly. Standby power isn't much, the devices are not very good at transferring heat, and in a cool room the rate of heat transfer out through the roof would be high.

LOW POWER - my gut feeling is that vampire power is nowhere near as bad in practice as some old reports say. Energy Star ratings mean most products consume minimal electricity on standby; less than a watt. You can check by unplugging them all in the middle of the day for 15 mins and seeing how much difference it makes to the kWh usage on your meter (when the fridge, freezer, and electric hot water etc. are not kicking in). An hour of a small tumble drier is like having a 40W light on for a day, which is like having a few things on standby for a month. Small things can add up, but they are nothing like a few big things added together.
POOR HEAT TRANSFER - I think there is a complication in comparing the modes of heat transfer; there must be a reason why heaters are not designed as scaled up versions of a phone charger, i.e. a transformer encapsulated in thick plastic. Remember a heater transfers heat by conduction, convection, and radiation. If you sit in front of a coal fire you'll receive direct infrared radiation, convection will transfer some of the heat from the heated air to your lungs etc., and perhaps there will be conduction of heat from the fire surround and walls etc., which will also end up as convected heating of the air. I would have thought most of the heat from standby devices is generated in the transformer and would be more wasteful than an electric heater because it wouldn't transfer heat to the human body very well. When you get a large density of computers in a small place, like the old server farms, then it definitely is enough to heat the air up and heat the whole room.
RAPID RATE OF HEAT TRANSFER OUT - I would have thought most UK buildings are still incredibly badly insulated; if they were well insulated you'd need very little heat at all, as once heated they'd stay warm, so the talk of heat from lightbulbs going up and out the roof seems right. There would be something to do with the rate of heat loss of the room as well: in a hot room there comes a point where heat cannot be expelled much faster, but in a room with no other devices I would think the rate of heat transfer from the standby device into the air is just about the same as the rate of heat transfer out of the air through the roof.
- that's all off the top of my head; I didn't crunch the numbers. stewgreen, Tue, 24th Jan 2012
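The tumble-drier comparison in the post above can be sanity-checked. The 40 W bulb and the "few things on standby" are from the post; the 1 kW drier rating and 1 W per standby device are assumptions:

```python
# Sanity-checking the post's comparison, with assumed ratings:
# a "small tumble drier" taken as 1 kW, standby devices as 1 W each.

drier_kwh = 1.0 * 1            # assumed 1 kW drier, run for 1 hour
bulb_kwh = 0.040 * 24          # a 40 W bulb left on for a day
standby_kwh = 0.001 * 24 * 30  # one 1 W standby device for a month

print(drier_kwh)    # 1.0 kWh
print(bulb_kwh)     # 0.96 kWh -- about the same as the drier hour
print(standby_kwh)  # 0.72 kWh -- so a "few" such devices ~ one month
```

On those assumptions the post's ratios hold up: one drier hour ≈ one bulb-day ≈ a handful of standby devices for a month.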

1W of standby power does add up though.

TV (how many)
DVD Player
Digital-Analog Converter or Cable Appliance
Radio (not counting clock-radios)
Do you leave your laser printer on standby...  that is probably more than 1W.
Computer (again, more than 1W)
Even if you shut off the computer, most of the new switches are electronic, not the old toggle switches on the back... so a bit of background power consumption.
DSL or Cable Modem
Computer Routers/Switches
I found I had a dimmer switch in my house that apparently went to low-dim instead of off.
Cordless Phone (yes...  it is nice, but uses more power than the old rotary phones).
A few wall warts for stuff that is turned off, near zero power, but not truly zero.

Some people have their TVs on for background noise when a radio would be more appropriate.

So, just counting up, even at 1W per appliance on standby, one gets a minimum of about 15W of standby power, used all day long...  and a lot more if you don't turn off your computer & laser printers, or leave the TV on all day long.

Maybe I need to get another kill-a-watt, or make a power loop for my inductance ammeter, and check individual appliances a bit more.

Of course, if one is in one room of the house...  turn the lights off in the other rooms!!! CliffordK, Tue, 24th Jan 2012
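To put a yearly figure on the ~15 W tally above (the 15 W comes from the post; the electricity price is an assumed value):

```python
# Annual cost of a constant ~15 W of standby load.
# 15 W is from the post; the $/kWh price is an assumption.

standby_watts = 15
hours_per_year = 24 * 365
kwh_per_year = standby_watts * hours_per_year / 1000
price_per_kwh = 0.12           # assumed price, $/kWh

print(kwh_per_year)                             # 131.4 kWh per year
print(round(kwh_per_year * price_per_kwh, 2))   # ~$15.77 per year
```

So the standby total is real but modest: roughly a hundred kWh a year, small next to any electric heating load.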

Compared to a radiant heater that is true, and probably also when compared with a fan assisted heater. Compared to a heater that heats up the air in the room by natural convection, it won't make much difference. The heat still has to get out of the device, and almost all of that is by conduction to the air (which will result in some convection).

If none of the heat could escape from the device it would ultimately overheat, and something would probably ignite. Geezer, Tue, 24th Jan 2012

Turning the question vaguely on its head: Wouldn't it be nice if all the world's electric space-heaters (or water heaters, I guess) were actually micro-servers contributing to the internet and its 'Cloud'! peppercorn, Wed, 25th Jan 2012

Turning them off will do very little to reduce the bill, and will more likely shorten the life of the power supplies inside, as the switch-on surges are then a lot more frequent than the designer anticipated. Consider thermal effects from cycling as well, and you might be in a lose-lose situation. Remember the biggest user of power is resistive/motor heating (stove, geyser, room heating/air conditioning) and then, a far second, lighting, with intermittent loads (iron, hair dryer, television, washing machine) taking a variable amount of power, followed by standby loads and always-on loads.

Biggest saving is insulating heaters, using gas for cooking and keeping fridge coils clean along with the door seals. Turning off standby loads will be lost in the noise of your varying use of appliances. SeanB, Wed, 25th Jan 2012

Your cloud computing certainly would suffer with people turning their space heaters on and off all the time!!!!!!  And, then what about during the summer?

I was going to set up a personal server at home using an older Dell server...  What a racket it made!!!!!

There was a discussion elsewhere about multiple use out of gas...  running a heat pump or generator with essentially the "free power" from the burning of the gas for heating. CliffordK, Wed, 25th Jan 2012

If you ran a Stirling engine from the gas and used the rejected heat for room heating you would get the best possible power generation, just from making use of the reject heat. It would still use energy to do so, but would at least not just dump it to the outside. Rather like running a greenhouse with hot water from a power plant; it is a good synergy. SeanB, Wed, 25th Jan 2012

Where are you putting the Stirling engine?
An internal combustion engine would use the expansion energy.
The Stirling engine could follow the internal combustion engine, perhaps sitting between the cold air return and the hot air circulation system.

One of the issues with current furnaces/heat exchangers is that they use heat to drive the exhaust up the chimney.  A fan, perhaps an intake fan, would be much more appropriate.  However, using an internal combustion engine would also force the movement of gases up the chimney, so one could essentially recover 100% of the heat out of the flue gas and still expel it up the chimney.  In fact, if one then fed the flue gas into a heat pump, one could actually release flue gas at colder than room temperature.

I can envision a pretty complicated system for a commercial installation, but if done right, one might get back a couple times the amount of energy that one would otherwise get from just burning the fuel to generate heat. CliffordK, Wed, 25th Jan 2012

Not without breaking the second law of thermodynamics you won't.

The energy in the fuel is the amount of heat it can produce. Geezer, Thu, 26th Jan 2012

I suppose you're right.
If I get a 1 gph generator:

It burns diesel at about 129,500 BTU per hour, or 37.95 kW.

At 75% load, it generates about 10 kW of electricity.

So, where does that energy come from?  Heat?

If I burn the diesel in open air, do I get 38 kW of heat?
And burning it in the engine, I only get 28 kW of heat and 10 kW of electricity (including exhaust heat and engine cooling)?

I suppose that must be true.  I just never thought about it that way. 

So, the advantage would be then to run a heat pump.
28 kW heat + 10 kW --> heat pump --> more heat, assuming a good source of low quality thermal energy.  One can also extract much more heat from the exhaust than normally would be done by using the heat pump. CliffordK, Thu, 26th Jan 2012
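The generator's energy balance can be checked directly. The 129,500 BTU/h input and ~10 kW electrical output are from the post; the heat pump COP of 3 is an assumption:

```python
# Energy balance for the 1 gal/h diesel generator example.
# From the post: 129,500 BTU/h fuel input, ~10 kW electrical output.

BTU_PER_KWH = 3412.14

fuel_kw = 129_500 / BTU_PER_KWH     # ~37.95 kW of fuel energy
electric_kw = 10.0                  # from the post, at ~75% load
heat_kw = fuel_kw - electric_kw     # the rest appears as heat somewhere

print(round(fuel_kw, 2))            # ~37.95
print(round(heat_kw, 2))            # ~27.95 -- matches the "about 28 kW" figure

# With an assumed COP-3 heat pump driven by the 10 kW of electricity:
heat_pump_kw = electric_kw * 3.0
print(round(heat_kw + heat_pump_kw, 2))   # ~57.95 kW of total heat --
                                          # more than the 37.95 kW of fuel input,
                                          # because heat is pumped in from outside
```

The total heat exceeding the fuel input does not break the second law: the extra comes from low-grade heat moved indoors from the environment, not from the fuel.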

That's it -

Energy In = Work Out + Heat Out

The efficiency of an IC engine is the ratio of Work Out to Energy In.

Strangely enough, if you buy gasoline and use it to heat your home, it will cost you about the same as heating your home with electric resistive heating, although you will do much better if you use the electricity to do work in a heat pump instead. Geezer, Thu, 26th Jan 2012

Some great discussion here, thanks folks!

The other thing to consider is what happens in summer - if you rely on air conditioning rather than just opening windows, will you be paying extra to get rid of the heat produced by these devices? For how much of the year do we need the heating on? BRValsler, Thu, 26th Jan 2012

Even if you don't use AC, you don't need the extra heat in the summer, and perhaps not even in the spring and fall.
I'm sorry if I'm getting a bit off topic.  But, I'm still thinking about efficiency 

I am still a bit mixed on this...  Replying to Geezer's reminder of no free lunch.

Logically trying to capture "free energy" doesn't work.

Yet, if you burn a fuel outside of an internal combustion engine, the gases will still expand.  One just doesn't get work back from the thermal expansion.  So, I'm not convinced that the internal combustion engine actually decreases energy from the system.

Likewise, I suggested situating a Stirling engine between the cold air return and the hot air out.  I would imagine that it would in effect cool the hot air for the system.  But it would also warm up the cold air return, so in effect one might not lose a significant amount of energy.  Perhaps the effect of warming the cold air return would be raising the temperature of the flue gases, but that would also be dependent on system design.

Now, say one is building a steam radiator system.

Inserting a steam turbine between the boiler and the steam radiators would mean that one would have to raise the temperature and pressures in the boiler, thus using more energy in the boiler, so that part is more costly.  However, one would be able to efficiently utilize the waste heat left over from the turbine, so perhaps all isn't lost.

Is raising the pressures in the internal combustion engine similar to what one finds with increasing pressures by adding a steam turbine?

It just seems strange.
Perhaps I could simulate the system on a small scale sometime. CliffordK, Thu, 26th Jan 2012

It's more expensive to get rid of waste heat.

When you use a heat pump to heat a building, you can take advantage of any inefficiency and use that heat just like resistive heating. (I'm not sure they really work that way, but they should.)

When you use a heat pump to cool a building, you have to dissipate the heat produced by any inefficiencies in the system to the atmosphere. Geezer, Fri, 27th Jan 2012

The only reason I am suggesting a stirling engine over any other is that they generally can be used with low differences between hot and cold sides. Poor efficiency, but you only need a source at boiling point of water and a sink at ambient to get useful work out of them. If you used one at a large power station on the waste heat from the condenser you would be able to power a large part of the emergency equipment from it, or on a nuclear plant you would have a non fuel method to run pumps until the residual heat has decayed, almost totally independent of any other power source, and almost walk away safe providing you have a large enough water source to use as coolant. SeanB, Fri, 27th Jan 2012

