Jakob Hansen asked:
I've been pondering the following: we are often told that we can gain large energy savings by switching off all those transformers and stand-by devices, such as televisions, computers, printers etc., in our homes that are using energy. Instead we should turn the devices off at the master switch or pull their plugs out of the sockets. However, when a device, e.g. a transformer, is using energy, that energy must be used for something, and it seems to me that in the end all of it will be converted into heat.
Now, if my home is heated solely by electricity (electric radiators etc.), would I still save anything by turning off all those stand-by devices? My point is that if the energy used by those devices is eventually turned into heat anyway, it must all contribute to heating my house. Does it make any difference whether that heating comes from devices designed for heating or from "second-hand" energy?
Thanks for a great show,
We posed this question to Cambridge University's David MacKay...
This myth is true for a few people, but only during the winter, and it's false for most. If your house is heated by electricity through ordinary bar fires or blower heaters, then yes, it's much the same as heating the house with any electricity-wasting appliance. But if you are in this situation, you should change the way you heat your house. Electricity is high-grade energy and heat is low-grade energy; it's a waste to turn electricity into heat.
If you're using electricity to heat the room, and the room is kept at a constant temperature, then it makes no difference; electrical heat is electrical heat.
As I understand it, electric devices all heat with essentially the same efficiency, except that, as wolfekeeper mentions, you may choose to let the house cool down somewhat at night or while you are at work. If the outside temperature reaches the inside temperature of the house during the day, your heating system would not be needed, but your electronic devices would still be consuming power. Incandescent light bulbs also generate heat, but if they are in the ceiling, they may not distribute that heat downward to where you want it.
There's an easy way to find out. If the devices in standby mode are appreciably warmer in standby than when unplugged, they are producing a bit of heat. If you can't tell the difference, it's negligible. Even if there is a bit of heat, don't expect a major drop in your electric bill from unplugging them all the time. Compare their heat output with a 40 watt light bulb - it should be much less.
Keep in mind, if an item is warmer than ambient temperature, but cooler than body temperature, it may still feel cool to the touch, but could still be radiating heat into the environment.
- I mostly agree with Geezer. Standby power isn't much, the devices are not very good at transferring heat, and in a cool room the rate of heat loss through the roof would be high.
1W of standby power does add up though.
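Putting rough numbers on that: a sketch of what a constant 1 W standby draw costs over a year. The electricity price used here (0.15 per kWh) is an assumed, illustrative tariff, not a figure from the discussion.

```python
# Rough annual energy and cost of small always-on standby loads.
# PRICE_PER_KWH is an assumed illustrative tariff, not real data.
HOURS_PER_YEAR = 24 * 365   # 8760 hours
PRICE_PER_KWH = 0.15        # assumed price per kWh

def annual_kwh(watts: float) -> float:
    """Energy drawn over a year by a constant load, in kWh."""
    return watts * HOURS_PER_YEAR / 1000.0

def annual_cost(watts: float) -> float:
    """Yearly cost of that constant load at the assumed tariff."""
    return annual_kwh(watts) * PRICE_PER_KWH

print(annual_kwh(1.0))    # 8.76 kWh per year for a single 1 W load
print(annual_cost(10.0))  # ten such devices together
```

So one 1 W device is under 9 kWh a year, but a house full of them starts to show up on the bill.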
Turning the question vaguely on its head: Wouldn't it be nice if all the world's electric space-heaters (or water heaters, I guess) were actually micro-servers contributing to the internet and its 'Cloud'! peppercorn, Wed, 25th Jan 2012
Turning them off will do very little to reduce the bill, and will more likely shorten the life of the power supplies inside, as the switch-on surges then occur far more often than the designer anticipated. Add the thermal effects of cycling, and you might be in a lose-lose situation. Remember the biggest users of power are resistive and motor loads (stove, geyser, room heating/air conditioning), then, a distant second, lighting, with intermittent loads (iron, hair dryer, television, washing machine) taking a variable amount of power, followed by standby and always-on loads.
If you ran a Stirling engine from the gas and used the rejected heat for room heating, you would get the best possible power generation, simply by making use of the reject heat. It would still use energy to do so, but at least it would not just dump that heat outside. Rather like running a greenhouse with hot water from a power plant, it is a good synergy. SeanB, Wed, 25th Jan 2012
Some great discussion here, thanks folks!
The only reason I am suggesting a Stirling engine over any other is that they can generally be used with small temperature differences between the hot and cold sides. Efficiency is poor, but you only need a source at the boiling point of water and a sink at ambient to get useful work out of them. If you used one at a large power station on the waste heat from the condenser, you would be able to power a large part of the emergency equipment from it; on a nuclear plant you would have a non-fuel method to run pumps until the residual heat has decayed, almost totally independent of any other power source, and almost walk-away safe, provided you have a large enough water source to use as coolant. SeanB, Fri, 27th Jan 2012
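To see why efficiency between boiling water and room temperature is "poor", a quick sketch of the Carnot limit for those temperatures. This is the theoretical upper bound; a real Stirling engine would do worse. The 20 °C ambient is an assumed value.

```python
# Carnot bound on efficiency between boiling water and an assumed
# 20 C ambient -- an upper limit no real Stirling engine reaches.
T_HOT = 373.15   # K, boiling point of water
T_COLD = 293.15  # K, roughly 20 C room temperature

def carnot_efficiency(t_hot: float, t_cold: float) -> float:
    """Maximum fraction of heat convertible to work between two
    reservoirs (temperatures in kelvin)."""
    return 1.0 - t_cold / t_hot

print(f"{carnot_efficiency(T_HOT, T_COLD):.1%}")  # prints 21.4%
```

So even in the ideal case only about a fifth of the heat can become work, which is why such an engine only makes sense when the "waste" heat would otherwise be thrown away anyway.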