The Naked Scientists

The Naked Scientists Forum

Author Topic: Does it take more energy to cool something than heat it?  (Read 12875 times)

Brian Holden

  • Guest
Brian Holden  asked the Naked Scientists:
   
Dear Chris,

Thank you for your part in an excellent site,

My questions are,

Does it take more total energy to cool, say, a house or a fridge, by one degree than it would take to heat it by the same one degree?

I can see that there would be efficiency questions, and questions about how the energy is produced, that would affect the outcomes in the 'normal world'; but at a theoretical, all-other-things-being-equal level, is there some truth to my 'intuition' that cooling takes more energy?

Regards Brian

What do you think?
« Last Edit: 11/03/2011 02:30:03 by _system »


 

Offline syhprum

Does it take more energy to cool something than heat it?
« Reply #1 on: 11/03/2011 08:02:59 »
To first order, changing the temperature of a body by one degree requires the same amount of energy whether you are heating it or cooling it, although in the real world heating equipment is often much more efficient than cooling equipment.
On a more theoretical level, the thermal capacity of materials varies somewhat with temperature, for various complex reasons.
Take a look at this article.

http://en.wikipedia.org/wiki/Heat_capacity

« Last Edit: 11/03/2011 10:10:13 by syhprum »
 

Offline yor_on

Does it take more energy to cool something than heat it?
« Reply #2 on: 11/03/2011 23:17:35 »
"The answer a typical physicist gives to the question, "what is the highest possible temperature?" will depend on their implicit opinion of the completeness of our current set of physical theories. Temperature is a function of the motion of particles. If the speed of light is the universal speed limit, then a gas of maximum temperature may be defined as a gas whose atomic constituents are each moving at the speed of light. The problem is that attaining the speed of light in this universe is impossible; light speed is a quantity that may only be approached asymptotically. The more energy you put into a particle, the closer it gets to moving at light speed, though it never fully reaches it.

At least one scientist has proposed defining the maximum possible temperature as what we would get if we took all the energy in the universe and put it into accelerating the lightest possible particle we could find as closely as possible to the speed of light. If this is true, then discoveries about elementary particles and the size/density of the universe could be relevant to discovering the correct answer to the question this article addresses. If the universe is infinite, there may be no formally defined limit to the maximum possible temperature.

Even though infinite temperature may be possible, it might be impossible to observe, therefore making it irrelevant. Under Einstein's theory of relativity, an object accelerated close to the speed of light gains a tremendous amount of mass. That is why no amount of energy can suffice to accelerate any object, even an elementary particle, to the speed of light - it becomes infinitely massive at the limit."

(It continues: "If a particle is accelerated to a certain velocity near that of light, it gains enough mass to collapse into a black hole, making it impossible for observers to make statements about its velocity. That is why the Planck temperature is often referred to as the maximum possible temperature"... But that is wrong, by the way, as far as I know. A black hole can come to be in one way, and one way alone: compression. The 'massiveness' referred to above is the 'relativistic mass', or 'momentum', of the particle, not a growth of its 'invariant mass' ('real mass', sort of).)

So does a black hole radiate?
No; if we ignore its possible Hawking radiation, a black hole should be very cold. Why? Because all radiation has only 'one way' to go: towards its centre. And how did it reach that 'absolute coldness'? By compression turning all matter into 'radiation', and then 'energy'.

So how much 'energy' does it take?
Absolute 'cold', and absolute 'heat'?

Look at a black hole :)
« Last Edit: 11/03/2011 23:56:33 by yor_on »
 

Offline yor_on

Does it take more energy to cool something than heat it?
« Reply #3 on: 11/03/2011 23:43:22 »
So now you might wonder how the LHC is thought to be able to produce those 'mini black holes'. Don't they use accelerators to shoot particles very, very fast?

"If the centre-of-mass energy of two elementary particles is indeed higher than the Planck scale E_D, and their impact parameter b is lower than the Schwarzschild radius R_H, a black hole must be produced. If the Planck scale is thus in the TeV range, the 14 TeV centre-of-mass energy of the Large Hadron Collider (LHC) could allow it to become a black-hole factory with a production rate as high as about one per second. Many studies are underway to make a precise evaluation of the cross-section for the creation of black holes via parton collisions, but it appears that the naive geometric approximation σ ~ πR_H² is quite reasonable for setting the orders of magnitude."

But all of this is highly theoretical, and we are now speaking of extra dimensions, as well as of using the 'energy' released in the collision. Energy is what made that black hole too, as matter was compressed past its invariant mass's ability to hold out against the 'pressure', collapsing into radiation and 'energy'.
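The 'naive geometric approximation' quoted above is just the area of a disc of Schwarzschild radius, σ ~ πR_H². As a minimal sketch (the radius value below is purely hypothetical and illustrative, not a real prediction; actual TeV-scale estimates depend on the assumed number of extra dimensions):

```python
import math

# Naive geometric cross-section for black-hole production via parton
# collisions, sigma ~ pi * R_H^2, as in the quote above.

def geometric_cross_section(schwarzschild_radius_m):
    """Area of a disc of radius R_H, in square metres."""
    return math.pi * schwarzschild_radius_m ** 2

r_h = 1e-19  # hypothetical horizon radius in metres (illustrative only)
print(geometric_cross_section(r_h))  # ~3.14e-38 m^2
```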

The case for mini black holes.

And, as I said, highly theoretical. No black holes, but extra time at LHC.
« Last Edit: 11/03/2011 23:53:37 by yor_on »
 

Offline techmind

Does it take more energy to cool something than heat it?
« Reply #4 on: 13/03/2011 12:34:55 »
Brian Holden  asked the Naked Scientists:
...
Does it take more total energy to cool, say, a house or a fridge, by one degree than it would take to heat it by the same one degree?
...

To answer how much energy is required to cool something requires you to also specify the temperature of the surroundings of the thing you want to cool.

If we are considering real-world scenarios such as houses or refrigerators, we also need to know something about the heat loss/gain (the thermal insulation) of the thing we're trying to heat/cool. If the thermal insulation is imperfect (there's heat leakage), then it takes energy just to maintain the temperature at a constant level, even before you consider wanting to increase/decrease it...


If we assume perfect insulation, then the temperature of the fridge/room/whatever will remain constant unless we do "something".

The simplest way to heat something (raise its temperature) is to put an electric heating element inside it, and apply some power. 1 watt = 1 joule per second (10 watts = 10 joules per second, etc.). If the heat capacity of the material is 1 kJ/kg/K (most things other than water are roughly that; water is about 4.2 kJ/kg/K), then if you've got 1 kg of 'stuff' it'll take 1 kJ of energy (e.g. a 20 W heater running for 50 seconds) to raise the temperature by one degree... 2 kJ for 2 degrees, etc...
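The arithmetic above (Q = m·c·ΔT for the energy, then t = Q/P for the heater time) can be sketched as follows; the function names are mine, and the numbers are just the post's illustrative ones:

```python
# Energy needed to raise the temperature of a mass by a given amount,
# assuming perfect insulation and a resistive (100%-efficient) heater.

def heating_energy_joules(mass_kg, specific_heat_j_per_kg_k, delta_t_k):
    """Q = m * c * dT"""
    return mass_kg * specific_heat_j_per_kg_k * delta_t_k

def heating_time_seconds(energy_j, heater_power_w):
    """t = Q / P, for a heater delivering all its power as heat."""
    return energy_j / heater_power_w

q = heating_energy_joules(1.0, 1000.0, 1.0)  # 1 kg, 1 kJ/kg/K, 1 K
t = heating_time_seconds(q, 20.0)            # 20 W heater
print(q, t)  # 1000.0 J, 50.0 s
```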

(but this isn't the only, nor necessarily the best, way to heat things)


To cool things below the temperature of their surroundings you need to use what is generically known as a heat pump. This literally uses energy to pump (heat) energy from one place to another. For small temperature differences this can be reasonably efficient, but if you're trying to cool something which is already much cooler than the surroundings, this becomes inherently inefficient. Worse still, practical heat pumps are often markedly worse than the already-poor theoretical best efficiency...
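The inherent inefficiency at large temperature differences follows from the theoretical best (Carnot) coefficient of performance for a cooler, COP = T_cold/(T_hot − T_cold) with temperatures in kelvin. A minimal sketch with illustrative temperatures:

```python
# Carnot (theoretical best) coefficient of performance for a heat pump
# used as a cooler. Real machines do markedly worse, as the post notes.

def carnot_cop_cooling(t_cold_k, t_hot_k):
    """COP = T_cold / (T_hot - T_cold), temperatures in kelvin."""
    return t_cold_k / (t_hot_k - t_cold_k)

# Small temperature difference: quite efficient in principle.
print(carnot_cop_cooling(278.0, 298.0))  # fridge at 5 degC, room at 25 degC -> ~13.9

# Large temperature difference: inherently much worse.
print(carnot_cop_cooling(78.0, 298.0))   # near liquid-nitrogen temperature -> ~0.35
```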


Note, however, that a heat pump can also be used "in reverse" to heat things. In principle this will never be less efficient than a simple heater (though it is markedly more complicated). It can be much more efficient than a heater when you have a large 'reservoir' of heat energy that is not too much cooler (e.g. a few tens of degrees) than the thing you're trying to heat.


So... there isn't a simple answer to the original question!
« Last Edit: 13/03/2011 12:38:54 by techmind »
 

Offline Geezer

Does it take more energy to cool something than heat it?
« Reply #5 on: 14/03/2011 04:25:33 »
Theory says that cooling has to consume more energy.

It is possible to heat a substance using an electric heating element without losing any of the energy applied to the element. Cooling, on the other hand, requires some sort of heat pump, and the efficiency of a heat pump is limited according to Carnot's theorem.
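As a sketch of what Carnot's theorem implies: the minimum work needed to pump a quantity of heat Q out of a cold space at T_cold into surroundings at T_hot is W = Q·(T_hot − T_cold)/T_cold, and real refrigerators need several times this. The temperatures below are illustrative:

```python
# Carnot minimum work to pump heat Q out of a cold space at t_cold (K)
# into surroundings at t_hot (K). A resistive heater, by contrast,
# delivers 100% of its input as heat with no such limit to worry about.

def min_cooling_work_j(heat_removed_j, t_cold_k, t_hot_k):
    """W = Q * (T_hot - T_cold) / T_cold, all temperatures in kelvin."""
    return heat_removed_j * (t_hot_k - t_cold_k) / t_cold_k

# Removing 1 kJ from a fridge at 5 degC into a 25 degC kitchen:
print(min_cooling_work_j(1000.0, 278.0, 298.0))  # ~71.9 J minimum
```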
 
