I will dig out the reference for this, but Leonard Susskind is quite insistent, on two occasions that I have seen, that taking a computer hard drive full of information and returning it to its initial "unwritten" form will, through the change in entropy (i.e. apart from magnetic, electric, etc. considerations), require a release of heat (an incredibly small amount) from the computer. Clearly, if this is the case (and I realise I have not yet convinced you), then the opposite must apply: to write information onto the drive requires an input of energy, and thus an increase in mass. Basically, the complication and non-randomness on the hard disc when it is written must be maintained, since information is always conserved; that information can only be preserved by a change in the environment.
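For scale, here is a quick back-of-the-envelope calculation in Python, assuming Landauer's limit (the minimum heat released per erased bit is kT ln 2) and a hypothetical 1 TB drive. The figures are just to show how small the heat release really is; nothing here comes from Susskind's talks themselves.

    import math

    k_B = 1.380649e-23          # Boltzmann constant, J/K
    T = 300.0                   # assume room temperature, K

    # Landauer's limit: minimum heat released when one bit is erased
    heat_per_bit = k_B * T * math.log(2)

    # Hypothetical 1 TB drive = 8e12 bits, erased at the theoretical minimum
    total_heat = heat_per_bit * 8e12

    print(f"heat per bit: {heat_per_bit:.3e} J")   # ~2.9e-21 J
    print(f"whole drive:  {total_heat:.3e} J")     # ~2.3e-8 J, utterly negligible

At room temperature the whole drive comes to a few hundredths of a microjoule, which squares with the "incredibly small amounts" above.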
... It may be a continuous string of zeros, but it's still information. If that's true, then when we write different information onto the disk, the net change in entropy is zero.
Entropy is a measure of disorder, or, more precisely, of unpredictability. For example, a series of coin tosses with a fair coin has maximum entropy (one bit per toss), since there is no way to predict what will come next. A string of coin tosses with a two-headed coin has zero entropy, since the coin will always come up heads. Most collections of data in the real world lie somewhere in between.
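To make that concrete, here is a minimal sketch in Python that computes the Shannon entropy, in bits per symbol, of a string of coin-toss outcomes (shannon_entropy is just an illustrative name, not from any of the posts above):

    from collections import Counter
    import math

    def shannon_entropy(seq):
        """Shannon entropy, in bits per symbol, of a sequence of outcomes."""
        counts = Counter(seq)
        n = len(seq)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    print(shannon_entropy("HTHHTTHT"))  # fair coin, 4 H and 4 T: exactly 1 bit
    print(shannon_entropy("HHHHHHHH"))  # two-headed coin: exactly 0 bits

Any real data set, with some symbols more common than others, lands between those two extremes.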