One of the laws of physics (I can’t remember which) states that entropy always increases. Put another way, a local decrease in entropy must be paid for by a larger increase elsewhere. This is how refrigerators work: lowering an object’s temperature also lowers its entropy, but the heat given off by the coils at the back raises the entropy of the surroundings (by warming the air), so overall entropy goes up. In heat terms, the heat dissipated by the coils is greater than the heat removed from the fridge’s contents.
The refrigerator analogy works rather well for software. Any time a developer changes some code, its entropy (i.e. software entropy, which here can be read as ‘disorder’) tends to increase. To keep the code at the same level of entropy, the developer has to put in more effort than the change strictly requires, for example by adding extra tests or refactoring around the changed code. That extra effort takes energy (time plus brainpower). So to hold the code’s entropy steady while making changes, overall entropy must still increase – measured in this case by the extra mental effort warming the air around the developer’s head.
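A toy sketch of what that extra effort looks like in practice (entirely hypothetical code – the function, ticket numbers, and rate table are made up for illustration): each quick fix bolts on another special case and the disorder grows, while the refactored version spends a little extra effort up front so the next change stays cheap.

```python
# Entropy increasing: every change is another bolted-on special case.
def shipping_cost_v1(country):
    if country == "US":
        return 5
    elif country == "CA":   # added for ticket #2
        return 8
    elif country == "MX":   # added for ticket #3
        return 8
    elif country == "GB":   # added for ticket #4
        return 12
    return 20               # fallback rate

# The "extra effort" spent alongside a change: refactor the accumulated
# cases into data, so future changes are one-line table edits.
RATES = {"US": 5, "CA": 8, "MX": 8, "GB": 12}

def shipping_cost_v2(country, default=20):
    return RATES.get(country, default)
```

Both versions behave the same today; the difference is what the *next* developer has to do to add a country.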
It’s the second law of thermodynamics, as far as I remember, and it’s also one of my favourites for programming. Even if it’s impossible to reduce overall entropy, refactoring code should at least reduce local entropy, which goes in the direction of KISS…