Sunday, July 08, 2007

entropy

entropy

"entropy." Webster's Third New International Dictionary, Unabridged. Merriam-Webster, 2002. http://unabridged.merriam-webster.com (20 Aug. 2007):

4 : the ultimate state reached in the degradation of the matter and energy of the universe : state of inert uniformity of component elements : absence of form, pattern, hierarchy, or differentiation <... entropy -- David Bidney> <entropy is the general trend of the universe toward death and disorder -- J.R.Newman>



Merriam-Webster's Collegiate Dictionary:

entropy

Main Entry: en·tro·py
Pronunciation: \ˈen-trə-pē\
Function: noun
Inflected Form(s): plural -pies
Etymology: International Scientific Vocabulary 2en- + Greek tropē change, literally, turn, from trepein to turn
Date: 1875
1 : a measure of the unavailable energy in a closed thermodynamic system that is also usually considered to be a measure of the system's disorder, that is a property of the system's state, and that varies directly with any reversible change in heat in the system and inversely with the temperature of the system; broadly : the degree of disorder or uncertainty in a system
2 a : the degradation of the matter and energy in the universe to an ultimate state of inert uniformity b : a process of degradation or running down or a trend to disorder
3 : CHAOS, DISORGANIZATION, RANDOMNESS
- en·tro·pic \en-ˈtrō-pik, -ˈträ-pik\ adjective
- en·tro·pi·cal·ly \-pi-k(ə-)lē\ adverb
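
A note on sense 1 (my paraphrase, not part of the dictionary entry): the clause "varies directly with any reversible change in heat in the system and inversely with the temperature" is the classical Clausius definition, which in symbols reads

dS = dQrev / T

where dS is the change in the system's entropy, dQrev is the heat added to the system reversibly, and T is its absolute temperature.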



Source: The Free On-line Dictionary of Computing (2003-OCT-10)

entropy

A measure of the disorder of a system. Systems tend
to go from a state of order (low entropy) to a state of
maximum disorder (high entropy).

The entropy of a system is related to the amount of
information it contains. A highly ordered system can be
described using fewer bits of information than a disordered
one. For example, a string containing one million "0"s can be
described using run-length encoding as [("0", 1000000)]
whereas a string of random symbols (e.g. bits, or characters)
will be much harder, if not impossible, to compress in this
way.
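
To make the compression argument concrete, here is a minimal Python sketch of run-length encoding as the entry describes it (the function name rle_encode is mine, not FOLDOC's):

import random

def rle_encode(s):
    # Encode a string as a list of (symbol, run-length) pairs.
    pairs = []
    for ch in s:
        if pairs and pairs[-1][0] == ch:
            pairs[-1] = (ch, pairs[-1][1] + 1)
        else:
            pairs.append((ch, 1))
    return pairs

print(rle_encode("0" * 1000000))  # [('0', 1000000)] -- one pair describes a million symbols

noise = "".join(random.choice("01") for _ in range(16))
print(rle_encode(noise))          # many short runs -- the encoding saves little or nothing

The ordered string collapses to a single pair, while the random string produces roughly one pair per run of identical symbols, which is why this scheme gains nothing on disordered input.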

Shannon's formula gives the entropy H(M) of a message M in
bits:

H(M) = -log2 p(M)

where p(M) is the probability of message M.
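
For example (my illustration, not FOLDOC's): a message with probability 1/2, such as the outcome of one fair coin flip, carries -log2(1/2) = 1 bit. Strictly, this formula gives the information content of the single message M; the entropy of a source is the average of this quantity over all its messages. In Python (the function name self_information is mine):

import math

def self_information(p):
    # Bits of information, per Shannon's formula above, in a message of probability p.
    return -math.log2(p)

print(self_information(0.5))     # 1.0  -- one fair coin flip
print(self_information(2**-20))  # 20.0 -- a specific sequence of 20 fair flips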

(1998-11-23)
