Tuesday, July 17, 2007
Punk Rocker
The history of punk rock from the venerable Encyclopedia Britannica. This is part of why I am GenX and not a Baby Boomer.
Sunday, July 08, 2007
Versailles' Hall of Mirrors reopens
Versailles' Hall of Mirrors reopens to the public. The magnificent, historic room had been closed for an extensive three-year renovation.
entropy
"entropy." Webster's Third New International Dictionary, Unabridged. Merriam-Webster, 2002. http://unabridged.merriam-webster.com (20 Aug. 2007):
4 : the ultimate state reached in the degradation of the matter and energy of the universe : state of inert uniformity of component elements : absence of form, pattern, hierarchy, or differentiation <entropy is the general trend of the universe toward death and disorder -- J.R.Newman>
Merriam-Webster's Collegiate Dictionary:
entropy
Main Entry: en·tro·py
Pronunciation: \ˈen-trə-pē\
Function: noun
Inflected Form(s): plural -pies
Etymology: International Scientific Vocabulary 2en- + Greek tropē change, literally, turn, from trepein to turn
Date: 1875
1 : a measure of the unavailable energy in a closed thermodynamic system that is also usually considered to be a measure of the system's disorder, that is a property of the system's state, and that varies directly with any reversible change in heat in the system and inversely with the temperature of the system; broadly : the degree of disorder or uncertainty in a system
2 a : the degradation of the matter and energy in the universe to an ultimate state of inert uniformity b : a process of degradation or running down or a trend to disorder
3 : CHAOS, DISORGANIZATION, RANDOMNESS
- en·tro·pic \en-ˈtrō-pik, -ˈträ-pik\ adjective
- en·tro·pi·cal·ly \-pi-k(ə-)lē\ adverb
Source: The Free On-line Dictionary of Computing (2003-OCT-10)
entropy
A measure of the disorder of a system. Systems tend
to go from a state of order (low entropy) to a state of
maximum disorder (high entropy).
The entropy of a system is related to the amount of
information it contains. A highly ordered system can be
described using fewer bits of information than a disordered
one. For example, a string containing one million "0"s can be
described using run-length encoding as [("0", 1000000)]
whereas a string of random symbols (e.g. bits, or characters)
will be much harder, if not impossible, to compress in this
way.
Shannon's formula gives the entropy H(M) of a message M in
bits:
H(M) = -log2 p(M)
Where p(M) is the probability of message M.
(1998-11-23)
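The FOLDOC entry's two ideas can be sketched in a few lines of Python: run-length encoding collapses a highly ordered string into a tiny description, and Shannon's formula H(M) = -log2 p(M) gives the information content, in bits, of a message with probability p(M). The function names here are my own illustrative choices, not from any particular library.

```python
import itertools
import math


def message_entropy(p):
    """Information content of a message with probability p,
    in bits: H(M) = -log2 p(M)."""
    return -math.log2(p)


def run_length_encode(s):
    """Collapse runs of repeated symbols into (symbol, count) pairs."""
    return [(sym, len(list(run))) for sym, run in itertools.groupby(s)]


# A highly ordered string compresses to a single pair...
ordered = "0" * 1_000_000
print(run_length_encode(ordered))  # [('0', 1000000)]

# ...while a message that occurs with probability 1/256
# carries 8 bits of information.
print(message_entropy(1 / 256))  # 8.0
```

A string of random symbols would produce roughly one (symbol, count) pair per character, so run-length encoding would make it longer, not shorter, which is the dictionary's point about disorder resisting compression.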
Guess Who's Hiring in America
A recent Business Week article, Guess Who's Hiring in America, (in June 25, 2007 edition, pg 47) notes that Infosys and other Indian companies are cutting costs by hiring more locals in the U.S.
"The U.S. hiring by the Indians echoes the strategy Japan's auto industry devised after soaring levels of imports sparked political outcry in Washington in December, 2000. 'The Indians are doing to the world's IT processes what the Japanese did to manufacturing,' says analyst John McCarthy of Forrester Research Inc (FORR). And now, like Japan's carmakers before them, the Indians are becoming major employers in the U.S. as well."
Signs of a new trend starting up. In 1989 a young MBA told me that IBM was defunct as a company and would not exist in five years. In 2002 any number of writers reported that white-collar jobs were all going to be offshored. Sometimes we just don't know what the future will hold. We can always count on change to be painful and scary, with much collateral damage (by which I mean nice people suffer when they shouldn't have to), but we can't count on everything always being increasingly horrible.
Maybe usually. But not always.
We are in a state of entropy. Theoretically.
The Changing Reality of Outsourcing
On LinkedIn "Answers" a member recently brought up the changing reality of outsourcing with the question: Is the Indian outsourcing engineering and IT market now efficiently priced or overpriced?
"A journalist friend of mine asked me recently about this: he had met a valley company the other day that had laid off its engineers and refocused its engineering back to the valley. The logic given: salaries have risen so high there that the office and coordination are no longer worth the price. My intuition has been that, with the masses of companies seeking outsourcing to India, the pricing was going to reach market rates quickly. Not with facts, but with some reasoning and intuition, I have particularly thought that this would be true of small companies and small outfits very soon. I'm curious for other people's data, analysis, thoughts, and examples of other companies."
It is quite worth reading the responses for the variety and rapidly changing nature of what is going on in international labor markets, or global outsourcing and offshoring, as the current buzzwords go.
Some interesting points were:
- Salaries are growing at 10% to 25% annually in India for software engineers
- There is more demand than supply in India for software engineers
- It is cheaper to hire locally in the U.S. than to bring a person over on a visa
- Only large companies with very large projects are seeing the economic benefits
- Indian companies are outsourcing to other countries such as China to save money