20th Century Technology -
Both Hero And Villain
http://www.insidechina.com
8-20-99
 
PARIS (AFP) - White advanced a pawn to c4. Black threw up his hands and resigned. It was the end of an era. Man had taken on the microchip and lost.
 
The man in question, world chess champion Garry Kasparov, stomped off in a fit of very human pique.
 
The microchip, embedded in an IBM computer called Deep Blue, refrained from histrionics. Its victory, the triumph of artificial intelligence over grey matter, marked the day -- May 11, 1997 -- that information technology came of age.
 
Deep Blue, we were reminded, couldn't write like Shakespeare, dance like Nureyev or act like Olivier. Small consolation. Frankenstein had finally been superseded by his monster.
 
The pace of technological change in this century has been breathtaking.
 
When Max Planck was sitting at his kitchen table in Berlin in 1900, putting the finishing touches to his epoch-making quantum theory, the only computer in the house was himself.
 
As he turned round to reach for a snack, there was no refrigerator to open, or any other electrical appliance for that matter.
 
There were no airplanes overhead, no cars on the streets. The thought processes which inspired the young Albert Einstein were unhindered by radios or record-players, telephones or televisions.
 
A year later, three little pips heralded the communications revolution that would change all that.
 
"I placed the single earphone to my ear and started listening," recalled Gugliemo Marconi in 1902. "I heard faintly, but distinctly, pip-pip-pip."
 
The pips, the letter "S" in Morse code, had travelled across the Atlantic to St. John's, Newfoundland, from Cornwall, England, and wireless communication was about to change the world.
 
Henry Ford, who built his first petrol-driven car in the 1890s, secured his place in history in 1913 by setting up the first moving assembly line. Mass production was born, enabling us to get our hands on as many Model-T Fords, baked beans or laptop computers as our pockets allowed.
 
In 1926 John Logie Baird, forced by ill-health to sit at home and tinker, produced the first flickering images of what became another of the century's defining inventions: television.
 
The world wars were the catalyst for huge technological advances.
 
After Orville Wright piloted the Wright brothers' bi-plane on a historic 12-second flight at Kitty Hawk in 1903, he accepted US army backing to found the Wright Company six years later.
 
World War II in particular provided ammunition both to those who saw technological advance as the key to progress and those who feared the capacity for ever greater tragedy -- what H.G. Wells saw as "a race between education and catastrophe."
 
Already in 1942 Orville Wright warned against "the use of a beneficial invention for diabolical purposes." What he feared came about, and the dropping of atomic bombs on Hiroshima and Nagasaki in 1945 opened up a moral debate which is still raging.
 
Mathematician Alan Turing also believed he was working to end war when he helped create "Colossus", the computer built to break German codes.
 
Deep Blue's primitive ancestor was the size of a barn, a Neanderthal compared to today's laptops. Nowadays the average American household contains more computing power than existed worldwide only 30 years ago.
 
The space race -- launched by the Russians with the Sputnik satellite in 1957, won conclusively by the United States with the moon landing in 1969 -- brought further innovation: new plastics, new medical techniques, mobile phones and microwave ovens.
 
Perhaps more importantly, it also brought a clearer understanding of the earth's fragility, boosting the ecology movement and the push for "clean technology".
 
Also in 1969, communications technology was changed forever when the Pentagon -- again with military purposes in mind -- inaugurated ARPANET, the forerunner of the internet, a network which has since become as much a part of civilian life as the telephone or the television.
 
In a rapidly-changing world scarred by frequent wars, technology now gives rise to as much apprehension as appreciation, and the perennial row over whether it is to be seen as sinner or saviour has found a new battleground -- the natural sciences.
 
Whether over human cloning, genetically modified tomatoes or the morning-after contraceptive pill, successive breakthroughs give rise to fierce antagonisms.
 
As the writer Arthur C. Clarke put it, "for every expert, there is an equal and opposite expert."
 
At the dawn of a new century, man can look forward -- hopefully or in dread, according to inclination -- to technological advances hardly dreamed of a generation ago.
 
Among the developments said to be lined up (though only time will tell if they get beyond the realm of science fiction) are computer implants to improve memory, holidays on the moon, the cloning of spare body parts, pollution-free energy sources, and life expectancy nudging 100 years.
 
On the other hand, man could destroy the planet at the flick of a switch or smother it gradually with the by-products of technological change.
 
Even if he gets it right, the real challenge, as Harvard entomologist Edward O. Wilson said in 1978, is -- what will he do with it?
 
"It could be that in the next 100 years humankind will thread the needles of technology and politics, solve the energy and materials crises, avert nuclear war, and control reproduction.
 
"The world can at least hope for a stable ecosystem and a well-nourished population. But what then?"