- Oppenheimer becomes the destroyer of worlds
- Nobel’s noble end
- The backlash against two slashes
- Wright was wrong
- Zuckerman, the patron of pop-up ads
- The father of the office cubicle: Bob Propst
- An environment destroyed by one man
Trend watchers are often excited by what they see in the future. But sometimes, a trend promises more harm than good. Elon Musk, for instance, is concerned about the future of humanity when artificial intelligence is ascendant, and Stephen Hawking worried that our attempts to contact alien life may ring the dinner bell for our future galactic overlords. One goal of trend watching, then, is to warn us of coming dangers. Indeed, scientific geniuses are sometimes horrified at the use of their inventions, or at the very least, irritated by the unintended consequences of their accomplishments. From Oppenheimer’s anger at the use of his weapon to Midgley’s unparalleled destruction of the environment, we’re surveying some of the most dangerous (and irritating) inventions in human history and their creators’ reactions to them.
Oppenheimer becomes the destroyer of worlds
In 1941, J. Robert Oppenheimer was a professor of theoretical physics at the University of California, Berkeley, when President Roosevelt decided to undertake the construction of the atomic bomb in a project that would be called the Manhattan Project. Initially chosen to work on the complex calculations necessary to split the atom, by 1942, Oppenheimer was appointed to lead the secret weapon program. Working in total secrecy in an isolated military research facility, Oppenheimer and his team succeeded in detonating the world’s first atomic explosion on July 16, 1945. He later stated that as the blast tore upward through the sky, he was reminded of a verse from the Hindu holy book, the Bhagavad Gita: “I am become Death, the destroyer of worlds.”
Oppenheimer’s fears were realised on August 6th and 9th, 1945, as Hiroshima and Nagasaki were utterly destroyed by his creations. As many as 226,000 people were killed in the blasts, and radiation exposure continued to maim and kill for decades. Significantly, he did not regret his work on the Manhattan Project, instead reserving his criticism for how his creation was used as a tool of politics rather than peace. About this, he said, “The ultimatum to Japan [the Potsdam Proclamation demanding Japan’s surrender] was full of pious platitudes. Our government should have acted with more foresight and clarity in telling the world and Japan what the bomb meant.”
Nobel’s noble end
Few realise that the sponsor behind the world’s greatest prize for peace was motivated by his reputation as a “merchant of death.” Alfred Nobel was born to a well-to-do manufacturing family, already a major player in the arms industry. Intelligent and studious, he quickly mastered chemistry and went to work in the family business. His research centred on developing a safe version of nitroglycerine, the notoriously unstable high explosive, and by 1863, he had invented the detonator, and two years later, the blasting cap.
In 1864, a small building used to house the dangerous nitroglycerine exploded, killing five men including his brother, Emil. Dogged by what he considered ‘minor’ accidents, Nobel continued his work to stabilise the high explosive, eventually succeeding in 1867 with the invention of dynamite and again in 1875 with the further refinement, gelignite. The stability and destructive power of these inventions revolutionised warfare, replacing gunpowder as a high explosive.
In 1888, his brother Ludvig died while in France, prompting newspapers there to mistakenly print Alfred’s obituary. One paper remarked that “Dr. Alfred Nobel, who became rich by finding ways to kill more people faster than ever before, died yesterday.” Wifeless and childless, he grew concerned over his legacy and how the world would remember him. Though he did not regret his inventions, he left his vast fortune to fund the Nobel Peace Prize we know today.
The backlash against two slashes
Sir Timothy John Berners-Lee is an English computer scientist credited with the invention of the World Wide Web. In 1989, while working as an independent contractor at the CERN laboratory in Switzerland, Berners-Lee designed a system for sharing information among the lab’s researchers; it ran on what would become the first web server, CERN HTTPd (the CERN Hypertext Transfer Protocol daemon).
Every time you use the Internet, you should be thanking Sir Timothy. Unfortunately, it’s also him you need to thank for the two forward slashes that follow the colon in every web address. “I could have designed out the //, made it unnecessary, and that would have saved so many keystrokes since then, but that wouldn’t have been a fundamental change. It would have been a little cosmetic change,” he answered in response to a question about his regrets.
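Berners-Lee’s point is easy to see in practice: in an address like `http://example.com/page`, the `//` simply marks where the host name begins, which a parser could in principle infer from context. A minimal sketch using Python’s standard `urllib.parse` module (the URLs here are only illustrative):

```python
from urllib.parse import urlsplit

# With the double slash, the parser recognises the host ("authority") part.
with_slashes = urlsplit("http://example.com/page")
print(with_slashes.netloc)  # example.com
print(with_slashes.path)    # /page

# Without it, everything after the colon is treated as a path --
# the two slashes are what tell a generic parser where the host begins.
without_slashes = urlsplit("http:example.com/page")
print(without_slashes.netloc)  # '' (no authority recognised)
print(without_slashes.path)    # example.com/page
```

In other words, the slashes are not decoration: in the URL syntax as designed, they are the delimiter that separates the scheme from the host, which is why dropping them was never quite as trivial as it sounds.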
Wright was wrong
On December 17th, 1903, the brothers Orville and Wilbur Wright successfully launched the first heavier-than-air flying machine in Kitty Hawk, North Carolina. After honing their mechanical skills on printing presses, engines, bicycles, and motors, the brothers decided to try their hand at airplanes. While other innovators struggled to design more powerful engines, Orville and Wilbur built a home-made wind tunnel and collected data, soon realising that the key to flight was pilot control.
They devised a system that allowed three-axis control, giving the pilot real command of the aircraft. Indeed, their US patent was not for the flying machine itself, but rather for the system of ‘aerodynamic control.’ By 1909, the Wrights were selling planes to the US military, initially for reconnaissance. As confirmed by their receipt of a French peace prize, the brothers believed that the airplane would end war, making sneak attacks impossible. But as World War I illustrated to Orville, the airplane could be used as a weapon, and aerial attacks soon became a new method of waging war.
Orville later regretted that he had enabled the great powers to extend their aggressive interests. By the end of World War II, he was considerably cynical about the prospect that advanced weaponry could bring peace. On August 28th, 1946, he wrote, “I once thought the aeroplane would end wars. I now wonder whether the aeroplane and the atomic bomb can do it. It seems that ambitious rulers will sacrifice the lives and property of all their people to gain a little personal fame.”
Zuckerman, the patron of pop-up ads
A student of philosophy and ethnomusicology, Ethan Zuckerman is an unlikely digital villain. From 1994 to 1999, he was an employee of Tripod.com, which slowly became a website hosting network. After a long struggle to find a profitable model, Tripod settled on advertising, and when a major source of revenue announced its displeasure at being linked with sexual content, Zuckerman went in search of a solution. His answer was ingenious: de-link the ad from the content by opening a new window. Inadvertently, perhaps, he had created the most annoying feature of the Internet: the pop-up ad. In 2014, Zuckerman came clean: “I wrote the code to launch the window and run an ad in it. I’m sorry. Our intentions were good.”
The father of the office cubicle: Bob Propst
In 1960, Robert Propst was placed in charge of day-to-day activities for the Herman Miller Research Corporation, in Ann Arbor, Michigan. Propst’s project was to study how office furniture was used at the time so that the company could design superior furniture. He studied the office organisation common to the 1960s: large, open floors housing rows of identical desks. He concluded that the lack of privacy impeded communication. What was needed, he believed, was a mix of privacy and openness that allowed employees to communicate when necessary and work in seclusion when that was better. And counterintuitively, the father of the cubicle farm observed that “one of the regrettable conditions of present day offices is the tendency to provide a formula kind of sameness for everyone.”
His brainchild was the Action Office I in 1964, a modular, easily modifiable system of back-to-back workspaces that could be customised and personalised to suit their users. This included everything from variable-height desks to comfortable office chairs. But sales were anaemic. The simple problem was that the furniture was too nice, and therefore too expensive. The much less costly Action Office II soon followed, forever changing office furniture by giving birth to the cubicle as you now, no doubt, know it.
Tax law allowed corporations to write off their furniture expenses over seven years, encouraging the wholesale move to cubicles in the 1970s and 80s. For Propst, what started as an exercise to improve the work-life of employees ended in doing just the opposite, and he came to regret his part in the monotony of office life. “The cubiclising of people in modern corporations is monolithic insanity,” he said.
An environment destroyed by one man
For much of the 20th century, internal combustion engines suffered ‘knocking.’ Caused by fuel igniting at the wrong point in the engine’s cycle, knocking ranged from a loud annoyance to a destructive menace. Thus, in 1921, while working for General Motors to solve this problem, Thomas Midgley discovered that adding tetraethyl lead to the fuel prevented knocking. Aware that the lead-based additive would spur public protest, General Motors advertised the new additive as ‘Ethyl,’ and it was soon widely adopted, releasing vast amounts of lead into the air. Lead poisoning, destructive to the brains of children and linked with numerous diseases, soared globally. By 1985, approximately 5,000 Americans died each year from heart disease caused by leaded gasoline exposure, and as many as 68 million children were adversely impacted by breathing lead-contaminated air.
While Midgley was working to solve the knocking problem, General Motors gave him another puzzle. Conventional air conditioning and refrigeration systems used toxic and explosive chemicals to cool air, and GM wanted safe alternatives. Midgley’s almost immediate solution was chlorofluorocarbon (CFC), an organic compound of carbon, chlorine, and fluorine, later sold as Freon by DuPont. It was soon the go-to refrigerant for home appliances and cars, dominating the global market. But much like leaded gasoline, its profoundly injurious effects were soon discovered. By the late 1970s, the environmental damage of CFCs was becoming widely known, leading to their eventual ban. By 1985, the ozone hole over Antarctica had been definitively linked to the global use of CFCs.
Midgley died in 1944, perhaps by accident, strangled by a harness of his own devising that allowed him to move from his bed to his chair after contracting polio. Had he lived to see the damage he caused the planet, this mild-mannered engineer would certainly have regretted his creations.
It is sadly true that a genius can invent beyond the scope of our collective wisdom, as the horrors of lead poisoning or the wanton loss of life in Hiroshima and Nagasaki illustrate all too well. One role of the trend watcher is to protect us from technology, to play the sceptic where advances are concerned. Who knows what the leaded gasoline or Freon of the 21st century will be?