The person who made possible nearly everything we do today, Claude Shannon, would have turned 100 this year. And, chances are, you've never heard of him.
Born in 1916, Shannon was arguably most known for coming up with the concept of the “bit”—being able to represent any kind of data in terms of 1s and 0s—in his seminal 1948 paper, “A Mathematical Theory of Communication.” “If the base 2 is used the resulting units may be called binary digits, or more briefly bits,” he wrote. “A device with two stable positions, such as a relay or a flip-flop circuit, can store one bit of information.”
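Shannon's insight — that any data can be reduced to 1s and 0s — is easy to demonstrate today. A minimal sketch (the word "bit" chosen here purely for illustration):

```python
# Represent text as Shannon's binary digits: encode each character's
# byte value as eight 1s and 0s.
text = "bit"
bits = "".join(format(byte, "08b") for byte in text.encode("utf-8"))
print(bits)  # → 011000100110100101110100
```

The same reduction works for any data — sound samples, pixel values, numbers — which is exactly why one kind of channel can carry all of them.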
By combining that concept with Boolean algebra, we were able to move from the slower, larger mechanical computer to the electronic computer. It is also what allows us to transmit music, movies, and images as easily as we transmit numbers.
In other words, without Shannon, we’d never have electronic documents or content management. He’s also responsible for the possibility of cat videos.
Shannon was also responsible for coming up with the concept of the “Shannon limit,” which addresses the issue of noise during transmission and determines the channel capacity of a particular transmission medium. “That humdrum phrase—‘channel capacity’—refers to the maximum rate at which data can travel through a given medium without losing integrity,” writes Siobhan Roberts in the New Yorker. “The Shannon limit, as it came to be known, is different for telephone wires than for fiber-optic cables, and, like absolute zero or the speed of light, it is devilishly hard to reach in the real world.”
Transmitting electronic signals always runs into noise. Noise is to electronics what friction is to physics: it interferes with getting from one place to another. Shannon figured out how to transmit data reliably despite it.
“Shannon showed that any communications channel — a telephone line, a radio band, a fiber-optic cable — could be characterized by two factors: bandwidth and noise,” writes Larry Hardesty of the Massachusetts Institute of Technology, where Shannon worked from 1956 to 1978. “Bandwidth is the range of electronic, optical or electromagnetic frequencies that can be used to transmit a signal; noise is anything that can disturb that signal. Given a channel with particular bandwidth and noise characteristics, Shannon showed how to calculate the maximum rate at which data can be sent over it with zero error.”
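The calculation Hardesty describes is the Shannon–Hartley formula, C = B · log₂(1 + S/N), where B is bandwidth and S/N is the signal-to-noise ratio. A minimal sketch, using illustrative (assumed) figures for a voice telephone line:

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley theorem: the maximum rate, in bits per second,
    at which data can cross a channel with the given bandwidth and
    signal-to-noise ratio without error."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A classic voice phone line: roughly 3 kHz of bandwidth and about
# 30 dB of SNR (a linear ratio of 1000) -- illustrative figures.
capacity = shannon_capacity(3000, 1000)
print(round(capacity))  # → 29902 bits per second
```

Note how the limit grows only logarithmically with signal power: doubling the power does not double the capacity, which is one reason the limit is, as Roberts puts it, devilishly hard to reach.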
The way to do that? Send the information with additional information. “Shannon showed that if enough extra bits were added to a message, to help correct for errors, it could tunnel through the noisiest channel, arriving unscathed at the end,” wrote George Johnson in Shannon’s New York Times obituary. “This insight has been developed over the decades into sophisticated error-correction codes that ensure the integrity of the data on which society interacts.”
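The simplest possible illustration of Johnson's point is a repetition code: send each bit several times and let the receiver take a majority vote. Real error-correction codes are far more sophisticated, but the principle — extra bits buy immunity to noise — is the same. A sketch:

```python
def encode(bits, repeat=3):
    """Repetition code: the simplest error-correcting code.
    Each bit is transmitted `repeat` times."""
    return [b for b in bits for _ in range(repeat)]

def decode(received, repeat=3):
    """Majority vote over each group of `repeat` copies."""
    return [int(sum(received[i:i + repeat]) > repeat // 2)
            for i in range(0, len(received), repeat)]

message = [1, 0, 1, 1]
sent = encode(message)          # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
sent[4] = 1                     # noise flips one copy of the second bit
assert decode(sent) == message  # the majority vote recovers the original
```

The price of that robustness is rate: here we send three bits for every one of payload. Modern codes get the same protection far more cheaply, which is how they approach the Shannon limit.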
(Incidentally, it’s interesting that the term “Shannon limit” hasn’t really entered pop culture the way some other tech terms have. People may complain that a useless meeting has a “low signal to noise ratio,” but how often have you heard someone say that they had too many meetings for them to hit their “Shannon limit”?)
While Shannon was a contemporary of people such as Alan Turing, and also worked on cryptography during World War II, he hasn’t been nearly as well known. But that started to change this year when organizations such as Bell Labs, where he worked from 1941 to 1958, held events honoring him and his accomplishments. On the actual occasion of his birthday, he even got his own Google Doodle, which also commemorated his interest in juggling.
Shannon was known for other inventions as well, such as developing what was arguably the first wearable computer. It was a small device that helped Ed Thorp—who had figured out a way to win $50,000 a weekend at blackjack in Las Vegas casinos by counting cards—perform similar feats with roulette. He also helped start the field of artificial intelligence by developing a mechanical mouse that learned to run a maze, and even helped develop children’s electronic toys. Sadly, he became reclusive in his later life and died at 84 in 2001 of Alzheimer’s disease.
As it happens, this attention to Shannon is occurring at the same time that some vendors and technologies actually are approaching their Shannon limit. Fiber-optic cables, for example, are coming up against theirs, writes Jeff Hecht in IEEE Spectrum: there is a limit to how much a signal's power can be turned up before other effects that arise in the fiber generate enough noise to drown it out, he explains.
But Shannon’s ultimate legacy may have been the value of pure research, noted speakers at the Shannon conference, including Google chairman Eric Schmidt and Robert G. Gallager, a professor emeritus at MIT who worked with Shannon in the 1960s. “Managers need to meet deadlines, they need to do all these other things,” Gallager said. “But if they want to have really good people, to start to do things a little like what Shannon did, they need to give people time to think.”
“I’ve always pursued my interests without much regard for final value or value to the world,” Shannon agreed in a 1992 IEEE Spectrum profile. “I’ve spent lots of time on totally useless things.”
Perhaps we all need to do more that seems a little useless at the time. Maybe a cat video?
Simplicity 2.0 is where we examine the intricate and transitory world of technology—through a Laserfiche lens. By keeping an eye on larger trends, we aim to make software that’s relevant to modern day workers, rather than build technology for technology’s sake.
Subscribe to Simplicity 2.0 and follow us on Twitter. If what we’re saying piques your interest, head over to Laserfiche.com where you’ll see how we apply the lessons learned on Simplicity 2.0 to our own processes, products and industry.