You may not be aware of it, but much of our success in business is predicated on something called “Metcalfe’s Law.” But Metcalfe’s Law has a dark side that keeps us from doing our jobs.
Metcalfe’s Law, named after Ethernet co-inventor Robert Metcalfe, states that the value of a network increases with the square of the number of users. In other words, a network of a thousand users would be a hundred times as valuable as a network of a hundred users.
And while computer scientists love to argue over minutiae such as whether the growth should be quadratic, exponential, logarithmic, factorial, or what have you, the point remains the same: the value of your network grows much faster than the size of the network itself.
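The arithmetic behind Metcalfe’s Law fits in a few lines. Here is a minimal sketch (the function name and the proportionality constant are my own illustration, not anything from Metcalfe):

```python
def network_value(users, k=1.0):
    """Metcalfe's Law: a network's value grows with the square of its user count."""
    return k * users ** 2

# Ten times the users yields a hundred times the value:
print(network_value(1000) / network_value(100))  # 100.0
```

Whatever the exact constant k, it cancels out of the comparison, which is why the ratio is all that matters here.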
Metcalfe came up with this in the early 1980s, but it’s still true today, according to Gartner, which is calling it the Economics of Connections.
But Metcalfe’s Law has a dark corollary, Michael Mankins writes in Harvard Business Review: “As the cost of communications decreases, the number of interactions increases exponentially, as does the time required to process them.” Executives now receive more than 30,000 electronic communications a year, he notes, citing research from Bain & Co.
The result is the current deluge of email, voicemail, texts, and IMs under which we’re all drowning. This was predicted as long ago as 1982 by Peter Denning, president of the Association for Computing Machinery, in his seminal piece, “Electronic Junk” in the Communications of the ACM. While he didn’t coin the word “spam,” he certainly understood the concept, even though the nascent Internet at the time consisted of only 200 nodes.
“It is now trivial for any user to send copies of virtually any document to large sets of others,” Denning wrote. “The growth of new networks such as CSNET and USENET only adds to the heights of the waves of materials that try to flood any given person’s mailbox. It is clear that some attention must be paid to the processes of receiving information, and preventing unwanted reception.”
As we all know, nobody paid any attention to Denning then, and we are now reaping the consequences. In fact, technology has made it worse, Mankins writes. “With the introduction of Microsoft Outlook and other calendar programs, the cost of setting up a meeting plummeted,” he writes. “As a consequence, the number of meetings has increased and the number of attendees per meeting has exploded. Some 15% of an organization’s collective time is spent in meetings — a percentage that has increased every year since 2008.”
The particular problem with this aspect of Metcalfe’s Law goes back to a different law coined by another venerable computer industry pioneer: Amdahl’s Law, which states that if one part of a program can’t be parallelized, then no matter how much the rest of the program can be parallelized, the part that can’t will limit how much the program can be speeded up.
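Amdahl’s Law is easy to express concretely. This is a minimal sketch (the function name and parameters are my own, not from any of the sources cited):

```python
def amdahl_speedup(parallel_fraction, n_workers):
    """Amdahl's Law: overall speedup is capped by the part that stays serial.

    parallel_fraction: share of the work that can be parallelized (0 to 1)
    n_workers: how many workers (processors, or people) split the parallel part
    """
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / n_workers)

# Even with a thousand workers, a 5% serial portion caps speedup near 20x:
print(round(amdahl_speedup(0.95, 1000), 1))  # 19.6
```

No matter how large n_workers grows, the speedup can never exceed 1 divided by the serial fraction — which is exactly why the one slow, non-parallel component dominates.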
Why is this a problem in the context of the dark side of Metcalfe’s Law? Because the slowest, least parallelizable component is the human brain. “Amdahl’s Law demonstrates, algebraically, that increasingly the (non-parallelizable) human performance becomes the determining factor of speed and success in most any human-computer system,” write Randolph G. Bias, Clayton Lewis, and Doug Gillan in the Journal of Usability Studies. “Whereas engineered products improve daily, and the amount of information for us to potentially process is growing at an ever quickening pace, the fundamental building blocks of human-information processing (e.g., reaction time, short-term memory capacity) have the same speed and capacity as they did for our grandparents.”
In other words, machines get better and better at finding new ways to interrupt us, and we don’t get any better at dealing with the interruptions. Moreover, the constant stream of interruptions is making it more difficult to work, writes Mankins. The result is that the average manager now has less than 6½ hours per week of uninterrupted time to get work done, he notes.
“Meanwhile, the number of interactions required to accomplish anything has increased,” Mankins adds. “A recent CEB study found that 60 percent of employees must now consult with at least 10 colleagues each day just to get their jobs done, while 30 percent must engage 20 or more. The result? Companies take more time to do things. For example, it takes 30 percent longer to complete complex IT projects, 50 percent longer to hire new people, and nearly 25 percent longer to sign new customer contracts. And that’s just in the last five years.”
Certainly, we don’t want to disconnect from all of our connections. Start with the fact that—according to Metcalfe’s Law—that would radically decrease the value of our networks, writes Uber executive Andrew Chen. In any event, that ship has pretty much sailed. The best we can do is look at technology as a way to help bail us out. Denning pointed the way back in 1982, cataloging the strategies people have always used to cope with too much incoming mail:
- spending less time on each input
- disregarding inputs
- shifting the burden to others
- blocking reception
- creating specialized institutions to offload the work
“These strategies are uncannily similar to the ways we deal with Internet overload: we don’t read carefully, we disregard, we hand off tasks to others, we block reception, we filter, and we create institutions to share the burden (for example, spam-blocking services),” Denning wrote.
Similarly, Mankins writes in a different Harvard Business Review article, the same tools that let people easily set up meetings can also help track how much time an organization’s employees collectively spend in such meetings—and how much it could be costing the company. “This information can paint a vivid and revealing picture of an organization’s time budget,” he writes, noting that 15 percent of an organization’s time overall is spent in meetings. One company found that a regularly scheduled 90-minute meeting of midlevel managers cost more than $15 million annually.
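The kind of meeting-cost accounting Mankins describes is back-of-the-envelope arithmetic: people, times hours, times frequency, times pay rate. A minimal sketch, where every number below is a hypothetical placeholder rather than a figure from the article:

```python
def annual_meeting_cost(attendees, hours_per_meeting, meetings_per_year, loaded_hourly_rate):
    """Rough annual cost of a recurring meeting: people x time x frequency x rate."""
    return attendees * hours_per_meeting * meetings_per_year * loaded_hourly_rate

# Hypothetical: 20 managers in a 90-minute weekly meeting, at a $100/hour loaded cost
print(annual_meeting_cost(20, 1.5, 52, 100))  # 156000.0
```

A single meeting’s direct cost is usually only a floor; recurring meetings spawn preparation work and downstream meetings that a simple calculation like this does not capture.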
As computers get faster, human performance increasingly is what’s holding up the works, Bias, Lewis, and Gillan point out. “The time required today to produce a document is dominated by the time required by the human operations; even reducing the time required for all of the computational operations to zero would make little practical difference in total task time,” they write. “Improvement can come only from improvements in the human operations in this particular hybrid computing-human system.”
Human operations is another term for workflow: how a business process moves from one person to the next. Consequently, any improvement in workflow, such as business process automation, will have a much greater effect on the overall speed of the business process than any improvement in the underlying hardware or software technology.
“Many investments in new technology are essentially workarounds for bad behaviors or poor procedures for sharing information,” Mankins writes. “Were customer, financial and operational information readily available to all, for instance, the need for crowd-sourcing or reconciling data sets would be reduced significantly. Leaders should carefully assess whether to accept a bad behavior as given and invest in new technology to cope with it, or instead change the dysfunctional behavior.”
Simplicity 2.0 is where we examine the intricate and transitory world of technology—through a Laserfiche lens. By keeping an eye on larger trends, we aim to make software that’s relevant to modern day workers, rather than build technology for technology’s sake.
Subscribe to Simplicity 2.0 and follow us on Twitter. If what we’re saying piques your interest, head over to Laserfiche.com where you’ll see how we apply the lessons learned on Simplicity 2.0 to our own processes, products and industry.