Five years from now, your data center might look very different—made up of a mix of homebrew and packaged servers distributed around the world, with processing switching automatically between locations depending on load, latency, and the cost of electricity at any given moment.

That is, it might look that way if you could spare a few minutes from your current job to plan for it.

This is according to David Cappuccio, a vice president, distinguished analyst, and chief of research for the infrastructure teams at Gartner Inc. He has spent the past few months traveling the world, giving webinars and presentations and writing blog posts on the subject.

Cappuccio lists 10 trends that come together for his vision of the near future:

  1. There’s nonstop demand.

It’s no surprise to anybody that everything associated with the data center—except personnel, budgets, and salaries—is growing at a phenomenal rate. Server workloads are growing at an annual average rate of 10 percent, network bandwidth at 35 percent, storage capacity at 50 percent, and power costs at 20 percent. And workloads will only get worse when the Internet of Things really takes off, he says, predicting that by 2018 it will generate 33.6 zettabytes of traffic per month.

  2. Every business unit is a technology startup.

CIOs are no longer getting most of the responsibility for defining new business trends, Cappuccio says, noting that many different parts of the company are making IT decisions and IT acquisitions outside the purview of traditional IT. While that’s more chaotic, it also makes the business more agile, he says. But even if IT isn’t involved in the technology, it often still gets blamed if systems don’t work, he warns, so it needs to be aware of what’s going on.

  3. The Internet of Things is generating a “staggering” amount of data.

For example, a hospital may track the use of its hand-washing stations so that it can defend itself in a malpractice lawsuit months later, Cappuccio says. That’s useful data, but it needs to be collected, stored, and found again when needed.

  4. Software-defined infrastructure makes everything configurable.

The result could be a rules-based workflow that determines the best time and the best location to run applications, based on how many servers you need, the time of day, and even the cost of electricity at different sites. This means IT departments could move workloads around the world based on business and performance needs, transparently to the people using the applications, Cappuccio says.
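To make that idea concrete, here is a rough sketch of what such a rules-based placement decision might look like. The site names, the 100 ms latency rule, and the electricity prices are purely illustrative assumptions, not anything Gartner or Cappuccio has published.

```python
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    free_servers: int      # capacity currently available
    latency_ms: float      # latency to the main user population
    power_cost_kwh: float  # spot electricity price at this site

def pick_site(sites, servers_needed):
    """Return the cheapest site that satisfies the placement rules."""
    candidates = [s for s in sites
                  if s.free_servers >= servers_needed
                  and s.latency_ms <= 100]  # rule: keep latency acceptable
    if not candidates:
        raise RuntimeError("no site satisfies the placement rules")
    # rule: among acceptable sites, prefer the lowest electricity cost
    return min(candidates, key=lambda s: s.power_cost_kwh)

sites = [
    Site("us-east", free_servers=40, latency_ms=35, power_cost_kwh=0.11),
    Site("eu-west", free_servers=25, latency_ms=80, power_cost_kwh=0.07),
    Site("ap-south", free_servers=60, latency_ms=140, power_cost_kwh=0.05),
]
print(pick_site(sites, servers_needed=20).name)  # -> "eu-west"
```

A real implementation would pull these numbers from monitoring and pricing feeds rather than hard-coded values, but the decision logic reduces to the same kind of filter-then-rank rules.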

  5. Integrated systems bundle everything together.

Instead of buying components and assembling servers themselves, IT organizations are buying dedicated systems with networking, storage, processing, and memory built in. Moreover, once an organization has settled on a particular system, it may well keep buying that system from that vendor rather than re-evaluating the market for future purchases, Cappuccio says.

  6. On the other hand, some organizations are looking at disaggregated systems.

Companies like Google and Facebook, which use massive amounts of hardware, have the economies of scale to pursue the most efficient possible use of that hardware. So they’re buying individual components and designing their own systems, Cappuccio says. That also means they can upgrade at the component level (processors or memory only, for example) and spend only on what they actually need.

  7. Prescriptive analytics decides what should run where.

As analytics become more sophisticated, we’re moving toward a time when software can decide where and when workloads should run based on business rules, business needs, and business risks. For example, with a storm coming, processing for an application could dynamically move elsewhere, resulting in a much more robust environment that isn’t dependent on centralized data centers, Cappuccio says. While IT has traditionally been nervous about systems that can automatically move programs or shut down servers, this is where the industry is going, he says.
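As a thought experiment, the storm scenario might reduce to something like the sketch below, where a predicted risk score triggers relocation. Everything here, from the 0.7 threshold to the relocate function, is a hypothetical stand-in for whatever the orchestration layer actually provides.

```python
RISK_THRESHOLD = 0.7  # assumed cutoff; a real system would derive this from business rules

def relocate(workload, target_site):
    # placeholder: in practice this would call the orchestration layer
    print(f"moving {workload} -> {target_site}")

def evaluate(site, risk_feed, workloads, backup_site):
    """Drain a site whose predicted risk is too high."""
    risk = risk_feed[site]  # e.g., probability of a weather-related outage
    if risk >= RISK_THRESHOLD:
        for workload in workloads[site]:
            relocate(workload, backup_site)
        return f"drained {site} to {backup_site} (risk={risk:.2f})"
    return f"{site} within tolerance (risk={risk:.2f})"

risk_feed = {"gulf-coast-dc": 0.85, "midwest-dc": 0.10}
workloads = {"gulf-coast-dc": ["billing", "claims-api"], "midwest-dc": ["reporting"]}
print(evaluate("gulf-coast-dc", risk_feed, workloads, backup_site="midwest-dc"))
```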

  8. This results in IT service continuity.

Prescriptive analytics goes beyond disaster recovery and business continuity and could perform workload balancing as well, Cappuccio says. A website could be supported by multiple locations, and if one of those locations fails or gets too busy, the workload shifts and the customer won’t see a difference. “The perception of the customer is that the site is always available,” he says, which becomes increasingly important as users have more options.
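That “always available” behavior comes down to routing around failed or overloaded locations. Here is a minimal sketch of that failover logic, assuming a hypothetical health check and a 0.8 load ceiling; neither comes from the Gartner material.

```python
def healthy_sites(sites, is_healthy, load, max_load=0.8):
    """Keep only locations that are up and not too busy."""
    return [s for s in sites if is_healthy(s) and load[s] < max_load]

def route_request(sites, is_healthy, load):
    """Send the request to the least-loaded healthy location."""
    candidates = healthy_sites(sites, is_healthy, load)
    if not candidates:
        raise RuntimeError("site unavailable everywhere")
    return min(candidates, key=lambda s: load[s])

# illustrative data: one location has failed, another is near capacity
load = {"dc-1": 0.95, "dc-2": 0.40, "dc-3": 0.20}
up = {"dc-1": True, "dc-2": True, "dc-3": False}
print(route_request(load.keys(), lambda s: up[s], load))  # -> "dc-2"
```

From the customer’s point of view, the request that would have gone to the failed or busy location simply lands somewhere else; no one outside IT ever sees the shift.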

  9. IT organizations will be able to support multiple needs.

For several months now, Gartner has been promoting the notion of “bimodal” or “ambidextrous” organizations, which provide both the reliability of traditional development and the agility of a startup as needed. “We need to do both on a continuous basis,” Cappuccio says.

  10. This is all going to require a different kind of IT person.

No longer will IT be able to depend on a collection of storage specialists, networking specialists, and server specialists who each know only their own siloed piece of the environment. Instead, IT will need to cross-train its people to understand how the various pieces of the company work together, Cappuccio says. That way, they can focus on solving problems rather than pointing fingers.

Organizations that already have trouble finding the right kind of staff may worry about how they’ll find this new talent and what it will cost, but that’s less of a problem than it seems, Cappuccio reassures us. “For high-end IT people, what motivates them is not money or accolades, but learning,” he says. The challenge of learning new things and making systems work better becomes a motivator in itself.

Cross-training also means training existing staff. And if some organizations are worried that they’re making their IT staff more marketable in the process, they’re looking at it the wrong way. “They’re yours to lose,” Cappuccio says. “If you create an environment where there’s always new things to learn, they’re going to stay.”


