The only thing constant is change, and these 10 IT trends coming to data centers through 2017, including open standards, require preparation.
The mega trends coming to data centers over the next five years seem daunting — infinite infrastructure, relentless business demands and a major shift in control — but there are ways IT pros can prepare.
People on the business side expect the company’s internal data center infrastructure to have the same scalability and cost as Amazon Web Services’ cloud, according to an attendee at the Gartner Infrastructure Operations & Management Summit 2014 here this week.
That intersection of leading-edge IT and enterprise expectations underscores the theme of these 10 trends expected to hit data centers, as curated by Milind Govekar, managing VP at Gartner:
1. Open philosophies
Open development breaks the data center down into its lowest-level components, which fit together by open standards. Still, with less than 2% of enterprise applications designed for horizontal scaling, enterprise IT should avoid lifting legacy apps onto open infrastructure.
Instead, put new workloads on building-block infrastructure, and renegotiate your hardware contracts to get ready for more open-standard hardware and software.
2. IT automation
This trend is nothing new, but the next five years will be transformative for IT automation, from opportunistic to systemic implementation.
The problem, however, is that IT administrators love scripts. They love creating the best scripts, fiddling with scripts inherited from colleagues, and leaving little documentation behind when they move on to another job. IT automation must evolve from scripting to deterministic automation (defined workflows for tasks) and then to heuristic design (automation driven by data fed back from operations). There are banks today that could use heuristic automation because they have all the hardware you could want, Govekar said, but they lack the ability to automatically place workloads where they run best at any given moment.
Start down the heuristic path by appointing an automation leader in IT, automating script discovery and rewarding administrators for building resilient, structured scripts.
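As a toy illustration of that heuristic step, a placement routine might score hosts from live utilization data and pick the one best suited to a workload right now. The host names, metrics and scoring rule below are invented for the sketch, not any product's logic:

```python
# Hypothetical sketch of heuristic workload placement: hosts are scored
# from recent operational metrics, and the workload lands on the host
# predicted to run it best at this moment. Names and weights are illustrative.

def place_workload(workload_cpu, workload_mem, hosts):
    """Pick the host with the most headroom for the workload.

    hosts: dict of host name -> {"cpu_free": float, "mem_free": float}
    Returns the chosen host name, or None if nothing fits.
    """
    candidates = [
        (name, min(m["cpu_free"] - workload_cpu, m["mem_free"] - workload_mem))
        for name, m in hosts.items()
        if m["cpu_free"] >= workload_cpu and m["mem_free"] >= workload_mem
    ]
    if not candidates:
        return None
    # Most remaining headroom wins -- a crude "best fit right now" heuristic.
    return max(candidates, key=lambda c: c[1])[0]

hosts = {
    "rack1-a": {"cpu_free": 4.0, "mem_free": 8.0},
    "rack1-b": {"cpu_free": 16.0, "mem_free": 64.0},
    "rack2-a": {"cpu_free": 2.0, "mem_free": 4.0},
}
print(place_workload(2.0, 6.0, hosts))  # rack1-b has the most headroom
```

A real heuristic system would replace the static dict with a live metrics feed, which is exactly why structured, documented scripts matter: they become the inputs to this kind of automation.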
3. Software-defined everything
Software-defined means the control plane is abstracted from the hardware, and it’s happening to every piece of equipment a data center can buy. Software-defined servers are established, software-defined networking is maturing and software-defined storage won’t have much impact until at least 2017, Govekar said.
Don’t approach software-defined everything as a cost-saving venture, because the real point is agility. Avoid vendor lock-in in this turbulent vendor space, and look for interoperable application programming interfaces that enable data-center-wide abstraction. Also, keep in mind that the legacy data center won’t die without a fight.
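To see why interoperable APIs matter, consider a minimal sketch of control-plane abstraction: callers program against a common interface, and vendor-specific drivers hide the hardware. The class and method names are assumptions for illustration, not any vendor's actual API:

```python
# Illustrative sketch (not a real product API): the control plane talks to a
# common interface, and per-vendor drivers hide the hardware underneath.
from abc import ABC, abstractmethod

class StorageDriver(ABC):
    """Abstract control plane for storage: callers never touch the hardware."""

    @abstractmethod
    def provision(self, name: str, gb: int) -> str: ...

class VendorAArray(StorageDriver):
    def provision(self, name, gb):
        return f"vendor-a:{name}:{gb}GB"

class VendorBArray(StorageDriver):
    def provision(self, name, gb):
        return f"vendor-b:{name}:{gb}GB"

def provision_volume(driver: StorageDriver, name: str, gb: int) -> str:
    # The caller's code is identical whichever array sits underneath --
    # that interchangeability, not cost, is the agility payoff.
    return driver.provision(name, gb)

print(provision_volume(VendorAArray(), "logs", 100))
print(provision_volume(VendorBArray(), "logs", 100))
```

Swapping `VendorAArray` for `VendorBArray` changes nothing above the interface, which is the lock-in test to apply when evaluating software-defined products.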
4. Big data
Big data analysis is used in a number of ways to solve problems today. For example, police departments reduce crime without blanketing the city with patrol cars, by pinpointing likely crime hot spots at a given point in time based on real-time and historical data.
Build new data architectures to handle unstructured data and real-time input, which are disruptive changes today. The biggest inhibitor to enterprise IT adoption of big data analytics, however, isn’t the data architecture; it’s a lack of big data skills.
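The hot-spot example above boils down to blending historical data with a real-time signal. A minimal sketch, with made-up districts, scores and weighting:

```python
# Hypothetical sketch of hot-spot ranking: blend historical incident counts
# with a live signal to rank districts for patrol coverage. The weights and
# data are invented for illustration.

def rank_hotspots(historical, realtime, history_weight=0.7):
    """Score districts by weighted history plus live reports, highest first.

    historical: district -> incident rate over the past year (normalized 0-1)
    realtime:   district -> live report intensity right now (normalized 0-1)
    """
    scores = {
        district: history_weight * historical.get(district, 0.0)
        + (1 - history_weight) * realtime.get(district, 0.0)
        for district in set(historical) | set(realtime)
    }
    return sorted(scores, key=scores.get, reverse=True)

historical = {"downtown": 0.9, "harbor": 0.4, "uptown": 0.2}
realtime = {"harbor": 1.0, "uptown": 0.1}
print(rank_hotspots(historical, realtime))
# ['downtown', 'harbor', 'uptown']
```

Even this toy version shows the architectural demand: the `realtime` dict stands in for a streaming feed, which is exactly the unstructured, real-time input traditional data architectures weren't built to ingest.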
5. Internet of Everything
Is IT in charge of the coffee pot? If it has an IP address and connects to the network, it might be.
Internet-connected device proliferation combined with big data analytics means that businesses can automate and refine their operations. It also means security takes on a whole new range of endpoints. In data center capacity management, the Internet of Everything means demand shaping and customer priority tiering, rather than simply buying more hardware.
Build a data center that can change; don’t build one to last, Govekar said.
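Customer priority tiering can be sketched as an admission policy: when capacity runs short, high-tier demand is served first and the rest is deferred instead of triggering a hardware purchase. The tier names and numbers below are assumptions for the sketch:

```python
# Illustrative sketch of priority tiering for capacity management: admit
# requests in tier order until capacity runs out, defer the remainder.
# Tier names and thresholds are invented, not a standard.

def admit_requests(requests, capacity):
    """Admit requests in tier order (gold before silver before bronze)
    until capacity is exhausted; the rest are deferred.

    requests: list of (request_id, tier, units)
    Returns (admitted_ids, deferred_ids).
    """
    tier_rank = {"gold": 0, "silver": 1, "bronze": 2}
    admitted, deferred = [], []
    for rid, tier, units in sorted(requests, key=lambda r: tier_rank[r[1]]):
        if units <= capacity:
            capacity -= units
            admitted.append(rid)
        else:
            deferred.append(rid)
    return admitted, deferred

requests = [("r1", "bronze", 4), ("r2", "gold", 5), ("r3", "silver", 3)]
print(admit_requests(requests, 8))  # (['r2', 'r3'], ['r1'])
```

Deferring the bronze request is the demand-shaping move: capacity bends to priorities rather than priorities bending to a hardware budget.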
6. Webscale IT
For better or worse, business leaders want to know why you can’t do what Google, Facebook and Amazon do.
Conventional hardware and software are not built for webscale IT, which means this trend relies on software-defined everything and open philosophies like the Open Compute Project. It also relies on a major attitude adjustment in IT where experimentation and failure are allowed.
7. Mobility
Your workforce is mobile. Your company’s customers are mobile. Bring your own device has morphed into bring your own toys. The IT service desk can’t fall behind this trend and risk giving IT a reputation of being out of touch.
Bring data segregation — personal and business data and applications isolated from each other on the same device — onto your technology roadmap now.
8. Bimodal IT
No one’s congratulating IT on keeping the lights on and the servers humming, no matter how difficult it can be. Bimodal IT means maintaining traditional IT practices while simultaneously introducing innovative new processes — safely.
Take the pace layering concept from application development and apply it to IT’s roadmap, and find ways to get close to customers. Bimodal IT will make your team more diverse.
9. Business value dashboards
By 2017, the majority of infrastructure and operations teams will use dashboards to communicate with the outside world. Govekar likened business-value dashboards versus raw IT metrics to cruise ship reviews versus boiler calibration reports: they serve different purposes.
Evaluate business-value dashboards and complement them with IT staffers who speak the same language as your business stakeholders.
10. Organizational disruption
All the trends above feed shadow IT, where the business units steer around IT to gain agility.
Some IT teams are trying a new approach: rather than quash every shadow IT operation they find, they let business users stand up shadow IT for projects and track its performance like a proof-of-concept trial. If the deployment succeeds, IT formally folds it into the organization.