Data centers are the focal point of many businesses to effectively run vital applications, store critical data and provide important user services. But data center infrastructure is in a constant state of change.
New technologies consistently reshape the data center and its role in the business. At the same time, external forces such as the 2020 global COVID-19 pandemic have changed how businesses, employees, partners and users operate, as well as how data center technology functions; those effects could resonate well into 2021 and beyond.
Data center automation and remote management technologies aren't new, but 2020 brought a new focus to unstaffed enterprise data centers. With many admins working from home or remote locations, and travel options curtailed, it became difficult, or impossible, to get IT staff on site with servers, storage and network gear.
Automation and remote management tools support large data centers, colocation data center sites and private cloud deployments. Systems management and data center infrastructure management tools are also nothing new, but these tools and practices now take on an entirely new importance.
In 2021, automation and remote management are core necessities, not just helpful options. These tools must handle a wide range of everyday administrative tasks at massive scale.
Automation can handle myriad redundant tasks and reduce complex and repetitive processes to a simple self-service option that can be accomplished in just a few minutes. Automation does require ongoing attention and effort to maintain, but the time savings — combined with remote access — can allow IT teams to accomplish everything except particularly demanding tasks from almost any safe global location.
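To make the self-service idea concrete, here is a minimal sketch of turning one repetitive check into a reusable function an admin could expose as a self-service task. The threshold, host names and metrics are illustrative assumptions, not drawn from any real tool or environment.

```python
# Hypothetical example: flag hosts that need disk cleanup, a task that
# would otherwise mean logging in to each machine by hand.

DISK_ALERT_PCT = 90  # illustrative threshold, not a real product default


def hosts_needing_cleanup(host_metrics):
    """Return hostnames whose disk usage exceeds the alert threshold.

    host_metrics: dict mapping hostname -> disk usage percent.
    """
    return sorted(h for h, pct in host_metrics.items() if pct > DISK_ALERT_PCT)


if __name__ == "__main__":
    metrics = {"web-01": 72, "web-02": 95, "db-01": 91}
    print(hosts_needing_cleanup(metrics))  # ['db-01', 'web-02']
```

In practice, a function like this would pull metrics from a monitoring system and hand the flagged hosts to an orchestration tool for remediation, which is where the real time savings accumulate.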
Continued reduced human contact will drive other automation technologies beyond 2021. As fewer humans are present in the data center, future data center designs can begin to optimize the infrastructure for machines instead of human interaction.
For example, robotic data center technologies now appear in liquid cooling systems, such as TMGcore's Otto, which allows high-density system deployment and robotic server hot swapping so admins can replace a server without being on premises.
Automation will incorporate the power of AI and machine learning (ML) to manage and maintain the data center in 2021. Traditional telemetry such as logs and alerts use human analysis and intervention — an administrator receives an alert and then looks to tools and techniques to troubleshoot and resolve the alert. But this traditional human-centric approach is no longer adequate for large and complex data centers.
The enormous volume of telemetry that modern sensors and systems generate can be far too great to yield meaningful correlations with human analysis. AI and ML software tools ingest and process this telemetry and can readily spot correlations and deviations that point to operational bottlenecks and even predict potential problems before they manifest.
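The kind of deviation-spotting described above can be illustrated with a deliberately simple statistical sketch: flag any telemetry reading that sits far from the mean. Real AIOps platforms use far more sophisticated models; the function and threshold here are assumptions for illustration only.

```python
import statistics


def anomalies(samples, threshold=3.0):
    """Return readings more than `threshold` standard deviations from the mean.

    A toy stand-in for the correlation and outlier detection that
    AI/ML-driven monitoring tools perform on telemetry streams.
    """
    mean = statistics.fmean(samples)
    stdev = statistics.stdev(samples)
    if stdev == 0:
        return []  # a perfectly flat signal has no outliers
    return [x for x in samples if abs(x - mean) / stdev > threshold]


if __name__ == "__main__":
    latency_ms = [50] * 20 + [500]  # one spike in otherwise steady readings
    print(anomalies(latency_ms))  # [500]
```

The point of the sketch is the workflow, not the math: software scans volumes of readings no human could review and surfaces only the handful worth an administrator's attention.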
By combining AI's analytical and predictive capabilities with automation's orchestration functions, this tool set can actually drive data center operations: scale resources to maintain performance, troubleshoot potential problems, and make other proactive decisions to optimize the data center in accordance with established business policies and practices.
Tools such as Splunk support AI in IT operations and predictive analytics and ML to prevent incidents from impacting the data center. More complex, cloud-centric tools include the MetalSoft automation and AI platform.
Organizations generate, store and move more data than ever before. Related technologies such as AI and ML demand enormous data volumes to analyze and correlate in order to develop business and IT intelligence. But this growing ocean of data must be managed carefully to limit volume, ensure timeliness, prevent change or deletion, and minimize movement across networks. It's a tough challenge when experts expect 70% of data to originate outside of the data center by 2022.
The problem is not the volume of data. High-density storage technologies such as magnetic disk and solid-state storage can provide enormous volumes of affordable storage. The real problem is in data management, data protection according to business and regulatory requirements, and data movement from a source to an application that can process the data to derive meaningful results for the business.
There are two main ways to address data management concerns. One, organizations must invest in bigger and faster network connectivity in order to move remote data to and from the primary data center as needed.
Two, IT teams should implement a proactive data thinning workflow and perform more data analytics and processing at the edge and only return preprocessed or analyzed data sets back to the main data center.
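The second approach, thinning data at the edge, can be sketched as a simple aggregation step: reduce raw readings to compact per-bucket summaries before transmitting them to the main data center. The bucket size and summary shape are illustrative choices, not a prescribed standard.

```python
def thin(readings, bucket_size):
    """Reduce raw readings to per-bucket (min, mean, max) summaries.

    Sending three summary values per bucket instead of every raw reading
    cuts the volume moved over the network by roughly bucket_size / 3.
    """
    summaries = []
    for i in range(0, len(readings), bucket_size):
        bucket = readings[i:i + bucket_size]
        summaries.append((min(bucket), sum(bucket) / len(bucket), max(bucket)))
    return summaries


if __name__ == "__main__":
    raw = [1, 2, 3, 4]
    print(thin(raw, 2))  # [(1, 1.5, 2), (3, 3.5, 4)]
```

An edge site running a reduction like this returns only the preprocessed summaries, keeping the heavy raw data local while the central data center still sees the trends that matter.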
2020 underscored the vital importance of remote management. When combined with the demand for superior data management, organizations should accelerate the adoption of remote data center technology in 2021; this includes edge, colocation and cloud.
Edge computing places compute and storage resources at — or as close as possible — to the point of data collection. The goal is to ease data movement needs and corresponding stress on the network and eliminate the latency involved with moving substantial data volumes over long distances. Generally, the business deploys and maintains any edge computing setups.
Colocation provides businesses with data center facilities and may also provide hosted servers, storage and networking gear for client businesses to use. The idea is to shed the enormous expenses involved in new data center builds and maintenance and, instead, rent data center space from a provider. Colocation is a popular means to create a disaster recovery installation, but organizations can also employ it more strategically to run applications and data at more remote locations around the world.
Cloud computing offers an array of resources and services that admins can use to set up operational infrastructures for even the most demanding applications. Cloud delivers strong self-service capabilities across a global footprint and allows users to add, change or remove resources and services at will, so organizations only pay for the services they actually use.
Beyond the proliferation and expansion of these remote alternatives, 2021 sets the stage for technological convergence. Edge, colocation and cloud all rely on strong automation and orchestration tools for remote access and control. They also require organizations to build additional smaller data centers in more distributed locations to meet potential edge computing demands.
Data centers have a long history of utility-provided power use and often only adopt secondary power sources for short-term backup power. Ongoing changes to the global environment have renewed the focus on power availability and reliability.
As an example, wide-ranging California wildfires caused regional utility provider PG&E to impose rolling blackouts and de-energize power lines in fire-prone areas. Unreliable utility power causes severe disruption to data center operations and can foretell profound problems for power availability.
In 2021, on-site power generation gains a new focus as businesses, colocation and cloud providers weigh the implications of utility power issues and rising transmission costs. Beyond utility disruptions, some world regions still have insufficient power generating capacity — a problem when many colocation and cloud providers seek to proliferate and expand their portfolio. More data center operators will consider on-site power generation that includes some mix of renewable power such as wind, solar and natural gas-powered fuel cells.
All Rights Reserved, Copyright 2000 – 2021, TechTarget