The Complexity of Data Center Workloads, According to Studies

According to the Uptime Institute's latest annual global data center survey, data centers are becoming more complex and continue to run most workloads, despite the promise of simplicity from hyper-converged infrastructure (HCI) and automation, and despite the expectation that the cloud would take those workloads over.

Most IT load still runs in purpose-built data centers despite cloud adoption, putting more strain on managers, who now have to handle workloads across hybrid facilities.

With workloads such as data center security, artificial intelligence (AI), and machine learning coming to the fore, facilities face greater power and cooling challenges, since AI is extremely processor-intensive. That puts pressure on data center managers, as well as on power and cooling suppliers, to keep up with the increased demand.

On top of all this, everyone is struggling to find enough workers with the right knowledge and skills.

Here is a summary of the Uptime Institute's survey report:

  • Large, privately owned data center facilities still form the backbone of enterprise IT and are expected to be running 50% of all workloads in 2021.
  • The problem of hiring and retaining staff has only gotten worse, and it affects most of the data center industry. 61% of respondents said they had difficulty retaining or hiring workers, up from the previous year.
  • Outages remain the most significant challenge for operators. 34% of all respondents said they experienced an outage or serious IT service degradation in the past year, while 50% had an outage or serious IT service degradation in the past three years.
  • 10% of all respondents said their most recent significant outage cost more than $1 million.
  • A lack of visibility, transparency, and accountability in public cloud services is a major concern for enterprises running mission-critical applications. A fifth of the operators surveyed said they would be more likely to put workloads in a public cloud if it offered more visibility, and half of those already using public cloud for mission-critical applications said they do not have adequate visibility.
  • Improvements in data center facility energy efficiency have flattened out and even declined slightly over the past two years. The average PUE for 2019 is 1.67 (see the sketch after this list).
  • Rack power density is rising after years of flat or only minor increases, prompting many to rethink their cooling strategies.
  • Power loss was the single biggest cause of outages. 60% of respondents said their data center's most recent outage could have been prevented with better management, processes, or configuration.
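
PUE (power usage effectiveness) is the ratio of total facility energy to the energy delivered to IT equipment, so the reported average of 1.67 means that for every watt reaching IT gear, roughly another 0.67 W goes to cooling, power distribution, and other overhead. Here is a minimal sketch of the calculation; the meter readings are hypothetical, for illustration only:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy / IT equipment energy.

    1.0 would be a perfect facility (every watt goes to IT gear); higher
    values mean more overhead for cooling, UPS losses, lighting, etc.
    """
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical monthly meter readings, for illustration only.
print(pue(total_facility_kwh=1_670_000, it_equipment_kwh=1_000_000))  # -> 1.67
```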

Data center providers are improving their reliability through rigorous attention to power, infrastructure, connectivity, and on-site IT redundancy, the Uptime report says. Those solutions, however, are expensive. Data center administrators are also getting distributed resiliency via active-active data centers, in which at least two live data centers replicate data to each other. Uptime found that as many as 40% of those surveyed were employing this technique.
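
To make the active-active idea concrete, here is a minimal sketch: every write is applied to two live sites, and a read can be served by either one, so losing a single site costs neither data nor availability. All class and function names here are hypothetical illustrations; production systems implement this at the database or storage layer, not in application code.

```python
from __future__ import annotations
import time

class Site:
    """A single live data center site, modeled as a key-value store."""
    def __init__(self, name: str):
        self.name = name
        self.store: dict[str, tuple[float, str]] = {}  # key -> (timestamp, value)

    def apply(self, key: str, value: str, ts: float) -> None:
        # Last-writer-wins: keep only the newest timestamped value.
        current = self.store.get(key)
        if current is None or ts > current[0]:
            self.store[key] = (ts, value)

class ActiveActivePair:
    """Two live sites: every write goes to both, reads can come from either."""
    def __init__(self, a: Site, b: Site):
        self.sites = (a, b)

    def write(self, key: str, value: str) -> None:
        ts = time.time()
        for site in self.sites:  # replicate the write to both live sites
            site.apply(key, value, ts)

    def read(self, key: str, preferred: int = 0) -> str | None:
        # Serve from the preferred site, falling back to the other on a miss.
        for site in (self.sites[preferred], self.sites[1 - preferred]):
            if key in site.store:
                return site.store[key][1]
        return None

pair = ActiveActivePair(Site("dc-east"), Site("dc-west"))
pair.write("session/42", "logged-in")
print(pair.read("session/42", preferred=1))  # either site can answer: logged-in
```

Last-writer-wins timestamps are the simplest way to reconcile concurrent writes in a sketch like this; real active-active deployments use more careful conflict resolution and dedicated replication links between sites.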

The Uptime survey was conducted in March and April of this year, with 1,100 participants from more than 50 countries. Respondents fall into two groups: the IT managers, owners, and operators of data centers, and the suppliers, designers, and consultants that service the industry.
