Learn the genesis behind the world’s first cognitive DCIM solution

April 27, 2018 | Written by: Amy Bennett

Data is the foundation of technology, its DNA, and the power behind the IoT. When it comes to data, IT professionals always want to know: How much do they have? What are the best methods for collecting it? What is it really worth? How can it be leveraged to improve customer experiences? And what are the best methods for analyzing it to gain operational visibility?

When it comes to data center operations, these questions become critical, because the answers keep our applications flowing, sending information to millions of hungry servers and mobile devices around the world. But keeping those devices fed is becoming too complicated for humans to manage on their own, so a new approach is needed to ensure that data continues to flow.

How did we get here? A brief history

It started in 1943, when the U.S. Army needed to calculate complex wartime ballistics tables. To do so, it created ENIAC (Electronic Numerical Integrator And Computer). Who could have foreseen then that the world would grow to depend on this “logical language,” that is, on data?

IBM picked up the data gauntlet and refined the computer, at great cost in both money and resources. Then, in the 1980s, the IBM PC changed everything with a small machine that could not only process information faster than the mainframes of the 1960s, but also hook up to a TV set, process text and store more words than a huge cookbook.

Today, IBM’s legacy of producing valuable data is embodied in every data center across the planet. These data centers are growing fast, and keeping all those servers running takes an ecosystem of power distribution devices, cooling technologies, data backup applications, security software, backup generators and batteries. And it keeps getting more complicated!

It all comes full circle

Every one of these data center components can be broken down further into many sub-components, on top of all the non-physical virtual servers. Exacerbating the situation is data center sprawl: mini data centers made up of edge devices, compartmentalized pop-up data centers and thousands of colocation facilities. We have now come full circle to the conundrum the Army faced in 1943: how do you process an unruly amount of information quickly and intelligently? Once again, humankind turns to the computer. Enter the age of artificial intelligence (AI).

AI can take an ever-growing pool of data, from all distributed and virtual sources, and synthesize it into an understandable format within minutes. And once again, IBM took the lead with the development of Watson, an AI supercomputer. Watson beat Ken Jennings, winner of 74 consecutive Jeopardy! games, in a human-vs.-machine showdown on primetime television. But Nlyte Software knows that Watson’s AI abilities are not merely a game. Applied to the data center information conundrum, Watson can provide a cognitive search and analytics platform that connects and analyzes all that distributed data to improve decision-making and business outcomes. Behold, a new member of the data center team, one that never takes a vacation or steals your lunch from the breakroom.

Nlyte teams with IBM Watson for first cognitive DCIM solution

Nlyte Software, founded in 2004, provides a data center infrastructure management (DCIM) solution covering energy, humidity, thermal conditions and more. It helps organizations around the world manage infrastructure in their own data centers, colocation and managed service facilities. As part of the DCIM family, Nlyte Energy Optimizer (NEO) provides real-time monitoring, alarming, trending and power systems analysis of both IT and facilities infrastructure.

Nlyte Software is combining NEO with IBM Watson’s AI capabilities to give customers a new level of operational comprehensiveness: a cognitive solution that analyzes total operations today and offers forward-looking insight into device failures.

Applying machine learning to improve data analysis and take action

The union is accomplished by embedding IBM’s Predictive Maintenance and Optimization (PMO) solution within NEO. PMO enables asset-intensive organizations to apply machine learning and analytics to improve maintenance strategies. PMO takes data streamed from NEO and applies pre-determined patterns; NEO then uses the resulting analysis to produce data center-specific reports or to take action, such as controlling set points on thermal equipment. Nlyte will provide implementation and support expertise for the combined NEO and PMO product.
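To make that stream–pattern–action loop concrete, here is a minimal, purely illustrative sketch in plain Python. The function names, thresholds and trend-based “pattern” are assumptions for illustration; they do not represent the actual NEO or PMO interfaces, which this article does not describe.

```python
# Hypothetical sketch of the flow described above: sensor data streams in,
# a "pattern" forecasts where a metric is heading, and an action is taken.
# Names and thresholds are invented; this is not the NEO or PMO API.
from collections import deque

WINDOW = 12          # number of recent readings the toy "pattern" considers
ALERT_TEMP_C = 27.0  # assumed inlet-temperature threshold for corrective action


def predict_next_temp(history: deque) -> float:
    """Naive stand-in for a PMO pattern: extrapolate the recent linear trend."""
    if len(history) < 2:
        return history[-1]
    slope = (history[-1] - history[0]) / (len(history) - 1)
    return history[-1] + slope


def adjust_setpoint(current_setpoint: float) -> float:
    """Stand-in for NEO acting on thermal equipment."""
    new_setpoint = current_setpoint - 0.5
    print(f"Lowering cooling set point: {current_setpoint:.1f} -> {new_setpoint:.1f} C")
    return new_setpoint


def run(stream, setpoint: float = 22.0) -> None:
    history = deque(maxlen=WINDOW)
    for reading in stream:              # readings streamed from sensors and PDUs
        history.append(reading)
        if predict_next_temp(history) > ALERT_TEMP_C:
            setpoint = adjust_setpoint(setpoint)


if __name__ == "__main__":
    # Simulated inlet temperatures drifting upward, as a struggling CRAC unit might cause.
    run([24.0 + 0.3 * i for i in range(20)])
```

In the actual integration, the pattern would be a PMO machine learning model rather than a simple trend line, and any action would flow through NEO’s own monitoring and control features.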

In essence, PMO does the magic in the middle: it provides the predictive and AI capabilities needed to take the data to the next level of insight and deliver increased value for customers.

NEO’s AI-infused abilities better support three data analysis pillars

With PMO added to NEO, all three data analysis pillars are covered (a simple end-to-end sketch follows the list):

  1. Collection. Capturing data from all distributed silos such as servers, sensors, HVAC, building monitoring software, PDUs, processors and many other points.
  2. Analysis. Advanced content analytics enable data center managers to understand not just what happened, but also how and why.
  3. Action. Presenting refined data visually so team members can quickly grasp current conditions and act to improve operational efficiency and cost savings.
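As referenced above, here is a small, self-contained Python sketch of those three pillars working together. The metric names, sample readings and limits are made up for illustration only; they are not output from NEO, PMO or any real facility.

```python
# Toy illustration of collection, analysis and action.
# Sources, values and limits are assumptions, not the Nlyte/IBM integration itself.
from statistics import mean


def collect() -> dict[str, list[float]]:
    """Collection: gather readings from distributed silos (simulated here)."""
    return {
        "server_inlet_temp_c": [23.5, 24.1, 26.8, 24.0],
        "pdu_load_pct": [61.0, 64.5, 88.0, 62.3],
        "hvac_supply_temp_c": [18.0, 18.2, 18.1, 21.5],
    }


def analyze(readings: dict[str, list[float]], limits: dict[str, float]) -> dict[str, dict]:
    """Analysis: summarize each metric and flag values outside its limit."""
    return {
        metric: {
            "mean": round(mean(values), 2),
            "out_of_range": [v for v in values if v > limits[metric]],
        }
        for metric, values in readings.items()
    }


def act(report: dict[str, dict]) -> None:
    """Action: surface findings so operators (or automation) can respond."""
    for metric, summary in report.items():
        status = "ATTENTION" if summary["out_of_range"] else "ok"
        print(f"{metric:<24} mean={summary['mean']:6.2f} [{status}]")


if __name__ == "__main__":
    limits = {"server_inlet_temp_c": 26.0, "pdu_load_pct": 80.0, "hvac_supply_temp_c": 21.0}
    act(analyze(collect(), limits))
```

In practice, the analysis step is where PMO’s machine learning models would replace the simple limit check shown here.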

Learn more

Get more information on IBM Predictive Maintenance & Optimization.

For information on this partnership, contact Nlyte at info@nlyte.com or call (650) 642-2700.

By Amy Bennett for IBM