When you think about data centers and computing, you probably picture companies like Google and Amazon, with machines sitting in fairly large buildings or rooms, running enormous computing capacity. The reality, however, is far more complicated.

Machine Learning, Room Overlay with Filter Options utilizing IBM’s Watson platform

I spoke with Enzo Greco, Chief Strategy Officer at Nlyte Software, about how we can move this market into a more advanced and efficient state of operations. Most importantly, the scale, complexity, and optimization required to run a data center cannot be managed in a traditional way. At least, that’s what Greco believes when it comes to the company’s business model in the age of IoT and Industry 4.0.

#1 –Location, Location, Location

Try to picture a computer server sitting in a warehouse or office at Google or Amazon. With thousands of servers, where does the next one go? Or consider the Golden 1 Center, home to the NBA’s Sacramento Kings, which runs a Tier 4 data center that powers the entire arena. You can only imagine how much energy goes into cooling facilities like that.

“We all try to be efficient in our homes and offices when it comes to optimizing our air conditioning—you can imagine the complexity with that in a data center,” said Greco.
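The cooling overhead Greco describes is usually quantified with Power Usage Effectiveness (PUE), the ratio of total facility power to power actually delivered to IT equipment; an ideal facility approaches 1.0, and cooling is typically the largest contributor to the excess. A minimal sketch (the readings below are made up for illustration, not from Nlyte):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Return Power Usage Effectiveness for one measurement interval.

    PUE = total facility power / IT equipment power. The gap above 1.0
    is overhead: cooling, power distribution losses, lighting, etc.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical interval: 1,000 kW of IT load plus 600 kW of cooling
# and other overhead gives a PUE of 1.6.
print(round(pue(1600.0, 1000.0), 2))  # 1.6
```

Trimming cooling overhead shows up directly in this ratio, which is why optimization efforts target it first.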

#2 –Availability Is Paramount

With traditional data center operations, it costs an exorbitant amount of money to run these systems: capital outlay, ongoing electrical costs, and ongoing maintenance. So how do we fix it? Greco believes there is no better use for artificial intelligence (A.I.) and machine learning than in an environment like this.

“It’s all about availability, because there is a tremendous amount of scale and complexity with new types of equipment coming in,” said Greco. “Analytics thrives on a rich set of inputs, and having a good system will determine the optimal outcome.”

#3 –Advanced Analytics

Availability is paramount, and when it is combined with the unique capabilities of an analytics system, these programs can determine an optimal outcome.

What Is “Watson”?

Think back to Jeopardy! in 2011. Watson was the IBM supercomputer that beat the show’s human champions. I was fascinated to learn that Nlyte’s software is built on top of the Watson platform.

According to IBM, Watson can read over 40 million documents in just 15 seconds, ingesting digitized data and translating it into insights. Dr. Kyu Rhee, chief health officer at IBM, believes Watson helps humans make better decisions.

In the data center context, Watson brings two major benefits:

  1.  Its success on Jeopardy sent a message throughout the industry. It did well because it could be fed any type of raw data, even if unrelated. “It was then able to make sense of it,” Greco explained. “When it comes to parsing through unrelated data, utilizing a good cognitive system like Watson makes it easier to find those hidden patterns.”
  2.  It has name recognition and credibility thanks to IBM’s backing. There is a certain trepidation around analytics, particularly among customers in the market. “The credibility and backing helps to alleviate many of these trepidations,” said Greco.

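To make the first point concrete, here is a deliberately tiny stand-in for “finding hidden patterns” in raw sensor data: flagging readings that deviate sharply from the rest with a z-score test. This is an illustration only, nothing like Watson’s actual methods, and the temperatures are invented.

```python
from statistics import mean, stdev

def outliers(readings: list[float], threshold: float = 2.0) -> list[float]:
    """Return readings more than `threshold` standard deviations
    from the mean of the list."""
    mu, sigma = mean(readings), stdev(readings)
    return [r for r in readings if abs(r - mu) > threshold * sigma]

# Hypothetical inlet temperatures (in Celsius) from one aisle;
# the 35.0 reading stands out as a likely hot spot.
temps = [21.1, 21.4, 20.9, 21.2, 35.0, 21.3, 21.0]
print(outliers(temps))  # [35.0]
```

A cognitive platform does this across many unrelated feeds at once, which is where the scale argument for something like Watson comes in.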
#4 –Looking to A.I. and Machine Learning

Technology platforms like Watson help introduce A.I. and machine learning across industries. “Our customers tend to be data centers themselves—owners, operators, big enterprises,” Greco told me. “Our platform, built on top of Watson, provides A.I. services to data center operators so they can optimize their operations.”

#5 –If You Build It, They Will Come

The data center market is beginning to move away from the traditional analytics model, shifting its attention to specific use cases. “You’re now faced with questions like ‘How do I optimize my energy usage?’ or ‘Where are my workloads running?’”

It’s no longer this “build it and they will come” approach, but rather, the specific use cases that dictate what the market wants and how it wants information delivered.
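A use case like “where are my workloads running?” boils down to a simple query against an asset inventory. The sketch below assumes a hypothetical inventory format; it is not Nlyte’s data model, just an illustration of the use-case-first framing.

```python
# A made-up asset inventory mapping workloads to physical locations.
inventory = [
    {"workload": "billing-db", "server": "srv-014", "rack": "R7", "row": "B"},
    {"workload": "web-tier",   "server": "srv-021", "rack": "R2", "row": "A"},
    {"workload": "billing-db", "server": "srv-015", "rack": "R7", "row": "B"},
]

def locate(workload: str) -> set[tuple[str, str]]:
    """Return the distinct (row, rack) locations hosting a workload."""
    return {(a["row"], a["rack"]) for a in inventory if a["workload"] == workload}

print(locate("billing-db"))  # {('B', 'R7')}
```

The point is that the question drives the tooling: the market wants direct answers to operational questions, not a general analytics dashboard it has to interrogate itself.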

#6 –Workplace Atmosphere

If you were to look at the spectrum of those working in a traditional data center, you’d find people who are highly experienced and most likely from the “mainframe age.”

However, it’s time to mix things up and inject a bit of millennial “zing.” Companies like Facebook and Google are taking advantage of the tech-savvy generation, bringing them in alongside their experienced data center professionals.

But why? Greco told me that millennials bring a different perspective to customers, and engage and connect with them in different ways than those who are trained to think procedurally and in numbers.

#7 –Shifting Business Models

In the data center market, there are two major business models to pay attention to:

Enterprise Data Center

For data centers owned by enterprises like Bank of America, which own, operate, and pay for them directly, the focus is on managing overall efficiency the way they always have: manually and procedurally.

Whether it’s blockchain or any other type of new technology, you want a set of triggers and trip wires so that these machines don’t overstep their bounds and, regrettably, consume the data center.

Co-Location Market

When we talk about the “co-location” market, we are talking about multi-billion-dollar players like Equinix, Digital Realty, and RagingWire. For each of these, the data center is the business; there is no financial company or government institution behind it. Why? “They are real estate operations that happen to own and operate data centers,” Greco told me. “These companies have a slightly different model because they are totally focused on availability but are especially focused on efficiency and optimization because that’s their business. They only profit if they are optimal to a very ‘Nth’ degree.”

The bottom line is that some of the more advanced companies out there are willing to adopt this technology and integrate A.I. into their operations. That’s the future of data center operations.

Drew Rossow is a contributing editor to Grit Daily. He is a criminal defense and internet attorney, writer, and adjunct law professor in Dayton, Ohio. Born and raised in Dallas, Texas, Rossow provides perspectives on social media crimes, privacy risks, millennials, and business, and consults for ABC, FOX, and NBC on the latest news in technology law.