New technologies bring about new problems, new solutions, and more new technologies
Change is a part of any industry, driven by innovators who recognize the need to grow. This philosophy was elegantly underscored in a Wall Street Journal article titled “The Genius of the Tinkerer.” The author, Steven Johnson, referenced what he called the philosophy of the adjacent possible.
“The strange and beautiful truth about the adjacent possible is that its boundaries grow as you explore them,” Johnson said. “Each new combination opens up the possibility of other new combinations.”
Johnson’s notion of the adjacent possible is a fitting description of how the data center industry has evolved and transformed itself to gain efficiencies in power, cooling, processing, and deployment.
No View, No Minibar, No Problem
In the mid-1990s, data center infrastructure started to change in terms of how it was utilized and deployed. The bulk of data center growth came in the form of enterprise data centers, particularly in banking, finance, trading, ISPs, and health care. These facilities often consisted of smaller data rooms and data closets in retrofitted high-rise buildings known as carrier hotels. The terms “carrier hotel” and “colocation” were applied because these facilities acted as hubs where many telco lines converged. Some of them sat directly on top of the large cable feeds penetrating U.S. shorelines, from Japan to California or from Europe to New York City.
Soon, the internet began to sprout servers like dandelions on a lawn, and enterprise organizations were knocking down walls to create smaller, on-premises replicas of larger data centers; the client/server movement had begun. In this mad rush to create and process information, security and data backup were largely considered marginal until the events of 9/11 exposed data center weaknesses. This stimulated the development of disaster recovery sites as a means to replicate and distribute data for near-instantaneous retrieval from distant locations.
The new focus on disaster recovery was a fundamental shift from the mindset that physical data center locations should be dictated solely by telco proximity, or by the low latency gained from placing facilities close to trading floors. Even power costs were a secondary concern compared to losing millions of dollars on a single transaction because of latency. Once the data’s integrity was secured through distributed mirror images, however, concerns soon shifted toward power costs and optimization.
The Power Struggle
Power is a nonnegotiable element that every data center needs in order to process workloads. After the security box was checked off, the focus turned to infrastructure power and cooling, internal issues that enterprises had to sort out. Colocation companies took advantage of this and began expanding to provide more efficient services that helped enterprise organizations reduce their energy expenses. The power struggle was indeed shifting in favor of colocation facilities right through the Great Recession.
Soon after, location became extremely important, and the focus was on fiber availability as well as the tax advantages offered by local municipalities and individual states to entice companies to build within their borders. Advantage, colocation! These facilities marketed themselves as efficient, cost-effective alternatives to on-premises data centers.
However, power is expensive no matter how close a data center sits to a power plant. Even colocation facilities began to pay more attention to power and infrastructure with the rise of social media and the IoT.
Innovations such as free cooling, along with alternative energy sources, have been used to reduce electrical demand. Most recently, data center infrastructure management (DCIM) solutions have started leveraging AI to provide deeper analytics and controls that increase efficiency. Considering how far the industry has progressed in efficient operations, is there still room for improvement?
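One way to make that progress concrete is the industry’s standard efficiency metric, power usage effectiveness (PUE): total facility energy divided by the energy delivered to IT equipment, where a value approaching 1.0 means less overhead lost to cooling and power conversion. The short Python sketch below illustrates the calculation; the meter readings in it are hypothetical examples, not figures from any facility discussed here.

```python
# Minimal sketch of the power usage effectiveness (PUE) calculation:
#   PUE = total facility energy / IT equipment energy
# A value approaching 1.0 means almost every watt reaches the IT load.
# The kWh figures below are hypothetical sample readings, not real facility data.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Return the PUE ratio for one measurement period."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

if __name__ == "__main__":
    # Hypothetical monthly meter readings (kWh).
    print(f"Legacy facility:  PUE = {pue(2_000_000, 1_000_000):.2f}")   # 2.00
    print(f"Free-cooled site: PUE = {pue(1_150_000, 1_000_000):.2f}")   # 1.15
```

Innovations like free cooling show up directly in this ratio as a smaller non-IT overhead.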
Pushing the Envelope
Edge computing represents the next challenge for data center and infrastructure engineers. Why? Data centers have grown so big that highly distributed, small deployments are now preferred in many cases.
The low-latency requirements of new technologies mean that 5G deployments at the edge will be the next wave of new facility builds.
Autonomous vehicles, drone deliveries, and wireless connections all drive a need for higher data speeds and more bandwidth.
But how fast do we really need our data to flow? Millisecond latencies are no longer sufficient when an autonomous vehicle is trying to distinguish between an object and a person; in that case, every nanosecond counts.
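As a rough illustration of why those latency figures matter, the following Python sketch works out how far a vehicle travels while a decision is delayed in the network. The speed and latency values are hypothetical, chosen only to show the scale.

```python
# Illustrative sketch: how far a vehicle travels while a decision is delayed
# by network latency. The speed and latency values are hypothetical, chosen
# only to show the scale of the problem.

MPH_TO_MPS = 0.44704  # 1 mph = 0.44704 m/s

def distance_during_latency(speed_mph: float, latency_seconds: float) -> float:
    """Return meters traveled during `latency_seconds` at `speed_mph`."""
    return speed_mph * MPH_TO_MPS * latency_seconds

if __name__ == "__main__":
    speed = 70.0  # mph, hypothetical highway speed
    for latency_ms in (100, 10, 1):
        meters = distance_during_latency(speed, latency_ms / 1000)
        print(f"{latency_ms:>4} ms of latency -> {meters:.3f} m traveled")
```

Even at a single millisecond of delay, a vehicle at highway speed has already moved roughly 3 cm before any response can arrive.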
From carrier hotels to vehicle-to-vehicle communications, the industry is still working to understand how these deployments will affect the power grid’s performance. Although individual 5G edge sites will have reduced power requirements, the distribution of those networks creates challenges for the companies that deploy the technology. A highly distributed power structure is, by definition, decentralized and will require new technologies, such as next-generation power distribution units and related equipment, to manage remote locations.
The adjacent possible conceptualizes that as you open one door and enter a room, three more doors present themselves. The data center industry is a real-life example: for every new technology, there’s a new problem and a new solution, which leads to new technologies, new problems, and new solutions.
About the Author
John Day is Founder and Chief Executive Officer of Gateview Technologies. He has been in the electrical and mission-critical business for nearly forty years. While John’s focus is on sales, he has also worked in an entrepreneurial capacity over the last twenty-five years, with a talent for energizing organizations in need of leadership for growth and product development. John has built many successful sales teams across several companies. His experience culminated in explosive growth at PDI, which resulted in the sale of the company in 2007. John remained a shareholder and sales leader after the acquisition, and he further developed PDI’s customer base as he initiated its expansion into international markets in Europe and Asia.
Prior to joining PDI, John spent eighteen years with Schneider Electric, where he held a variety of positions, including Director of Marketing for the North American division of Telemecanique, channel management positions, and a range of sales management roles. John’s experience at Schneider spanned the industrial, automation, construction, and data center markets.