Connectivity in daily life is increasing. It started with smartphones, but in recent years the 'smart' label has been applied to many more products, from watches to meters, even whole homes. New technology is constantly in development, driven of course by huge amounts of data. Inevitably, this is having an impact on data centres, given the growing volume of data and the processing power required.
We’re in the midst of a data boom that started back in 2017 and continues well into this year, with data centres in London leading the way, followed closely by Frankfurt, Amsterdam, and Paris. Interestingly, Brexit has so far had no measurable negative effect on data service providers in London since Article 50 was triggered; in fact, there has been an uptick in demand from hyperscale data providers.
Machine learning and virtual reality, elements that more manufacturers are looking to incorporate into standard Internet of Things (IoT) offerings, will necessitate unprecedented amounts of computer processing, data, and connectivity. Their effect on infrastructure can already be felt, with demand for data centres growing in non-typical business markets. These technologies are likely to be rolled out over the next few years, drastically changing the data centre landscape, and even the data centre itself. New-era data centres will need to be optimised around the requirements of machine-to-machine (M2M) capabilities, possibly unmanned and positioned closer to IoT networks, making it possible for software and analytics to effect changes when needed. This will also bring an opportunity for design innovation to better serve a machine that is driven and defined by software.
This change is already taking place to a degree, with edge computing emerging as a response to IoT’s widespread growth. Edge computing follows a different distribution pattern compared to traditional data centres: compute and storage capacity are placed closer to the geographic locations of consumers, allowing for the faster processing that IoT requires. This is known as ‘edge’ computing, in contrast to the ‘core’ processing model, in which data has to travel much farther to reach a central facility. In 2016, an average of 5.4 million new devices were connected to the internet every day; it’s estimated that by 2020 the number of connected devices will hit 20.8 billion globally. There’s disagreement between industry experts regarding the best way to tackle such rapid growth, with some supporting edge computing to enable faster connection and processing, whilst others believe that it’s a short-term solution that doesn’t ultimately address unsatisfactory bandwidth infrastructure.
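The latency argument for edge placement comes down to simple physics: round-trip time grows with the distance a signal must travel. The back-of-the-envelope sketch below illustrates this; the distances, processing time, and fibre speed are illustrative assumptions, not measurements from any real deployment.

```python
# Illustrative sketch: why placing compute nearer the user cuts response time.
# All figures below are assumptions for illustration only.

SPEED_IN_FIBRE_KM_S = 200_000  # signal speed in optical fibre, roughly 2/3 of c

def round_trip_ms(distance_km: float, processing_ms: float = 5.0) -> float:
    """Round-trip latency: propagation out and back, plus processing time."""
    propagation_ms = (2 * distance_km / SPEED_IN_FIBRE_KM_S) * 1000
    return propagation_ms + processing_ms

core_latency = round_trip_ms(distance_km=2000)  # distant 'core' data centre
edge_latency = round_trip_ms(distance_km=50)    # nearby 'edge' node

print(f"core: {core_latency:.1f} ms, edge: {edge_latency:.1f} ms")
# → core: 25.0 ms, edge: 5.5 ms
```

Real-world latency is dominated by routing hops, queuing, and last-mile links rather than raw propagation, but the principle holds: shorter paths mean faster responses, which is what makes edge placement attractive for latency-sensitive IoT applications.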
IoT growth will have a further effect on the way everyone (consumers, providers, data centres, etc.) approaches security. More connectivity means more opportunities and entry points for hackers to intercept data flows. Even stricter control and authentication measures will have to be implemented to ensure the safety of connections at all stages of the process. Privacy will pose additional questions, with consumers demanding more transparency about how their data is used.
We’re living at the frontier of technological advancement, and innovations are far outpacing the capacity of current infrastructure. As more devices, homes, and even cities become part of the IoT, it’s clear that data centres, too, will have to adapt in order to deal with the influx of data and the required processing capacity. What remains unclear is exactly what path this will take, since the issues faced demand a host of short- and long-term solutions pursued simultaneously.
Photograph by ColossusCloud