It’s Time to Take the Edge Discussion Beyond Latency
Edge computing is beginning to permeate our lives, and we are starting to witness the changes it brings. It is changing the way we store and use data by moving storage and processing closer to the points where data is generated. By handling data at the edge, we cut down on the back-and-forth transfer of data and thus reduce latency.
Today, most data is sent to centralized data centers for processing. This increases bandwidth usage and transit time, which in turn drive up cost and latency. Consider an example: if you send data to someone across the room and your network has no data center in your city, that data may make a round trip of a few hundred kilometers before reaching the recipient. It is not feasible for a network to build a data center in every city and town, so to address such inefficiencies, edge networking is evolving around a growing number of micro data centers.
Latency is one of the biggest problems the edge resolves, but it is not the only one. Many networks already treat the edge as an integral part of their infrastructure, one that improves the user experience by moving data from the core to the perimeter.
Consider an oil rig. It produces a plethora of data, most of it inconsequential beyond confirming that the systems are working properly. Streaming all of that data to the mainland in real time is expensive and unnecessary when only daily reports are required; an edge node on the rig can filter and summarize the data locally and forward only what matters, as sketched below. That makes the edge as important to an oil rig as it is to a self-driving car, where data must be processed in real time and a failure to do so can be fatal.
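A minimal sketch of this pattern, assuming hypothetical sensor names, readings, and thresholds: the edge node keeps raw telemetry local, counts anything alarming, and forwards only a compact daily summary.

```python
from dataclasses import dataclass
from statistics import mean

# Hypothetical alarm threshold: readings above this are worth flagging.
PRESSURE_ALARM_PSI = 5000.0

@dataclass
class Reading:
    sensor_id: str
    pressure_psi: float

def summarize_day(readings: list[Reading]) -> dict:
    """Aggregate a day's raw telemetry at the edge into a compact report.

    Only this summary (a few hundred bytes) would be sent to the mainland;
    the raw readings stay on the rig's local storage.
    """
    pressures = [r.pressure_psi for r in readings]
    alarms = [r for r in readings if r.pressure_psi > PRESSURE_ALARM_PSI]
    return {
        "samples": len(readings),
        "avg_pressure_psi": round(mean(pressures), 1),
        "max_pressure_psi": max(pressures),
        "alarm_count": len(alarms),
    }

if __name__ == "__main__":
    # Simulated day of telemetry from two sensors (illustrative values only).
    day = [Reading("pump-1", 4200.0), Reading("pump-1", 4350.5),
           Reading("pump-2", 5100.2), Reading("pump-2", 4980.0)]
    print(summarize_day(day))
```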
Local connections at the edge, together with a move toward a software-defined edge, will also make data centers more flexible: applications will define the infrastructure of a data center, rather than the other way around, as happens today. This is more significant than it sounds. Operating one large site is very different from distributing the same load across a thousand sites.
Moving data to and from a centralized data center consumes bandwidth, which is not just time intensive but cost intensive. Processing data at the edge minimizes this transit and makes handling the data cheaper.
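As a back-of-envelope illustration with hypothetical numbers (the data volume, transfer price, and filtering ratio below are all assumptions, not measured figures), the savings come from how much data never has to leave the site:

```python
# Hypothetical figures for illustration only.
daily_data_gb = 500            # raw data generated at the site each day
transfer_cost_per_gb = 0.08    # assumed network transfer price, $/GB
edge_forward_ratio = 0.05      # fraction still sent upstream after edge filtering

centralized_cost = daily_data_gb * transfer_cost_per_gb
edge_cost = daily_data_gb * edge_forward_ratio * transfer_cost_per_gb

print(f"Send everything to a central data center: ${centralized_cost:.2f}/day")
print(f"Filter at the edge, forward summaries:     ${edge_cost:.2f}/day")
```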
Traditional data center and cloud architectures are inherently centralized, which makes them more vulnerable to attacks and power outages. The edge distributes processing, storage, and applications, and it reduces the data transit during which data is most exposed to interception. This lowers the probability that any single disruption can harm the data.
Like all advancements, edge networking builds on the technologies that came before it. It is not a standalone technology but a response to new ones, and it spans multiple layers of infrastructure. The edge is a prerequisite for upcoming and growing digital advancements that demand low latency and close proximity to users. Much-discussed technologies such as 5G and self-driving cars rest on the edge; the speed at which their data must be processed can only be achieved there. The explosion of demand from IoT devices also needs the edge to see it through: these devices generate enormous volumes of data that are better stored across many micro data centers near population centers than in a single centralized facility.
We are creating quintillions of bytes of data every day, but that does not mean we need to store all of it. We need intelligence close to where data is generated, so that we can get the right data to the right place at the right time and act on it suitably.
Most of the current discussion about the edge focuses on latency. But as the edge becomes an integral part of the data center industry, it is evident that it goes well beyond tackling latency: it is shaping the future of internet infrastructure, laying the groundwork for newer technologies, and addressing issues of cost, security, and scalability in the current model.