Back from the Edge – Issue 1

March 30, 2020 chris

Back from the Edge

Human-to-human communication has evolved from viewing a website or reading an email to immersive group experiences, augmented reality and the tactile internet.

We humans have innovated, crafting smart devices and machines whose concept of communications now challenges the status quo.

A combination of containerisation and microservices allows geographical deployment of compute, pushing the service envelope outside the security of the data centre and into the badlands: a remote place where power is king and the security daemons hunt the vulnerable.

Some say we are at the dawn of a new era of compute, regaining control, compliance and security from the public cloud; others, that edge and cloud will work in tandem. As an innovations consultancy, AnotherTrail is working with both vendors and businesses to craft meaningful use cases in this space, and this series of blogs will share some of our experiences.

Welcome to the Edge.

The easiest way to explain edge computing is the iPhone's Face ID facial-recognition system. It automatically maps a pre-enrolled scan of your face against what the onboard camera is registering and grants access accordingly. The decision is made locally: the phone is acting as an edge computer.

The first key edge attribute is speed (latency). In the Face ID example we don't want to wait for the cloud to process our picture; the phone acts locally. Take this into areas like gaming, augmented reality, next-generation CDNs and the tactile internet. The time to compute, render and respond is gated by human reaction times: delay an augmented image by more than 30 ms and motion sickness may become a factor. Machine-to-machine communications, such as autonomous vehicles, require even tighter timing frameworks, well beyond the capabilities of public cloud service offerings, so edge becomes critical.
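A back-of-the-envelope latency budget makes the point. The 30 ms threshold comes from the discussion above; the per-hop figures below are illustrative round numbers we've assumed, not measurements:

```python
# Rough latency budget for an augmented-reality frame.
# All per-hop figures are illustrative assumptions, not measurements.
BUDGET_MS = 30  # delay beyond ~30 ms risks motion sickness

def total_latency(hops_ms):
    """Sum the per-hop delays for a given deployment."""
    return sum(hops_ms.values())

cloud = {
    "radio_access": 10,      # device to base station
    "internet_transit": 25,  # base station to a distant cloud region
    "cloud_compute": 8,      # render/inference in the cloud
    "return_path": 25,       # result back across the internet
}

edge = {
    "radio_access": 10,      # device to base station
    "edge_compute": 8,       # render/inference at a nearby edge node
    "return_path": 10,       # result back over the local hop
}

for name, hops in (("cloud", cloud), ("edge", edge)):
    t = total_latency(hops)
    verdict = "within" if t <= BUDGET_MS else "blows"
    print(f"{name}: {t} ms ({verdict} the {BUDGET_MS} ms budget)")
```

Even with generous assumptions, the cloud round trip exceeds the budget before any queuing or jitter is counted, while the edge path sits inside it.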

The second is volume. The act of sending zettabytes of raw data from trillions of smart devices across existing internet infrastructure and cloud communications stacks has many in the industry concerned: it wasn't built for this. Cloud egress charges may also significantly impact business viability.
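To see the scale, a simple estimate helps. The device count and per-device rate below are assumptions chosen for illustration, not industry figures:

```python
# Back-of-the-envelope estimate of raw IoT telemetry volume.
# Device count and data rate are illustrative assumptions.
devices = 1_000_000_000       # one billion sensors
bytes_per_sec = 1_000         # 1 kB/s of raw telemetry per device
seconds_per_year = 365 * 24 * 3600

raw_bytes_year = devices * bytes_per_sec * seconds_per_year
print(f"raw: {raw_bytes_year / 1e18:.1f} exabytes/year")

# Process at the edge and forward only actionable events -
# say 0.1% of the raw stream - and the economics change entirely.
forwarded = raw_bytes_year * 0.001
print(f"after edge filtering: {forwarded / 1e15:.1f} petabytes/year")
```

Even this modest scenario, well short of trillions of devices, produces tens of exabytes of raw traffic a year; filtering at the edge cuts what transits the WAN by three orders of magnitude.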

Third, security and compliance enter the arena. Do we really want all this raw data transiting the internet to the cloud? Far better to process locally, then discard.

An edge for all seasons

Within the business world resides a huge range of edge architectures. Unlike the one-size-fits-most cloud model, edge computation is a mixture of business and technical considerations. Size and functionality impact power and cost. Communications and security joust with real estate and orchestration. Protocols and standards dance with sector-specific deployments and overall service visibility.

Here are some of the basic architectures we’ve encountered recently.

  • The simplest is the Linear model, where IoT or IIoT device streams are aggregated by edge devices, but raw data is still transferred to centralised cloud compute.
  • Bespoke PLC is the ability to deploy containerised protocol engines (Modbus, OPC, etc.) linking third-party equipment securely across corporate LANs to designated offload locations.
  • Stepping up a level is Near Data AI, where cloud orchestration combines with edge compute to dynamically manage suites of sector-specific AI applications, processing raw data into actionable data locally.
  • The micro data centre, or regional compute, comes next: in effect, raw data transits only as far as localised processing.
  • Finally, existing data centres controlled by a geographical orchestration engine, enabling compute to be positioned in the most appropriate location from, say, a latency perspective.
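The "process locally, then discard" idea running through these models can be sketched in a few lines. The sensor names, window and alarm threshold here are hypothetical:

```python
# Sketch of an edge aggregator: raw sensor readings arrive at the
# edge node, which forwards upstream only a summary plus any alarm
# events, and discards the raw window locally.
# All names and thresholds are hypothetical.
from statistics import mean

THRESHOLD = 75.0  # e.g. a temperature alarm level

def aggregate(readings):
    """Reduce a window of raw readings to what the cloud needs."""
    alarms = [r for r in readings if r > THRESHOLD]
    return {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "alarms": alarms,  # raw values forwarded only for alarms
    }

window = [70.1, 71.3, 76.8, 69.9, 70.4]  # raw readings, kept local
payload = aggregate(window)              # the only data sent upstream
print(payload)
```

The same pattern scales from a single aggregator in the Linear model up to the AI-driven reduction in the Near Data model; what changes is how the reduction logic is deployed and orchestrated.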

Finally, we cannot exclude the impact of 5G. While edge computing can embrace many communications forms, such as Wi-Fi and LoRaWAN, all act as mere conduits. The emergence of SD-WAN saw embedded deep packet inspection techniques used to map basic QoS to application flows, allowing more efficient use of variable WAN resources.

But what about devices? 5G network slicing allows a mobile operator to create specific virtual networks between device and edge compute, based on, say, the latency or bandwidth needs of the overlying service or application. Take gaming: by crafting a specific slice, operators can identify the traffic type and process it locally to deliver a more engaging experience.

Next time we will investigate the emerging edge markets and consider some real use cases.

For more information please contact


