Will we rebuild 2 Tier Architectures for the Cloud to deal with Latency?

April 16, 2017 Etienne Coulon

This article highlights an important question that will pop up soon. Maybe we'll reinvent what came 30 years ago: a two-tier architecture with front-end and back-end servers, the Cloud being the back end.

IT is a wheel: from mainframes to minis, from minis to PCs with back-end servers, to two-tier architectures, distributed systems, the cloud... and now?

source: http://formtek.com/blog/edge-computing-cloud-latency-cant-support-real-time-iot-and-ai-apps/

Edge Computing: Cloud Latency Can’t Support Real-Time IoT and AI Apps

By Dick Weisinger

Is the Cloud the ultimate solution? Many of its benefits are alluring. Security has long been a pain point, but cloud security has been steadily improving and is increasingly less of an issue.

One remaining issue with the cloud that won't go away anytime soon is latency. Cloud apps are slow compared to apps that run on local machines. Network latency, the time it takes data to travel from devices to the datacenter and back, is a problem for any application that needs immediate feedback.
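To make the latency point concrete, here is a back-of-the-envelope sketch (the speeds and round-trip times are illustrative assumptions, not measurements) of how far a vehicle travels while waiting for a cloud response versus an on-board (edge) decision:

```python
def distance_travelled(speed_mps: float, rtt_ms: float) -> float:
    """Distance covered while waiting one round trip of rtt_ms milliseconds."""
    return speed_mps * (rtt_ms / 1000.0)

# Assumed figures: a car at highway speed (~30 m/s) with a
# 100 ms cloud round trip moves 3 metres before any response arrives.
cloud_drift = distance_travelled(30.0, 100.0)   # 3.0 m

# An on-board decision at an assumed ~5 ms loop moves only 0.15 m.
edge_drift = distance_travelled(30.0, 5.0)      # 0.15 m

print(f"cloud: {cloud_drift} m, edge: {edge_drift} m")
```

Whatever the exact numbers, the gap between the two scales linearly with round-trip time, which is why real-time control loops push computation to the endpoint.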

Peter Levine, general partner of Andreessen Horowitz, told the Wall Street Journal that “I have this theory that cloud computing goes away in the not-too-distant future. I believe there is now a shift, a return to the edge where computing gets done much more at the edge than in the cloud. The Internet of Everything, specifically if you think about self-driving cars, is a great example of this. A self-driving car is a data center on wheels. And it has 200-plus central processing units. It’s got to do all its computation at the endpoint and only pass back to the cloud, if it’s connected at all, important bits of curated information.”

Deepu Talla, VP and General Manager of Mobile at Nvidia, said that “by 2020, there will be 1 billion cameras in the world doing public safety and streaming data. There’s not enough upstream bandwidth available to send all this to the cloud. Latency becomes an issue in robotics and self-driving cars, applications in which decisions have to be made with lightning speed. Privacy, of course, is easier to protect when data isn’t moving around. And availability of the cloud is an issue in many parts of the world where communications are limited. We will see AI transferring to the edge.”

Thomas Bittman, analyst at Gartner, wrote that “there’s a sharp left turn coming ahead, where we need to expand our thinking beyond centralization and cloud, and toward location and distributed processing for low-latency and real-time processing. Customer experience won’t simply be defined by a web site experience. The cloud will have its role, but the edge is coming, and it’s going to be big.”

