
    The cloud saw widespread pop-culture recognition in 2014 with Sex Tape, a movie about a couple, played by Cameron Diaz and Jason Segel, whose sex tape goes viral after accidentally being uploaded to the cloud. In a panic to escape their unfortunate circumstance, Diaz’s character turns to Segel’s, desperate: “You can’t get it down from the cloud?” Segel’s character cries out, “Nobody understands the cloud. It’s a mystery!”

    At the rate things are going, the edge will receive its own Hollywood shoutout in a few years’ time, though hopefully not in the same light; the edge, like the cloud, is not that mysterious.

    What is Edge Computing?

    While the cloud refers to computing powered by large, distributed groups of servers, the edge refers to computing at the edge of the network, closer to or at the data source itself. And while edge computing refers exclusively to compute at the ingress of the network, fog computing encompasses computing anywhere along the continuum, from the cloud to the edge.


    Because we’re in the midst of countless trends that all converge on the need for computing to move from the cloud to the edge, we should expect edge computing and fog computing to become more popular topics of conversation in the near future.

    The Era of “Connected” and “Smart” Devices

    Devices that were previously unconnected to the Internet and lacked any computing resources are now smart and connected. Smart mirrors, clothing, shoes, furniture, jewelry, pet wearables, IoT sensors, and more are everywhere, and connected! Every week the media reports on yet another company working on yet another “smart” or “connected” device.

    Devices which were previously smart and connected are becoming even more powerful. Every year we see laptops, mobile phones, tablets, microcontrollers, and wearables that are faster and more advanced than the previous year’s model.

    Because all of these devices are becoming less expensive to manufacture, increasingly adopted across vertical markets, and more embedded in mainstream culture, their volume is increasing drastically. Top research firms such as Gartner and McKinsey predict a jump from the roughly 6 billion connected devices worldwide today to 20-30 billion by 2020. And these are conservative figures compared to the estimates of other industry leaders such as Cisco and Intel, who project twice as many devices.

    The end result is billions of devices connected to the Internet, possessing compute power, and generating a massive amount of data.

    Give me Realtime, or Give me Death!

    Realtime experiences are becoming the defining trait of modern applications. If any of our favorite chat, social, collaboration, gaming, or ride-sharing apps suffered a hit in perceived response time or latency, end users would quickly switch to a competing alternative. For example, when the popular messaging app WhatsApp went down for 4 hours, millions of users switched to Telegram during the outage window.

    While a lack of realtime for end users in commercial use cases could mean death for the application (due to poor user experience), a lack of realtime functionality in industrial use cases can translate into real safety issues.

    In industries such as manufacturing, oil, gas, utilities, transportation, mining, and the public sector, realtime technology is responsible for making critical health and safety decisions. For example, realtime is mission-critical for early warning systems, which are expected to collect massive amounts of sensor data in realtime, process it for anomalies using internal and external AI- and ML-based technologies, and subsequently alert operators and admins of any potential risks or issues.
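
    To make the edge-side step concrete, here is a minimal Python sketch of that kind of anomaly check, with a simple rolling z-score standing in for the AI- and ML-based processing; the alert_operators callback is a hypothetical stand-in for a real alerting channel.

        import statistics
        from collections import deque

        WINDOW = 100         # readings kept for the rolling baseline
        Z_THRESHOLD = 3.0    # readings this far from the mean trigger an alert

        readings = deque(maxlen=WINDOW)

        def is_anomalous(value):
            """Flag a reading that deviates sharply from the recent baseline."""
            if len(readings) < WINDOW:
                return False  # not enough history to judge yet
            mean = statistics.fmean(readings)
            stdev = statistics.stdev(readings) or 1e-9  # guard against a flat signal
            return abs(value - mean) / stdev > Z_THRESHOLD

        def process(value, alert_operators):
            if is_anomalous(value):
                alert_operators(f"Anomalous reading: {value}")
            readings.append(value)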

    The need for realtime becomes even more dire as technologies such as self-driving cars and the Internet of Recognition (IoR) become more prevalent. These devices depend on communicating with each other in realtime to detect, surveil, confirm, and prevent or avoid any suspicious activity or potential threats in the area.


    Cowlar takes livestock management to the edge with their connected cow collars.

    All Roads Lead to Edge Computing

    To rely on the cloud for processing is to potentially miss out on critical opportunities to act; whether the opportunity is to avert disaster, capitalize on business intelligence, or dodge a costly production mistake, the cloud’s response time may be too slow to take any effective corrective action.

    Latency aside, the cloud simply cannot keep up with the massive data volume and velocity generated by IoT. Gathering, processing, and storing data at fog nodes, gateways, and edge devices will relieve the network of potential bandwidth bottlenecks. Not every bit of data needs to be sent to the cloud, and the cloud doesn’t need to be consulted for every minor decision.
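
    As a rough sketch of that idea, an edge gateway might batch raw readings locally and forward only a periodic summary; the CLOUD_URL endpoint below is hypothetical, and a real deployment would add authentication and retries.

        import json
        import time
        import urllib.request

        CLOUD_URL = "https://example.com/ingest"  # hypothetical cloud endpoint

        def summarize(batch):
            """Reduce a window of raw readings to the few numbers the cloud needs."""
            return {
                "count": len(batch),
                "min": min(batch),
                "max": max(batch),
                "mean": sum(batch) / len(batch),
                "ts": time.time(),
            }

        def forward(batch):
            # Only the small summary crosses the network; raw data stays at the edge.
            payload = json.dumps(summarize(batch)).encode()
            req = urllib.request.Request(
                CLOUD_URL, data=payload,
                headers={"Content-Type": "application/json"},
            )
            urllib.request.urlopen(req)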

    Another benefit of acting at the edge is a more fault-tolerant, elegant design, made possible by reducing the number of moving pieces and the complexity of the application architecture. As a side effect, if upstream Internet service is interrupted, devices at the edge can continue to operate autonomously.
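
    One common way to get that autonomy is a store-and-forward buffer: keep working locally, queue what the cloud should eventually see, and drain the queue whenever the uplink returns. A minimal sketch, assuming the hypothetical flush_to_cloud call raises ConnectionError while offline:

        from collections import deque

        pending = deque(maxlen=10_000)  # bounded local buffer; oldest entries drop first

        def record(event, flush_to_cloud):
            """Queue an event locally, then drain the queue if the uplink is up."""
            pending.append(event)
            try:
                while pending:
                    flush_to_cloud(pending[0])
                    pending.popleft()  # drop only after a confirmed send
            except ConnectionError:
                pass  # offline: keep buffering, the device keeps working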

    We Still Need the Cloud

    The cloud and the traditional ‘data warehouse’ model are still needed in situations requiring heavy computing and storage resources, such as big data analytics on historical data.

    Contrary to the title, in his talk The End of Cloud Computing, Peter Levine of the venture capital firm Andreessen Horowitz actually praised the cloud for having a crucial role in an edge architecture. Peter described the edge as following a loop of “Sense, Infer, and Act” (SIA); edge devices collect data via sensors, they then extract relevance from that data, and those insights are used to make decisions for the business or application. The cloud, he described, is a source for machine learning; it continually performs analytics on consumed edge data in order to arrive at smarter algorithms. The refined models then get propagated back to the edge devices, creating even tighter, more agile SIA loops.
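
    A minimal sketch of such an SIA loop in Python, where read_sensors, infer, act, and fetch_latest_model are hypothetical stand-ins for the device’s sensor feed, local model, actuators, and cloud model registry:

        import time

        MODEL_REFRESH_SECONDS = 3600  # pull a retrained model from the cloud hourly

        def sia_loop(read_sensors, infer, act, fetch_latest_model):
            model = fetch_latest_model()          # start from the cloud-trained model
            last_refresh = time.monotonic()
            while True:
                data = read_sensors()             # Sense
                insight = infer(model, data)      # Infer, locally at the edge
                act(insight)                      # Act, with no cloud round trip
                if time.monotonic() - last_refresh > MODEL_REFRESH_SECONDS:
                    model = fetch_latest_model()  # the cloud closes the learning loop
                    last_refresh = time.monotonic()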

    Tracing History

    As Peter points out, there seems to be a pattern in technology, one that alternates between periods of centralized and distributed computing. We saw the centralized mainframe servers of the 1960s and 70s evolve into the distributed client-server architecture that took hold in the 80s.

    Come the early 2000s, we saw a return to a more centralized architecture with the advent of the mobile-cloud web architecture. Now a shift back towards a more distributed architecture – edge computing and the streaming web – is fast approaching.

    Artificial intelligence, virtual or augmented reality, streaming data analytics, and automated control loops are all becoming increasingly embedded into both consumer and industrial goods and electronics. We need the edge to deliver the realtime experience and reliability required by today’s use cases, as well as those of tomorrow.

    As the smarthome technology depicted in Hollywood movies continues to become a reality, the edge will become more commonplace. Maybe in Fast and Furious 15 we’ll see Jason Statham credit the edge for helping him maneuver through heavy traffic as he tries to shake the police off his tail. Or maybe in a future James Bond film we’ll see Q instruct Bond on using edge drone technology to pinpoint specific targets in a densely populated crowd.
