Edge computing is one of the biggest paradigm shifts for cloud computing in recent years. The concept boils down to reducing the “distance” data has to travel by moving work closer to the “edge” of the network, where the devices actually are. The term can be confusing because the “edge” doesn’t have a solid definition. The overall goal is to cut down long-distance communication between devices so that latency drops and the whole process becomes more efficient. You pull out the easy pieces which can be done on the hardware already available at the edge. Edge computing can save time and money both for the cloud service and for the people using it.
Distributed computing is not a new paradigm, but the application of edge computing (arguably) is. The general principles work out the same, but the philosophy behind them is different. Multi-host redundancy, hybrid environments, on-premise services, etc. are all examples of distributed computing, but intention determines which of them count as edge computing and which don’t.
While all edge computing is distributed computing, not all distributed computing is edge computing. The term is fuzzy if you don’t understand the intent. Let’s break down how to figure out what defines edge computing.
Defining Edge Computing
Edge computing exists because even though more and more data and computing has moved to the cloud, there are limitations (more about those later). Traffic on a Local Area Network (LAN) is effectively “free” and will be as fast as is possible for that client (as well as more secure). The internet is going to be the bottleneck for almost any network. To further complicate things, the internet is a series of Wide Area Networks (WANs) which vary in speed and quality. As the cloud fills up with increasing amounts of data and internet connections (supposedly) get faster, getting from point A to point B can still feel slower and slower, and that’s where edge computing comes in.
One of the most common examples of edge computing comes from the Internet of Things (IoT). If you have a smart camera with motion detection, does it make more sense to run a basic process on the camera or to stream everything to the cloud to be analyzed? While this was a lot harder in the past, ARM chips are now powerful (look at Apple’s M1 for what the higher end looks like), cheap, and efficient. That same smart camera can run a basic algorithm to pare down the amount of data which needs to be transferred.
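To make that concrete, even a check as crude as frame differencing is cheap enough to run on a small ARM SoC and can decide whether a clip is worth uploading at all. The sketch below is a minimal illustration, not a real camera SDK; the upload_clip callback, the threshold, and the synthetic frames are all hypothetical stand-ins.

```python
import numpy as np

MOTION_THRESHOLD = 12.0  # mean per-pixel difference; would need tuning for a real sensor

def has_motion(prev_frame: np.ndarray, frame: np.ndarray) -> bool:
    """Cheap frame-differencing check, light enough for a small edge SoC."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    return float(diff.mean()) > MOTION_THRESHOLD

def process_stream(frames, upload_clip):
    """Only hand frames to the upload_clip callback when motion is detected."""
    prev = None
    for frame in frames:
        if prev is not None and has_motion(prev, frame):
            upload_clip(frame)  # the heavy analysis still happens in the cloud
        # otherwise the frame is dropped locally and never crosses the WAN
        prev = frame

# Synthetic 8-bit grayscale frames standing in for real camera input
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    still = rng.integers(0, 255, (120, 160), dtype=np.uint8)
    moved = np.roll(still, 30, axis=1)  # something "moves" through the scene
    process_stream([still, still, moved], upload_clip=lambda f: print("uploading clip"))
```

The point isn’t the detection quality; it’s that a few lines of local work keep hours of still footage off the WAN entirely.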
Amazon Alexa devices also make use of edge computing to be practical. By offloading some of the more basic processing and features to the device, Amazon reduces its own load and improves response times for users. Video services seem like natural candidates for edge computing, but usually aren’t. They may use a content delivery network (CDN), but this is barely edge computing, even in the fuzziest of abstract descriptions.
Why Hybrid Environments Aren’t Always Edge Computing
Hybrid Exchange with Office 365 is arguably edge computing, but it’s more an incidental benefit than a primary design feature. Many hybrid environments will end up using edge computing principles, and may even be examples of edge computing, but not always intentionally. A cache or CDN can be used for edge computing, but neither is necessarily an example of it.
Where is the data going and what is the edge doing? Redundancy and reduction are different goals, but redundancy can serve as a reduction as well. If data is cached and the local copy is modified and synced, this can be seen as an example of edge computing if the purpose is to reduce inefficiency. Is the design pulling data and computation closer together on purpose, or is that just a side effect?
Is the data being acted on before transmission to reduce its volume, or is it just a smart cache? Edge computing requires more computation than blind forwarding to the cloud. A cloud backup isn’t (usually) edge computing, but a system which processes the data to make it more efficient is. If you’re routing internal emails locally to avoid having them clog up the outbound pipes, you’re arguably doing edge computing. If everything goes out all at once, you’re just acting as a cache.
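As a rough sketch of that distinction (the function names and payload shape are invented for illustration, not taken from any particular product), the first function below is just a relay, while the second acts on the data locally and ships a far smaller summary upstream.

```python
from statistics import mean

def forward_raw(readings, send):
    """Blind forwarding: every reading crosses the WAN. This is really just a cache/relay."""
    for reading in readings:
        send(reading)

def forward_reduced(readings, send, window=60):
    """Edge computing: act on the data locally, then send a much smaller summary upstream."""
    for start in range(0, len(readings), window):
        batch = readings[start:start + window]
        send({
            "count": len(batch),
            "min": min(batch),
            "max": max(batch),
            "mean": round(mean(batch), 2),
        })

if __name__ == "__main__":
    # One temperature reading per second for an hour: 3600 raw payloads vs 60 summaries.
    samples = [20.0 + (i % 7) * 0.1 for i in range(3600)]
    sent = []
    forward_reduced(samples, sent.append)
    print(f"{len(sent)} payloads instead of {len(samples)}")
```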
The Evolution of the Cloud and Edge Computing
The concepts behind edge computing make more sense when you look at the evolution of the cloud. The modern cloud (arguably) began as a series of SaaS products which turned into PaaS, IaaS, etc. until it grew into XaaS (Everything as a Service). More and more things made sense in the cloud because computing resources got cheaper, as did raw bandwidth across the greater internet.
The problem was that data grew faster than the bandwidth clients actually had. Users expected more, but their ability to move the raw data didn’t always scale with their expectations. Families had HD TVs, but not necessarily the internet connection to really make use of them, especially with their consoles and phones all saturating the pipe as well.
The IoT escalated this. People were installing 1080p smart cameras around their houses for security, syncing everything to the cloud. Amazon, Apple, and Samsung were creating voice assistants which needed some of the most complex processing around to analyze what was being said. Each device required some minimal amount of connection to stay functional, and more to be useful. Some tasks genuinely needed to be done in the cloud, but others were simple enough that shipping them there just burned bandwidth and computing resources.
On top of all of this, bulk computing got cheaper, but the cheapest chips also got smarter. Cheap ARM chips a decade ago had RAM measured in kilobytes; now you can buy a $1 chip that can run Linux, or get a full desktop replacement in a Raspberry Pi. The cost of computing went down for cloud providers, but also for the people manufacturing the equipment. It made sense to put a small System on Chip (SoC) in devices to offload some of the processing from the cloud.
The Evolution of Technology and the Limits of the Cloud
Products migrated from on-premise solutions into the cloud, and then partway back. As the cloud got smarter, it also became smarter to offload processing from it, because data is growing faster than the capacity to handle it. While it sounds like we’re returning to where we started, the difference is that the cloud still rules: with modern edge computing, none of it works without the cloud driving it all.
On top of that, residential broadband has largely stagnated in the US and other countries. Some places continue to build newer, better connections, but you’re only as fast as the weakest link. It doesn’t matter how fast the data center is if the users can’t make use of it.
As mentioned before, the cloud has gotten more and more complicated. The algorithm to decipher your voice and turn it into a command for a device is no simple thing, but simpler, less accurate voice analysis has existed for ages. Some of the preparation or even processing can be done on a local device to reduce how much has to jump to the cloud. Certain features make sense to route locally instead of relying on the cloud.
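A minimal sketch of that idea, assuming raw 16-bit signed PCM audio and using a crude energy threshold in place of a real on-device wake-word model (cloud_transcribe is a hypothetical callback, not any vendor’s API):

```python
import array

ENERGY_THRESHOLD = 500  # placeholder; a real device would run a small keyword-spotting model

def looks_like_speech(pcm_bytes: bytes) -> bool:
    """Crude local gate: is there enough audio energy to bother the cloud at all?"""
    samples = array.array("h", pcm_bytes)  # 16-bit signed PCM assumed
    if not samples:
        return False
    energy = sum(abs(s) for s in samples) / len(samples)
    return energy > ENERGY_THRESHOLD

def handle_audio(pcm_bytes: bytes, cloud_transcribe):
    """Only pay the WAN round trip when the cheap local check passes."""
    if not looks_like_speech(pcm_bytes):
        return None  # silence and background noise never leave the device
    return cloud_transcribe(pcm_bytes)  # the accurate, heavy model still lives in the cloud
```

The local check is deliberately less accurate than the cloud model; its only job is to keep the obvious non-events from ever being transmitted.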
Why Edge Computing Matters and Makes the Cloud Better
Why send a gigabyte of data for a yes/no question if you can answer it with a simple local process? Even if a local process only reduces the work by 10%, that 10% can be the difference between a user noticing the lag and the device feeling “instant”. Edge computing pushes the continual costs of data transfer and processing out to the edges of the network as far as is practical.
People want to use the cloud and get a seamless experience doing so. Edge computing means you get more performance, more efficiently, and without having to wait as long. Basically, you move some of the more common operations back out of the cloud, but you don’t really get a fully functioning on-premise solution either (usually).
The paradigm behind edge computing can be used to make products more reliable and efficient. Knowing what edge computing is used for and why can help you make better decisions about products and processes. Will a device work for you, or will it just tie up resources? If a client never wants to move away from their existing infrastructure, will a cloud solution built on edge computing principles actually make sense?
Edge computing enables you to do more with less, more efficiently (but you’re still beholden to the cloud). If your users need to hit the cloud with every request, will that benefit the process or just slow them down? I’ve seen clients swear off the cloud entirely after a bad experience, simply because they opted against an option that used edge computing. A little bit of understanding could have saved both their business and their technical service provider a lot of headache and hassle, with no real downside.
Now that you know what edge computing is, how can you make it work for you (or can you)? If a solution processes big data and you have rural DSL, is it going to make sense? Where does a product draw the edge in its edge computing, and how can that be made to benefit you? Edge computing can change everything with the cloud, but only if you understand how to enable it.