Blog

Edge Computing and Digital Transformation: A perfect match for 2021?

EBS Integrator
Sep 20, 2021

When it comes to Digital Transformation, one thing to consider is the merit of decoupling your whole infrastructure and taking the Edge Computing approach to solving your bandwidth and application latency problems. Let’s dive deeper into Edge Computing and how it fits together with Digital Transformation.

Deciding on the proper infrastructure for your enterprise, or even a single application, is a daunting task, and for good reason: there are so many things to consider, budget constraints, implementation hardships, IoT integration, scalability, planning, computation power requirements and far too many other things to list in one sentence.

All this ends up in oceans of information. One thing is certain, however: this data is predicted to keep growing, both in volume and in complexity. As we’ve harped on multiple times before, BigData is no joke.

Proper analytics is key to solving the mystery of success. But with so much information to crunch, you can quickly run out of the computation power needed to process all that data.

The bigger the data, the more computation power you need to make sense of it in a timely manner!

Unlimited Power of Edge Computing meme

This much.

One way to tackle this issue is piling up an ever-growing mountain of servers in one building or *core*, increasing your processing and computing power. Another way is relying on third-party cloud servers and renting enough *horsepower*.

But there is always a limit, be that physical hardware or simply not having enough bandwidth to transport this information without eventually clogging the streams.

Yet there is another way, a few actually, but the one we want to focus on today is Edge Computing.

What is Edge Computing

Edge Computing is a distributed information technology (IT) architecture where client data is processed at the periphery of the network, as close to the original source as possible.

What that means is that rather than sending information to a data centre or cloud *core* for processing, we send it to a closer hub, or gateway server, found at the *edge* of the internet.

Edge Computing Explained

We can either compute the data on the local premises (say, a few button presses at an ATM) or send it to a nearby server for processing, and then pass only the smaller, relevant bits of information back to the central hub.
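
To make that concrete, here’s a minimal sketch of an edge gateway that crunches raw readings locally and forwards only a compact summary to the central hub. The endpoint URL and the data shape are purely hypothetical; treat it as the general pattern rather than any specific product’s API.

```python
import json
import statistics
import urllib.request

CENTRAL_HUB_URL = "https://central-hub.example.com/ingest"  # hypothetical endpoint

def summarise_readings(raw_readings: list[float]) -> dict:
    """Process the raw data locally, at the edge, and keep only what matters."""
    return {
        "count": len(raw_readings),
        "mean": round(statistics.mean(raw_readings), 2),
        "max": max(raw_readings),
    }

def forward_to_core(summary: dict) -> None:
    """Ship only the small, pre-digested summary across the network."""
    payload = json.dumps(summary).encode("utf-8")
    request = urllib.request.Request(
        CENTRAL_HUB_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)  # a few hundred bytes instead of the raw stream

if __name__ == "__main__":
    raw = [21.4, 21.9, 35.2, 22.0]  # in practice: thousands of samples per minute
    forward_to_core(summarise_readings(raw))
```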

To understand Edge Computing, picture the internet as a massive country.

Your cities are huge data centres/servers. Everyone is trying to access them via massive highways that serve as interconnections. To send a request, you need to drive down a road and enter the highway, drive all the way to the city *core* and back to get your result. This obviously takes time, which we’ll call latency or lag.

These highways are immensely robust, but they still have a limit. And with 5G gaining momentum, the number of devices trying to “drive through” (especially IoT terminals) keeps increasing. The consequence: a huge volume of traffic that can “clog” these highways – we’ll call this congestion.

Now, if we deploy our server (the one powering our app or service) in the suburbs (or the edge of our cities), we bypass all those congested highways.

Essentially, “Edge Computing” places all the gear, data storage and computation power in smaller compartments, closer to the devices where the data is being generated and gathered.
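
To put rough numbers on the analogy, here’s a back-of-the-envelope comparison. The figures are purely illustrative assumptions, not benchmarks; the point is simply that the shorter “drive” dominates the total response time.

```python
# Illustrative numbers only: real round-trip times depend on your network.
RTT_TO_DISTANT_CORE_MS = 80   # driving "all the way to the city core" and back
RTT_TO_EDGE_NODE_MS = 5       # only reaching the nearby "suburb"
PROCESSING_MS = 10            # the same workload in both cases

core_total = RTT_TO_DISTANT_CORE_MS + PROCESSING_MS
edge_total = RTT_TO_EDGE_NODE_MS + PROCESSING_MS

print(f"Via the central core: {core_total} ms")  # 90 ms
print(f"Via an edge node:     {edge_total} ms")  # 15 ms
```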

Edge Computing use-case examples

There are dozens of different industry-specific use-cases for edge applications or structures. Each is vastly different and is heavily dictated by the rules of its respective market.

But for a better understanding, let’s look at a few examples:

Retail Services

Think of having a few dozen CCTV cameras in various places, sending out data to a machine learning algorithm for better analysis of customer behavioural patterns.

Rather than sending literal terabytes of footage to be processed by the central hub, thus clogging it, each camera sends its data to a gateway server, which runs its own ML (machine learning) algorithm.

The gateway hub processes that data on the spot, stores everything else, and only sends relevant information to a central hub. In turn, the central hub receives information from various devices (like checkout machines, internet surveys, previous data stores etc.) to make an overall analysis and provide the end-user (in our case the store manager/planner) with the final result.

Far quicker than if all that data had to make the journey to the central server and be processed there.
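
As a hedged sketch of what that gateway logic might look like: the `analyse_frame` stand-in below plays the role of the gateway’s local ML model, and the filtering rule decides what is worth forwarding. The names and thresholds are assumptions for illustration; a real pipeline would depend on your camera feed and ML stack.

```python
from dataclasses import dataclass

@dataclass
class FrameAnalysis:
    camera_id: str
    visitors: int
    dwell_seconds: float

def analyse_frame(frame: bytes, camera_id: str) -> FrameAnalysis:
    """Stand-in for the gateway's local ML inference (assumed, not a real API)."""
    return FrameAnalysis(camera_id=camera_id, visitors=3, dwell_seconds=42.0)

def is_relevant(result: FrameAnalysis) -> bool:
    """Only findings worth the bandwidth get forwarded to the central hub."""
    return result.visitors > 0 and result.dwell_seconds > 30

def process_on_gateway(frames: list[tuple[bytes, str]]) -> list[FrameAnalysis]:
    results = [analyse_frame(data, cam) for data, cam in frames]
    # The terabytes of raw footage stay (or get archived) at the edge;
    # only the compact, relevant results travel on to the core.
    return [r for r in results if is_relevant(r)]
```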

Manufacturing & Agriculture

Another example would be heavy manufacturing: factories producing heavy goods. Such enterprises rely on a number of interconnected pieces of heavy machinery and maintenance devices that work in conjunction within a closed-off network, down on the factory floor, where it’s most needed.

Likewise, agricultural users rely on edge-enabled devices (like soil and temperature sensors, alongside a plethora of other devices) and their tractors to maintain optimal planting and harvesting cycles.

Regulatory Public Services

When an event trigger occurs (or enough information has been processed), these systems feed their data into a SaaS that fills out the local health inspection registry.

This is no longer science fiction. Remember that detector that screams when there’s a gas leak? Or have you ever wondered how your energy bill gets generated? All those are standalone IoT devices that feed data into an app, and there’s a good chance they use edge computing.
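
The trigger logic on such a device can be as simple as a threshold check, so only the rare “alarm” event ever leaves the premises. The threshold value and the reporting callback below are assumptions for illustration, not a real device SDK:

```python
GAS_ALARM_PPM = 200  # assumed threshold; real detectors follow their own specs

def on_new_reading(ppm: float, report_event) -> None:
    """Runs locally on the device or gateway for every sample."""
    if ppm >= GAS_ALARM_PPM:
        # Only the event crosses the network; routine readings never leave the edge.
        report_event({"type": "gas_leak", "ppm": ppm})

# Usage: poll the sensor locally and plug in whatever reporting callback
# your SaaS/registry integration expects.
on_new_reading(250.0, report_event=print)
```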

Edge vs Centralized?

It would be unfair to say that Edge is the de facto best choice; as always, it all comes down to the type of enterprise you run, the needs you have, the customer pain points you experience and so on.

The obvious counter to putting all your nifty servers and processors on the edge is centralizing your infrastructure instead, be that via the cloud, a third-party service or an in-house mainframe.

Cloud Vs Edge

And even a few years ago, the balance of Edge vs Core was heavily in favour of cloud-based infrastructures. Both approaches have their advantages, but with the advent of IoT and increasingly powerful, low-cost micro-devices, the pendulum is steadily swinging towards the Edge.

It is simply more cost-efficient overall to decouple your infrastructure. The upfront cost might scare you, but trust us: it’s better to perform low-cost regular maintenance and replace hardware piece by piece, without worrying about latency issues or bandwidth limitations. And when something does go down (it will), better it be a smaller part than the entire system.

Decoupling also leads to better overall security, and as we’ve talked about before, ignoring the cybersecurity realities we live in can rack up quite a price tag.

Aside from requiring a hefty monetary injection, we must admit that deployment can be somewhat tricky. Because this approach to infrastructure is relatively novel, there isn’t exactly a stable “go-to” deployment pattern yet. Companies are still figuring out what works best and where to put down their “first step”.

For example, leaders like IBM and the Linux Foundation’s EdgeX Foundry provide all the necessities, from architecture planning to installation and/or custom-built devices. However, we consider EdgeX Foundry’s overall scheme to be possibly the best “template” when it comes to implementing Edge Computing in your enterprise:

EdgeX Foundry architecture template

Why Edge with DX?

It’s true: for small enterprises that do not deal with waves of information, edge computing might sound like a tough sell, if it were not for the promise of scalability.

Every business eventually grows or perishes; that is the nature of things. And if you’re going to grow, you will need to scale your operations. Integrating and planning “for the future” (with new technologies in mind) will make the eventual transition far easier and, most importantly, cheaper.

Speaking of the future, it is undoubtedly headed into the realm of ML/AI (Machine Learning/Artificial Intelligence). And here is where Edge truly shines!

The need for real-time response and massive amounts of processing power is paramount for any relatively complex algorithm. Pushing all that data through a single highway will be a massive problem, especially as algorithms take over more and more parts of our daily lives.

New technologies (and that means new services and business models) will rise tremendously in the next couple of decades. With the rise of smart cars and smart houses, new businesses are on the verge of occupying completely new and unexplored niches.

Comic on the go Cloud Edge Computing

Digital transformation is all about embracing the new era of innovation, and Edge is yet another step towards a better and more “high-tech” world.

Conclusion & Farewell

The important thing to take from this article is that Edge Computing is an overall improvement on the Cloud-based world we live in right now, and it’s the way forward given our ever-increasing reliance on IoT-based services.

Soon, most if not all services will, in all likelihood, be based on Edge Infrastructure as a logical “next step” for the Cloud. New devices like smart self-driving cars, AI-controlled delivery drones, health-monitoring devices and much more push the need to start exploring new avenues of gathering data. And data is the absolute key to success!

As we’ve harped on plenty of times before: choosing the right fit is the thing that can make or break your enterprise!

This applies both to your software architecture and how you go about building your app, and to how you plan and structure your whole system architecture! We’re big fans of decoupling altogether: a decentralized structure is much harder to take down in one blow!

Later, we’ll come back to this topic and dive even deeper into how exactly you can take apart your enterprise and adapt it to the Edge.

Tell us in the comment section: what do you personally feel the future holds? Will we continue down the path of Edge and higher degrees of decoupling, or will we eventually come full circle and embrace a central core for everything?

Stay classy business and tech nerds!