Why do we need Edge Computing for a sustainable future?

Centralized data centers consume large amounts of energy, produce significant carbon emissions, and cause considerable electronic waste. While data centers are seeing a positive trend towards using green energy, an even more sustainable approach (alongside so-called “green data centers” [1]) is to cut unnecessary cloud traffic, central computation, and storage as much as possible by shifting computation to the edge. Ideally, an Edge Computing strategy harnesses the power of devices that are already deployed (e.g. smartphones, machines, desktops, gateways), making the solution even more sustainable.

Why do Digitisation and IoT projects need to think about sustainability now?

Huge centralized data centers (cloud computing) have become a critical part of the infrastructure of a digitalized society. These large central cloud data centers produce a lot of carbon emissions as well as electrical and electronic waste. [2] Data centers are already estimated to use around 1-3% of global electricity [3] and to generate 2% of worldwide CO2 emissions (on par with the aviation industry). [4]

Around 54% of these emissions are caused by the cloud data centers of the big hyperscalers (Google, Amazon, Microsoft, Alibaba Cloud). [5] On top of this, providing and maintaining cloud infrastructure (manufacturing and shipping the hardware, buildings, and lines) also causes a huge amount of greenhouse gas emissions [3] and produces hazardous waste (e.g. toxic coolants) at the end of life. [6]


Bearing that in mind, the growth forecasts for digitization, IoT, and mobile [7] are concerning. The steady increase in data processing, storage, and traffic comes with a huge electricity demand for this industry. [8] In fact, estimates suggest the communications industry could use 20% of all the world’s electricity by 2025. [9]


Shifting to green energy is a good step. However, a more effective and ultimately longer-term solution requires looking at the current model of data storage, filtering, processing, and transfer. By implementing Edge Computing, we can reduce the amount of useless and wasteful data traversing to and from the cloud as much as possible, thus reducing overall energy requirements in the long term.

What is Edge Computing?

While until recently 90 percent of enterprise data was sent to the cloud, this is changing rapidly. In fact, according to Gartner, this number will drop to only 25 percent within the next three years. By then, most data will be stored and used locally, on the device it was created on, e.g. on smartphones, cars, trains, machines, or watches. This is called Edge Computing. Accordingly, edge devices need the same technology stack as a cloud server (just in a much smaller format): an operating system, a data storage / persistence layer (database), a networking layer, security functionalities, etc. that run efficiently on restricted hardware.

Because an application can only use the device’s own resources, which can be quite limited, inefficient applications can push a device to its limits, leading to slow response rates, crashes, and battery drain.


EDGE DEVICE ARCHITECTURE

Edge Computing is much more than simple data pre-processing, which takes advantage of only a small portion of the computing that is possible on the edge. An on-device database is a prerequisite for meaningful Edge Computing. With an on-device database, data can be stored and processed directly on the devices (the so-called edge). Only useful data is sent to the server and saved there, tremendously reducing the networking traffic and computing power used in data centers, while also making use of the computing resources of devices that are already in use. On top of that, Edge Computing provides the flexibility to operate independently of an Internet connection, enables fast real-time response rates, and cuts cloud costs.
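To make the pattern more concrete, here is a minimal, hypothetical sketch in plain Java (no specific edge database API; the class and method names are invented for illustration, and in a real application the in-memory list would be backed by an on-device database): raw sensor readings stay on the device, and only a compact summary ever leaves it.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: persist raw sensor readings on the device and
// forward only a compact summary to the cloud, instead of streaming
// every single reading upstream.
public class EdgeSensorNode {

    // Stands in for an on-device persistence layer (an embedded database
    // in a real application); here simply an in-memory list.
    private final List<Double> localReadings = new ArrayList<>();

    // Called for every new sensor reading: store it locally.
    public void onReading(double value) {
        localReadings.add(value);
    }

    // Called rarely (e.g. once per hour): send only the aggregate upstream.
    public String buildCloudSummary() {
        double min = Double.MAX_VALUE, max = -Double.MAX_VALUE, sum = 0;
        for (double v : localReadings) {
            min = Math.min(min, v);
            max = Math.max(max, v);
            sum += v;
        }
        double avg = localReadings.isEmpty() ? 0 : sum / localReadings.size();
        // One small JSON snippet instead of thousands of raw data points.
        return String.format("{\"count\":%d,\"min\":%.2f,\"max\":%.2f,\"avg\":%.2f}",
                localReadings.size(), min, max, avg);
    }

    public static void main(String[] args) {
        EdgeSensorNode node = new EdgeSensorNode();
        for (int i = 0; i < 10_000; i++) {
            node.onReading(20 + Math.random());       // simulate raw readings
        }
        System.out.println(node.buildCloudSummary()); // only this leaves the device
    }
}
```

Ten thousand readings are collected locally, but only one small summary object needs to traverse the network.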

Why is Edge Computing sustainable?

Edge Computing reduces network traffic and cloud data center usage

With Edge Computing, the amount of data traversing the network can be reduced greatly, freeing up bandwidth. Bandwidth refers to how much data can be transmitted over a network in a given time. While maximum speeds are published by the network operators, the actual speed obtained in a given network is almost always lower, because the bandwidth is shared and limited. The more data transferred at any given moment, the slower the network becomes for everyone using it. Data on the edge is also much more likely to be used, and then (due to restricted device storage) deleted when it is no longer useful.

Edge computing is optimized for efficiency

Edge “data centers” are typically more efficient than cloud data centers. As described above, resources on edge devices are restricted – as opposed to cloud infrastructure, edge devices do not scale horizontally. Therefore, every piece of the tech stack is – ideally – highly optimized for resource efficiency. Any computing done more efficiently helps reduce energy consumption, and given the huge number of devices already deployed, the worldwide impact is significant.

With Edge Computing you can put already deployed hardware to better use

On top of that, there is a realm of already deployed edge devices that is currently underused. Many existing devices are capable of fairly complex computing; when these devices send all of their data to the cloud, an opportunity is lost. Edge Computing utilizes existing hardware and infrastructure, taking advantage of the computing power that is already there. If these devices continue to be underused, we will need to build bigger and bigger central data centers, while simultaneously burdening existing network infrastructure and wasting bandwidth by senselessly sending everything to the cloud.

Cloud versus Edge: an Example

Today, many projects are built on cloud computing. Especially for first prototypes or pilots, cloud computing offers an easy and fast start. However, with scale, cloud computing often becomes too slow, expensive, and unreliable. In a typical cloud setup, data is gathered on edge devices and forwarded to the cloud for computation and storage; often a computed result is sent back. In this design, the edge devices are dumb devices that depend on a working Internet connection and a working cloud server; they do not have any intelligence or logic of their own. In a smart home cloud example, data would be sent from devices in the home, e.g. the thermostat, the door, the TV, etc., to the cloud, where it is saved and used.


If the user wants to make changes via a cloud-based mobile app while in the house, the changes are sent to the cloud, applied there, and then sent from the cloud back to the devices. When the Internet connection is down or the server is not working, the application does not work.

With Edge Computing, data stays where it is produced, used, and where it belongs – without traversing the network unnecessarily. This reduces cloud infrastructure needs in three ways: firstly, less network traffic; secondly, less central storage; and thirdly, less computational power. Instead, Edge Computing makes use of all the capable hardware already deployed in the world. In a smart home, for example, all the data could stay within the house and be used on site. Only the small part of the data that truly needs to be accessible from anywhere would be synchronized to the cloud.


Take, for example, a thermostat in such a home setting: it might produce thousands of temperature data points per minute. However, minimal changes typically do not matter, and data updates are not necessary every millisecond. On top of that, you really do not need all this data in the cloud and accessible from anywhere.

With Edge Computing, this data can stay on the edge and be used within the smart home as needed. Edge Computing enables the smart home to work fast, efficiently, and independently of a working Internet connection. In addition, the smart home owner can keep their private data to themselves and is less vulnerable to hacker attacks.
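As an illustration, here is a minimal sketch of the kind of filtering such an edge application could apply; the 0.5 °C threshold and the class and method names are made up for this example – the point is simply that only meaningful changes cross the network.

```java
// Hypothetical sketch: only temperature changes that actually matter are
// synchronized; everything else stays (and can later be discarded) on the edge.
public class ThermostatSync {

    private static final double SYNC_THRESHOLD_CELSIUS = 0.5; // assumed threshold
    private double lastSyncedValue = Double.NaN;

    // Returns true if this reading should be sent to the cloud.
    public boolean shouldSync(double currentCelsius) {
        if (Double.isNaN(lastSyncedValue)
                || Math.abs(currentCelsius - lastSyncedValue) >= SYNC_THRESHOLD_CELSIUS) {
            lastSyncedValue = currentCelsius;
            return true;
        }
        return false; // minimal change: keep it local, no network traffic
    }

    public static void main(String[] args) {
        ThermostatSync sync = new ThermostatSync();
        double[] readings = {21.00, 21.02, 21.04, 21.60, 21.61, 22.30};
        for (double r : readings) {
            System.out.printf("%.2f °C -> %s%n", r, sync.shouldSync(r) ? "sync" : "keep local");
        }
    }
}
```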

How does ObjectBox make Edge Computing even more sustainable?

ObjectBox improves the sustainability of Edge Computing with high performance and efficiency: our 10x speed advantage translates into less use of CPU and battery / electricity. With ObjectBox, devices compute ten times as much data with the same power. Due to its small size and efficiency, ObjectBox runs on restricted devices, allowing application developers to utilize existing hardware longer and/or to do more on existing infrastructure / hardware.

Alongside the performance and size advantages, ObjectBox’ Sync solution takes care of making data available where and when it is needed. It allows synchronization in an offline setting and / or to the cloud. Based on efficient syncing principles, ObjectBox Sync aims to reduce unnecessary data traffic as much as possible and is therefore perfectly suited for efficient, useful, and sustainable Edge Computing. Even when syncing the same amount of data, ObjectBox Sync reduces the bandwidth needed and thus cloud networking usage, which incidentally reduces cloud costs.

Also coming soon: ObjectBox Time Series, which will provide users with an intuitive dashboard to see the patterns behind the data and help them track thousands of data points per second in real time.

How Edge Computing enables new use cases that help make the world more sustainable

As mentioned above, there are a variety of IoT applications that help reduce waste of all kinds. These applications can have a huge impact on creating a more sustainable world, assuming the applications themselves are sustainable. Three powerful examples to demonstrate the huge impact IoT applications can have on the world:

1) Smart City Lighting: Chicago has implemented a system that saves the city approximately 10 million USD per year, and London estimates it can save up to 70% of current electricity use and costs, as well as maintenance costs, through smart public lighting systems. [10]

2) Reducing Food Waste: From farm to kitchen, IoT applications can help to reduce food waste across the food chain. Sensors used to monitor the cold chain, from field to supermarket, can ensure that food maintains a certain temperature, thus guaranteeing that products remain food safe and fresh longer, reducing food waste.

3) Reducing Water Waste: Many home and commercial building landscapes are still watered manually or on a set schedule. This is an inexact method of watering, which does not take into account the weather, soil moisture, or the water levels needed by the plants. Using smart IoT water management solutions, landscape irrigation can be reduced, saving water and improving landscape health.

These positive effects are all the more powerful when the IoT applications themselves are sustainable. 

The benefits of cloud computing are broad and powerful; however, there are costs to this technology. A combination of green data centers and Edge Computing helps to resolve these often unseen costs. With Edge Computing, we can reduce the unnecessary use of bandwidth and server capacity (which comes down to infrastructure, electricity, and physical space) while simultaneously taking advantage of underused device resources. ObjectBox amplifies these benefits with high performance on small devices and efficient data synchronization – making Edge Computing an even more sustainable solution.

Connecting database performance and business value – a fast edge database is a money saver


We frequently get asked: “Why does database performance matter?” “What is the business value of database speed?”

As a developer, it seems clear that database performance matters. At the very least, a fast database that gives you out-of-the-box speed saves time and nerves during development. Any piece of the tech stack that works super fast makes a developer’s job easier. But there is more to it. In the following, we will reason why and how database performance impacts businesses, to hopefully inspire ideas on how to quantify this for your business case.

Data should be available when needed, where needed

We all dream of a future transformed by data. Cars that drive themselves to be repaired before a failure occurs. Fridges that are restocked while we are at work. Reducing resource waste to an absolute minimum. Building sustainable cities and communities.[1] It is truly amazing what is possible today…


Then reality hits: before you can implement amazing solutions to make the world a better place for everyone, someone needs to solve the technical challenges, including hidden requirements. For example: you need the necessary data, and you need it available when needed, where needed. This often isn’t that simple. Data persistence, database speed, and data synchronization are typical non-functional or “hidden” requirements. These are prerequisite technologies that allow the application to access, process, and possibly depict the data required to answer a request (from another application or from a user), and thus enable its functionalities / features. All in all, this is a pretty fundamental requirement. And it pays off to build your app on top of a solid foundation: if you build your application on a solid foundation, every feature you dream up, no matter when, will be easier and faster to implement.

Functional and non-functional requirements – the hidden challenges of your IoT project


While you need data in any application, most often no one will write down where and how to handle it as a user story or requirement. As opposed to features, e.g. “being able to search for names in the address book”, data persistence, database speed, and often even data synchronization are “hidden requirements”. Data is just expected to be available where needed, when needed. Whether the data you need really will be available when you need it depends strongly on the database the application is using and on where this database runs. On top of that, the mechanisms you employ to exchange data between different devices (end devices, servers, …) matter.

Hidden requirements are one of the major reasons why the Industry 4.0 dream is, in many respects, still a dream and not a reality – in Europe at least – despite having been a topic for more than 10 years. [2]

Database performance 

What is a database?

A database is a piece of software that allows the storage and systematic use of digital information. A database typically allows developers to store, access, search, update, query, and otherwise manipulate data in the database via a developer language or API. These types of operations are done within an application, in the background, typically hidden from end users. Most applications need a database as part of their technology stack.

What is database performance?

We like and therefore use the following definition from Craig Mullins (2002): “Database performance can be defined as the optimization of resource use to increase throughput and minimize contention, enabling the largest possible workload to be processed.” [3]

Why does it matter if the database runs on the edge or in the cloud?

An edge database holds data on the (end) devices where the data is used – and typically also sends some parts of the data to a central place like an on-premise server or the cloud. As opposed to this, a server / cloud-based database holds all data on the server / in the cloud. Where the data sits determines from where, when, and how it can be accessed. If all data is on a central server or in the cloud, a working network connection is a prerequisite for accessing it.


It follows that edge applications are based on a distributed computing paradigm, allowing edge devices to be autonomous. Cloud-based applications, on the other hand, are based on a centralized computing paradigm, where one central instance is in charge and all other devices depend on this central instance. This significantly affects the response time of the application, the availability of the application, and, last but not least, the bandwidth needed for the application, which also translates into cloud costs.


Location matters: while a fast database gives you fast response times, if the database sits in the cloud and needs to be called from edge devices, you need to factor in the time it takes to send the request and receive the response. And with any networking involved, you cannot guarantee response times or ensure the connection is always available. While this is not database performance itself, it strongly affects application performance.
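As a rough back-of-envelope comparison (typical orders of magnitude, not measurements):

$$
t_{\text{cloud}} \approx t_{\text{network round trip}} + t_{\text{server query}}
\qquad\text{vs.}\qquad
t_{\text{edge}} \approx t_{\text{local query}}
$$

With a mobile network round trip typically in the tens to hundreds of milliseconds, and a fast on-device query typically taking well under a millisecond up to a few milliseconds, the edge path is often an order of magnitude faster – and it keeps responding when the network does not.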

The impact of database performance on your business

Database performance matters – whether your solution needs the speed because of the necessity to react in (near) real time, to keep your users (customers, employees, …) happy, productive, and buying, or simply to save costs on stronger edge hardware and the cloud. “Considering that even a single moment of latency or downtime can cost companies thousands of dollars, the speed advantages of edge computing cannot be overlooked.” [4]

The necessity of database speed for mission-critical, security relevant, (near) real-time functionalities 

If you need near real-time functionality, every piece in the tech stack matters, but the database has a particularly strong impact on the response rates of your application. Consider autonomous driving, healthcare and security applications, or IIoT solutions for production lines: any application supporting such a scenario needs to respond reliably and fast. “This is not the same as a lag in loading your favorite cat pictures. A lag in a moving vehicle scenario is a matter of life and death.” [5]

Accordingly, if end devices like cars, smartphones, health trackers, or machines on the factory floor are involved, a purely cloud-based application is not an option. Data needs to be stored and used on the devices directly. Thus, an edge database is necessary – ideally, an extremely fast one.

Examples of use cases with a need for database speed

Autonomous driving capabilities are a special edge computing case that requires significant compute power to run the algorithms in real time within the control unit of the car. As can easily be deduced from first-hand driving experience, during this kind of constant information processing and instantaneous decision making, every millisecond counts. Information processing speed and reliability (guaranteed QoS parameters) are of the essence for autonomous driving.

Moving to a purely monetary example, let’s consider roadside tolling. In roadside tolling, the edge devices on the side of the road need to process the information from a moving vehicle in order to identify the car, bill according to usage, and detect violators – ideally, they even inform the car owner of the result. As the car is constantly moving and can be going fast, all of this needs to happen in a very short amount of time. A super fast database lookup on the edge is key to avoiding lost revenue and delivering good customer service.
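Here is a hypothetical plain-Java sketch of that lookup path; the class and record names are invented for illustration, and a real roadside unit would back the map with an on-device database kept up to date by background synchronization.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Hypothetical roadside tolling sketch: the tolling unit keeps a locally
// synchronized copy of vehicle accounts so it can identify and bill a passing
// car without waiting for a round trip to a central server.
public class TollingEdgeUnit {

    record Account(String plate, String ownerId, boolean blocked) {}

    // Stands in for the on-device database, kept up to date by background sync.
    private final Map<String, Account> accountsByPlate = new HashMap<>();

    public void onSyncUpdate(Account account) {
        accountsByPlate.put(account.plate(), account); // applied whenever connectivity allows
    }

    // Must return within a few milliseconds while the vehicle is still in range.
    public String handlePassingVehicle(String plate) {
        Optional<Account> account = Optional.ofNullable(accountsByPlate.get(plate));
        if (account.isEmpty()) return "VIOLATION: unknown vehicle " + plate;
        if (account.get().blocked()) return "VIOLATION: blocked account for " + plate;
        return "BILLED: " + plate; // billing event is queued locally and synced later
    }

    public static void main(String[] args) {
        TollingEdgeUnit unit = new TollingEdgeUnit();
        unit.onSyncUpdate(new Account("M-AB 1234", "owner-42", false));
        System.out.println(unit.handlePassingVehicle("M-AB 1234")); // BILLED
        System.out.println(unit.handlePassingVehicle("B-XY 9999")); // VIOLATION
    }
}
```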

For a final example, let us look at additive manufacturing. 3D printers use layering techniques with a variety of materials to quickly create custom-designed parts. During the layering process, the controller needs to quickly and efficiently incorporate small changes in the environment (e.g. an increase in temperature) to ensure the quality and accuracy of the part. Faster and more precise manufacturing is currently limited by the I/O throughput. With a fast database, the I/O throughput is higher, allowing for more complex and finer production.

In short: a super fast database is not a nice-to-have, it is a must-have. The speed a database delivers out of the box is critical for such applications.

 

The impact of database speed on Sales, Conversions, Retention (or at least, nerves) 

There is a reason Google pushes companies to optimize their websites and mobile applications for performance: there is a wealth of research and evidence suggesting that the response rates of websites and mobile applications significantly impact user behavior. [6] Even more, several studies provide evidence that response rates impact actual buying behavior. [7] While there is less research on other digital applications, such as desktop apps or workplace software, some studies have shown that working with slow applications decreases employee satisfaction and productivity. [8]

The impact of database speed on battery, CPU, hardware and related resources

Another typical hidden requirement is resource efficiency with regard to CPU, RAM, disk space, and battery / electricity. For any application running in the cloud, these requirements are balanced in the backend as the cloud scales. It “only” adds to cloud costs (and is a waste of energy – not to mention all the infrastructure / hardware enabling that waste).

On the edge, you typically work with restricted devices, meaning you can only use the devices’ resources, which can be quite limited. Therefore, inefficient applications can push a device to its limits, leading to e.g. slow response rates, crashes, and battery drain. Security is a very necessary cross-the-stack functionality that often impacts performance. While data that stays on the edge is more challenging to hack, it still needs to be protected just like data in the cloud.

How database performance impacts the business value of your IoT application

All applications on one device share the available hardware capabilities; resource allocation is managed by the operating system. Accordingly, the more resources an application or the database uses, the less resources are available for other uses. The faster a database executes its operations, the less CPU it uses, the less battery / electricity, and typically also memory. In practice that means there are more resources available on the device to run e.g. Edge AI or Edge ML applications.


From a business value perspective that means:

  • You can save on hardware costs (CPU, RAM, disk, …): either do more on existing / chosen hardware, upgrade hardware later, or choose smaller and thus less expensive hardware.
  • You can save on energy and cloud costs: the more efficient the application, the less electricity it uses and the lower the cloud costs. This can add up tremendously as projects scale.
  • You can add more features, deliver more functionalities, and make your application more secure within a given environment.
  • You can deliver a smooth, fast user experience, enabling applications that respond in near real time.

In sum, it clearly impacts the cost structure and the value you can deliver.


Database performance impacts business value, directly and indirectly

As projects scale in size and scope, hidden requirements like database performance become apparent. At scale, small issues like delayed data or growing data volumes become big headaches. Ideally, these sorts of requirements would be at the heart of the design stage of any project – and budgeted for from the beginning. The choice of database clearly has a huge impact on the business success of IoT applications.

[1] See https://www.weforum.org/agenda/2018/01/effect-technology-sustainability-sdgs-internet-things-iot/ for IoT impact on Sustainable Development Goals (SDG)
[2] https://restart-project.eu/much-know-industry-4-0/
https://www.mdpi.com/2076-3387/9/3/71/pdf
[3] Craig Mullins, Database Administration: The Complete Guide to Practices and Procedures, 2002
[4] https://www.vxchnge.com/blog/the-5-best-benefits-of-edge-computing
[5] https://www.zdnet.com/article/why-autonomous-vehicles-will-rely-on-edge-computing-and-not-the-cloud/
[6] https://developers.google.com/web/fundamentals/performance/why-performance-matters
https://www.thinkwithgoogle.com/intl/en-154/insights-inspiration/research-data/need-mobile-speed-how-mobile-latency-impacts-publisher-revenue/
https://www.machmetrics.com/speed-blog/how-does-page-load-time-affect-your-site-revenue
https://datadome.co/bot-management-protection/website-performance-how-to-increase-your-business-by-blocking-bots/
[7] https://developers.google.com/web/fundamentals/performance/why-performance-matters
https://www.thinkwithgoogle.com/intl/en-154/insights-inspiration/research-data/need-mobile-speed-how-mobile-latency-impacts-publisher-revenue/
https://www.machmetrics.com/speed-blog/how-does-page-load-time-affect-your-site-revenue
https://datadome.co/bot-management-protection/website-performance-how-to-increase-your-business-by-blocking-bots/
[8] https://drum.lib.umd.edu/handle/1903/1233
https://www.tandfonline.com/doi/abs/10.1080/01449290500196963

 

How EV Charging Benefits from Edge Computing


Edge computing allows data to be stored and used on local devices. Integrating Edge Computing directly within electric vehicle charging infrastructure improves station usability and also allows for real-time energy management.

Car charging and electric vehicles

The era of electric vehicles (EV) is coming: Already one in every 250¹ cars on the road is electric. While it is uncertain when electric vehicles will overtake traditional combustion engine vehicles, electric is clearly the future. Car charging infrastructure is critical for electric vehicle expansion – and one of the largest bottlenecks to EV adoption. Range anxiety is still one of the primary concerns for potential EV customers,² and charging station proliferation is still far behind traditional gas stations.


State of the electric vehicle charging market

The electric vehicle charging infrastructure market is still very fragmented, with many players vying for this high-growth sector – some predictions forecast over 40% CAGR for the car charging infrastructure market in the coming years.³ Car manufacturers, oil & gas companies, OEMs, and utilities (e.g. Tesla, VW, BMW, Shell, GE, Engie, Siemens, ABB) are actively taking part in the development of the market, recognizing the need to support future EV customers and the huge growth potential. Startups in the space like EcoG, Wirelane, flexEcharge, and Elli offer solutions that focus on accessibility, efficiency, and improving end user experiences.

Why Car Charging Stations need Offline Capability (Edge Computing)

First, let’s look at the challenges a vehicle charging provider needs to solve from a basic data perspective: customers interfacing with charging stations require an account linked with basic information and payment methods. In order to charge a car, the user needs to be verified by the charging station and is often required to have a pre-booked charging slot. Typically, a user creates a new account via a website or mobile app beforehand, not on the spot at the car charging station; booking slots is also handled via a mobile app or website. However, the car charging station needs this information to allow a car to be charged.

This is only the most basic necessity. In the future, charging stations will provide more services to users, e.g. identifying user preferences like cost over speed of charging, or choosing to charge with green energy.

Depending on where the car charging station sits, it can be offline more or less often. In France, for example, there are quite a few electric car charging stations in the countryside, where the connection is typically flaky – and might not be available for days. On the other hand, there are stations that reside within a parking garage or hotel and use a fixed landline for connectivity. In the latter case, uptime can be very consistent, but you still cannot guarantee that the car charging station will always be connected.

If the charging station tries to access this data only at the moment it needs it – because a car is trying to charge – it may or may not have an Internet connection at that time, and thus the likelihood of failure is rather high. Accordingly, any new information should be pushed to the car charging stations whenever a connection is available and stored on the station. The hardware of a car charging station is capable enough to hold a lightweight database and persist data as needed and useful.
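The following is a hypothetical plain-Java sketch of that idea (names and data model invented for illustration): bookings are pushed to the station and persisted locally whenever a connection exists, so authorization works even if the station is offline at the moment a driver plugs in.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of a charging station that verifies drivers against
// locally persisted data, so charging works even while the station is offline.
public class ChargingStation {

    record Booking(String userId, String rfidTag, long slotStartEpoch) {}

    // Stands in for the station's on-device database, filled by push/sync
    // whenever a network connection is available.
    private final Map<String, Booking> bookingsByRfid = new HashMap<>();

    public void onSyncedBooking(Booking booking) {
        bookingsByRfid.put(booking.rfidTag(), booking);
    }

    // Called when a driver presents an RFID card at the station.
    public boolean authorizeCharging(String rfidTag) {
        Booking booking = bookingsByRfid.get(rfidTag);
        if (booking == null) {
            return false; // no locally known booking; cannot verify while offline
        }
        // Session start is recorded locally and synced back once online again.
        System.out.println("Charging authorized for user " + booking.userId());
        return true;
    }

    public static void main(String[] args) {
        ChargingStation station = new ChargingStation();
        station.onSyncedBooking(new Booking("user-7", "RFID-0815", 1_700_000_000L));
        System.out.println(station.authorizeCharging("RFID-0815")); // true, even offline
        System.out.println(station.authorizeCharging("RFID-9999")); // false
    }
}
```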


Choosing a data persistence layer (database) over simple caching not only ensures that no data is lost, but also allows more processing to happen on the station and enables autonomous reactions. In combination with edge synchronization, which enables the persistence layers of car charging stations (that share a data space) to synchronize with each other, fast data persistence allows for efficient load balancing as well as easily updating the configurations of all car charging stations.

 

Smart Energy Load Management – the need for fast response on the Edge

Managing energy is one of the greatest challenges for EV infrastructure providers. The difficulty here is less about the overall increase in energy consumption and more about managing, predicting, and preparing for high-demand peaks. Imagine everyone needing to charge during a large public event, or at charging stations during holiday travel times – peak demands like these need to be anticipated and planned for. A future with electric cars needs to balance demand with a combination of smart chargers, efficient energy grid management, Vehicle-to-Grid (V2G) solutions, and perhaps even on-site batteries at larger charging stations to improve time-to-charge and optimize for electricity prices.


Edge computing will play an important role in providing the real-time, accurate energy load control necessary for maintaining grid stability, particularly in emergency situations.⁴ At charging stations where many EVs plug in, smart edge nodes can balance charge schedules in real time, optimizing based on EV owner requirements without overloading local transformers.⁵ On a larger scale, smart energy meters can use real-time edge computing to shift energy quickly to high-demand locations, cutting energy from low-priority appliances, limiting charge speeds, or pulling excess energy from V2G networks.
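To illustrate what such a local balancing rule could look like at its simplest, here is a hypothetical plain-Java sketch (class names and kW figures are invented for this example): the site controller splits the locally available power equally among plugged-in vehicles, capped by what each vehicle can accept, so the transformer limit is never exceeded. A real scheduler would also redistribute any leftover power and take owner preferences into account.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch: a charging site controller splits the locally available
// power among plugged-in vehicles without exceeding the transformer limit.
public class ChargeLoadBalancer {

    // Plugged-in vehicles and the maximum power (kW) each can accept.
    private final Map<String, Double> maxRateKw = new LinkedHashMap<>();

    public void plugIn(String vehicleId, double maxKw) {
        maxRateKw.put(vehicleId, maxKw);
    }

    // Naive equal-share allocation, capped per vehicle; runs locally in real time.
    public Map<String, Double> allocate(double siteLimitKw) {
        Map<String, Double> allocation = new LinkedHashMap<>();
        double remaining = siteLimitKw;
        int vehiclesLeft = maxRateKw.size();
        for (Map.Entry<String, Double> e : maxRateKw.entrySet()) {
            double fairShare = remaining / vehiclesLeft--;
            double granted = Math.min(fairShare, e.getValue());
            allocation.put(e.getKey(), granted);
            remaining -= granted;
        }
        return allocation;
    }

    public static void main(String[] args) {
        ChargeLoadBalancer site = new ChargeLoadBalancer();
        site.plugIn("EV-1", 11.0);
        site.plugIn("EV-2", 22.0);
        site.plugIn("EV-3", 7.4);
        // Transformer limit of 30 kW shared among three vehicles (illustrative numbers).
        System.out.println(site.allocate(30.0)); // {EV-1=10.0, EV-2=10.0, EV-3=7.4}
    }
}
```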

Thinking about energy management, the conversation moves fluidly from EV charging infrastructure to smart mobility, utilities, and smart city infrastructure on a larger scale. Car charging systems will be complex and interconnected, and they will progress in alignment with other ongoing digitization efforts to create data-driven infrastructure across cities and the world. Edge computing, and base technologies like ObjectBox that enable working on the edge, are important enablers to ensure that real-time computing can happen anywhere and that digitization is affordable, scalable, and sustainable.

What is Edge Computing?


Today, over 90 percent of enterprise data is sent to the cloud. In the coming years, this number will drop to just 25 percent, according to Gartner. The rest of the data is not going anywhere: it is being stored and used locally, on the device it was created on – e.g. cars, trains, phones, machines, cameras. This is Edge Computing – and since the Corona outbreak it is more relevant than ever.

Obviously, this is cutting the discussion short. With edge consortia springing up like mushrooms, there is no lack of overlapping definitions around the terms Edge Computing and Fog Computing.


From mist to fog to edge to cloud – a short overview

To bring some light into the terminology mess: the terms “mist computing” and “cloud computing” constitute the two ends of a continuum. In our definition, the edge covers everything outside the cloud, down to any end device, however tiny and limited it may be. In this definition, there really is only the cloud and the edge.

However, some authors additionally use the terms fog computing and mist computing.

Mist computing covers the computing that takes place on really tiny, distributed, and widely dispersed devices, e.g. humidity or temperature sensors. To make it a bit more tangible: these devices are generally too small to run an operating system locally; they just generate data and send it to the network.

As opposed to mist computing, the cloud refers to huge centralized data centers. The terms “fog” and “edge” fall within this continuum and – depending on whose definition you follow – can be used interchangeably.


From edge to cloud and back: History repeating itself

If these terms seem familiar to you, that is probably because edge computing is just another cycle in a series of computing developments.

Computing has seen constant turns between centralized and distributed computing over the decades, and with recent developments in hardware capacity, we’re again entering into a decentralized cycle.


Edge Computing itself has been around for about 20 years.

One to rule them all?

Neither the cloud nor the edge is a solution for all cases. As always: it depends. There are cases where the edge makes more sense than the cloud, and vice versa. Most cases, however, need both. If you can, putting the bulk of your computational workload on the edge does make sense from an economic as well as an environmental perspective.

 Interested in learning more? Read why Android developers should care about Edge Computing or discover Edge IoT use cases.

A last word on “edge consortia”

There is no lack of consortia defining terms around edge computing – it’s a lot like the Judean People’s Front versus the People’s Front of Judea. After a year of battle, the most prominent edge consortium emerging currently seems to be EdgeX under the umbrella of the Linux Foundation – fully open source, while also largely supported and driven by Dell, who initiated it. Other notable players trying to get a foothold in this space are Deutsche Telekom with MobiledgeX and HPE with Edge Worx. A European counterpart, ECCE, formed in spring 2019 and might be worth watching, as it is supported by many industry players such as KUKA, Intel, and Huawei.