Centralized data centers consume vast amounts of energy and water, emit large quantities of CO2, and generate significant electronic waste. In fact, cloud data centers are already responsible for around 300 Mt of CO2-eq greenhouse gas emissions [1], and the energy consumption of data centers keeps growing at an exponential rate [2].
This challenge is further compounded by the exploding demand for AI workloads. With AI adoption accelerating, the demand for data center capacity is projected to grow by over 20% annually, potentially reaching ~300 GW by 2030, with around 70% of that capacity expected to host AI workloads. Gartner predicts that without sustainable AI practices, AI alone could consume more energy than the human workforce by 2025, significantly undermining carbon-zero initiatives.
While more data centers are switching to green energy [3], this alone is not nearly enough to solve the problem. A more sustainable approach is to reduce unnecessary cloud traffic, central computation, and storage as much as possible by shifting computation to the edge. In our experience, simply reducing data overhead and unnecessary data transfers can cut 60-90% of data traffic, which significantly lowers both the CO2 footprint and the costs of an application.
Edge Computing stores and uses data on or near the device on which it was created. This reduces the amount of traffic sent to the cloud and, on a large scale, has a significant impact on energy consumption and carbon emissions.
Why do digitization projects need to think about sustainability now?
Given the gravity of the climate crisis, every industry needs to assess its environmental impact and find ways to reduce its carbon footprint. The digital world, and its most valuable commodity, data, is no different. Digital transformation is ongoing, and with it the number of electronic devices and the scale of IT usage are exploding. New apps must therefore consider their carbon footprint throughout their lifecycle, especially the resources they use in operation and at scale [4].
Also, consider this: data centers are already estimated to use around 1-1.5% of global electricity [1] and to generate 2% of worldwide CO2 emissions, on par with the aviation industry [5]. Recent analyses by the Guardian suggest that the greenhouse gas emissions from the in-house data centers of major tech companies such as Google, Microsoft, Meta, and Apple are likely about 7.62 times higher than their official reports indicate [6]. On top of this, providing and maintaining cloud infrastructure (manufacturing and shipping hardware, buildings, and lines) also causes a huge amount of greenhouse gas emissions [7] and produces a lot of hazardous waste (e.g. toxic coolants) at the end of life [8].
Bearing that in mind, the growth rate of data center demand is concerning. With data processing, storage, and traffic steadily increasing, the electricity consumption of data centers is forecast to grow by 10% a year [9]. In fact, some estimates expect the communications industry to use 20% of all the world’s electricity by 2025 [10].
Shifting to green energy is a good step. However, a more effective and ultimately longer-term solution requires rethinking the current model of data storage, filtering, processing, and transfer. By implementing Edge Computing, we can cut the amount of useless and wasteful data traveling to and from the cloud as much as possible, thus reducing overall energy requirements in the long term. Of course, everyone can make a difference with their daily behavior, and for developers that is especially true: applying green coding principles helps create applications that generate lower CO2 emissions over their whole lifetime.
What is Edge Computing?
Until recently, 90% of enterprise data was sent to the cloud, but this is changing rapidly: according to Gartner, this number will drop to only 25% by 2025. By then, most data will be stored and used locally, on the device it was created on, e.g. on smartphones, cars, trains, machines, or watches. This is Edge Computing, and it is an inherently decentralized computing paradigm, as opposed to the centralized cloud computing approach. Accordingly, every edge device needs the same technology stack as a cloud server, just in a much smaller format: an operating system, a data storage / persistence layer (database), a networking layer, security functionalities, etc., all running efficiently on restricted hardware.
As an application can only use the device’s resources, which can be quite limited, inefficient applications can push a device to its limits, leading to slow response rates, crashes, and battery drain.
EDGE DEVICE ARCHITECTURE
Edge Computing is much more than simple data pre-processing, which takes advantage of only a small portion of the computing that is possible on the edge. An Edge Database is a prerequisite for meaningful Edge Computing. With an Edge Database, data can be stored and processed directly on the devices (the so-called edge). Only useful data is sent to the server and saved there, which tremendously reduces the networking traffic and computing power used in data centers, while making use of the computing resources of devices that are already in use. This greatly reduces the bandwidth and energy required by data centers. On top of that, Edge Computing provides the flexibility to operate independently of an Internet connection, enables fast real-time response rates, and cuts cloud costs.
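To make this concrete, here is a minimal Kotlin sketch of the "store everything locally, forward only what matters" pattern. The `EdgeStore` and `CloudClient` interfaces and all names are hypothetical placeholders chosen for illustration; they do not represent any specific product API:

```kotlin
// Minimal sketch: keep raw data on the device, send only compact summaries to the cloud.
// EdgeStore and CloudClient are hypothetical placeholders, not a specific product API.

data class SensorReading(val sensorId: String, val celsius: Double, val timestampMs: Long)

interface EdgeStore {                      // local, on-device persistence
    fun put(reading: SensorReading)
    fun readingsSince(timestampMs: Long): List<SensorReading>
}

interface CloudClient {                    // thin uplink to the backend
    fun sendSummary(sensorId: String, avgCelsius: Double, windowEndMs: Long)
}

class EdgeProcessor(private val store: EdgeStore, private val cloud: CloudClient) {

    // Every raw reading stays on the device.
    fun onNewReading(reading: SensorReading) = store.put(reading)

    // Only one compact summary per sensor leaves the device, e.g. once per hour.
    fun uploadHourlySummary(nowMs: Long) {
        val hourAgo = nowMs - 3_600_000
        store.readingsSince(hourAgo)
            .groupBy { it.sensorId }
            .forEach { (sensorId, readings) ->
                val avg = readings.map { it.celsius }.average()
                cloud.sendSummary(sensorId, avg, nowMs)
            }
    }
}
```

In this sketch, every raw reading stays on the device, and only one small summary per sensor and hour crosses the network.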
Why is Edge Computing sustainable?
Edge Computing reduces network traffic and data center usage
With Edge Computing, the amount of data traversing the network can be reduced greatly, freeing up bandwidth. Bandwidth is a measure of how much data a network can transfer in a given time frame, and it is shared among users. Accordingly, the more data is sent over the network at a given moment, the slower the network gets. Data on the edge is also much more likely to be useful, and indeed used, on the edge, in the context of its environment. Instead of constantly sending data streams to the cloud, it therefore makes sense to work with the data on the edge and only send the data that is really of use in the cloud (e.g. results, aggregated data, etc.).
Edge Computing is optimized for efficiency
Edge “data centers” are typically more efficient than cloud data centers. As described above, resources on edge devices are restricted. Therefore, and as opposed to cloud infrastructure, edge devices do not scale horizontally. That is one reason why every piece of the edge tech stack is – typically and ideally – highly optimized for resource efficiency. Any computing done more efficiently helps reduce energy consumption. Taking into account the huge number of devices already deployed, the worldwide impact of reducing resource use for the same operations is significant.
Edge Computing uses available hardware
There is a wealth of edge devices already deployed that is currently underused. Many existing devices are capable of data persistence, and some even of fairly complex computing. When these devices instead send all of their data to the cloud, an opportunity is lost. Edge Computing enables companies to use existing hardware and infrastructure (retrofitting), taking advantage of the computing power that is already available. If these devices continue to be underused, we will need to build ever bigger central data centers, while senselessly sending everything to the cloud burdens the existing network infrastructure and reduces available bandwidth.
Cloud versus Edge: an Example
Today, many projects are built on cloud computing. Especially for first prototypes or pilots, cloud computing offers an easy and fast start. However, at scale, cloud computing often becomes too slow, expensive, and unreliable. In a typical cloud setup, data is gathered on edge devices and forwarded to the cloud for computation and storage; often a computed result is sent back. In this design, the edge devices are dumb devices that depend on a working Internet connection and a working cloud server; they do not have any intelligence or logic of their own. In a smart home cloud example, data would be sent from the devices in the home (e.g. the thermostat, the door, the TV) to the cloud, where it is saved and used.
If the user wants to make changes via a cloud-based mobile app while in the house, the changes are sent to the cloud, applied there, and then sent from the cloud back to the devices. When the Internet connection is down or the server is not working, the application does not work.
With Edge Computing, data stays where it is produced, used, and where it belongs – without traversing the network unnecessarily. This reduces cloud infrastructure needs in three ways: less network traffic, less central storage, and less computational power. Instead, Edge Computing makes use of the capable hardware already deployed in the world. In a smart home, for example, all the data could stay within the house and be used on site. Only the small part of the data that truly needs to be accessible from anywhere would be synchronized to the cloud.
Take, for example, a thermostat in such a home setting: it might produce thousands of temperature data points per minute. However, minimal changes typically do not matter, and data updates are not necessary every millisecond. On top of that, you really do not need all this data in the cloud, accessible from anywhere.
With Edge Computing, this data can stay on the edge and be used within the smart home as needed. Edge Computing enables the smart home to work fast, efficiently, and autonomously, independent of a working Internet connection. In addition, the smart home owner keeps the private data to themselves and is less vulnerable to hacker attacks.
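As an illustration of how such on-device filtering might look, here is a small Kotlin sketch of a deadband filter that drops near-identical readings and only keeps meaningful changes. The threshold and reporting interval are arbitrary assumptions chosen for the example:

```kotlin
// Illustrative deadband filter for the thermostat example: ignore readings whose change
// is below a threshold, so only meaningful updates are stored or forwarded.

class DeadbandFilter(
    private val minDeltaCelsius: Double = 0.2,   // ignore changes smaller than 0.2 °C
    private val maxSilenceMs: Long = 60_000      // but report at least once per minute
) {
    private var lastReported: Double? = null
    private var lastReportMs: Long = 0

    /** Returns true if this reading is worth keeping or forwarding. */
    fun shouldReport(celsius: Double, nowMs: Long): Boolean {
        val last = lastReported
        val significantChange = last == null || kotlin.math.abs(celsius - last) >= minDeltaCelsius
        val tooQuiet = nowMs - lastReportMs >= maxSilenceMs
        if (significantChange || tooQuiet) {
            lastReported = celsius
            lastReportMs = nowMs
            return true
        }
        return false   // thousands of near-identical readings are simply dropped
    }
}
```

Combined with local storage, a filter like this keeps thousands of redundant readings per minute off the network entirely.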
How does ObjectBox make Edge Computing even more sustainable?
ObjectBox improves the sustainability of Edge Computing with high performance and efficiency: its 10x speed advantage translates into less CPU and battery / electricity use. With ObjectBox, devices can process ten times as much data with the same power. Due to its small size and efficiency, ObjectBox runs on restricted devices, allowing application developers to utilize existing hardware longer and/or do more with the existing infrastructure / hardware.
Alongside the performance and size advantages, ObjectBox powers on-device AI applications with its on-device vector database, which is optimized for handling AI workloads locally. This capability, coupled with the rise of small language models (SLMs), allows developers to shift AI processing from the cloud to the device.
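To give an idea of what local vector search means, the following Kotlin sketch performs a deliberately simplified, brute-force nearest-neighbor lookup over embeddings. A production on-device vector database would use an index (e.g. HNSW) instead of a linear scan; the data types and function names here are illustrative only:

```kotlin
// Conceptual sketch of an on-device nearest-neighbor lookup over embeddings.
// A real vector database would use an index such as HNSW rather than this brute-force scan.

data class Document(val id: Long, val text: String, val embedding: FloatArray)

fun cosineSimilarity(a: FloatArray, b: FloatArray): Double {
    var dot = 0.0; var normA = 0.0; var normB = 0.0
    for (i in a.indices) {
        dot += a[i] * b[i]
        normA += a[i] * a[i]
        normB += b[i] * b[i]
    }
    return dot / (Math.sqrt(normA) * Math.sqrt(normB))
}

// Find the k most similar documents to a query embedding, entirely on the device.
fun nearestNeighbors(query: FloatArray, docs: List<Document>, k: Int): List<Document> =
    docs.sortedByDescending { cosineSimilarity(query, it.embedding) }.take(k)
```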
ObjectBox’ Sync solution takes care of making data available where and when it is needed. It allows synchronization in an offline setting and / or to the cloud. Based on efficient syncing principles, ObjectBox Sync aims to reduce unnecessary data traffic as much as possible and is therefore well suited for efficient, useful, and sustainable Edge Computing. Even when syncing the same amount of data, ObjectBox Sync reduces the bandwidth needed and thus cloud networking usage, which incidentally also reduces cloud costs.
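One general principle behind efficient syncing is to transmit only what actually changed rather than whole objects. The following Kotlin snippet is a simplified, hypothetical illustration of that idea; it does not describe ObjectBox Sync’s actual protocol:

```kotlin
// Hypothetical illustration of delta-based syncing: transmit only changed fields.

data class DeviceState(val id: Long, val fields: Map<String, String>)

/** Compute a minimal delta between the last synced state and the current state. */
fun computeDelta(lastSynced: DeviceState, current: DeviceState): Map<String, String> =
    current.fields.filter { (key, value) -> lastSynced.fields[key] != value }

fun main() {
    val before = DeviceState(1, mapOf("mode" to "heat", "target" to "21.0", "name" to "Living room"))
    val after  = DeviceState(1, mapOf("mode" to "heat", "target" to "21.5", "name" to "Living room"))
    // Only {"target": "21.5"} would be transmitted instead of the full record.
    println(computeDelta(before, after))
}
```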
Finally, ObjectBox’ Time Series feature provides users with an intuitive dashboard to see the patterns behind the data, helping them track thousands of data points per second in real time.
How Edge Computing enables new use cases that help make the world more sustainable
As mentioned above, there is a variety of IoT applications that help reduce waste of all kinds. These applications can have a huge impact on creating a more sustainable world, assuming the applications themselves are sustainable. Here are three powerful examples of the impact IoT applications can have on the world:
Reducing Food Waste
From farm to kitchen, IoT applications can help reduce food waste across the food chain. Sensors that monitor the cold chain from field to supermarket can ensure that food maintains a certain temperature, guaranteeing that products remain safe and stay fresh longer, thus reducing food waste. In addition, local storage can be used to power apps that fight household waste (you can learn how to build a food sharing app yourself in Flutter with this tutorial).
Smart City Lighting
Chicago has implemented a smart public lighting system that allows the city to save approximately 10 million USD per year, and London estimates it can save up to 70% of current electricity use and costs, as well as maintenance costs, through smart public lighting systems [10].
Reducing Water Waste
Many home and commercial building landscapes are still watered manually or on a set schedule. This is an inexact method of watering, which does not take into account the weather, soil moisture, or the water levels needed by the plants. Using smart IoT water management solutions, landscape irrigation can be reduced, saving water and improving landscape health.
These positive effects are all the more powerful when the applications themselves are sustainable.
Sustainable digitization needs an edge
The benefits of cloud computing are broad and powerful; however, this technology comes at a cost. A combination of green data centers and Edge Computing helps to resolve these often unseen costs. With Edge Computing, we can reduce the unnecessary use of bandwidth and server capacity (which comes down to infrastructure, electricity, and physical space) while simultaneously taking advantage of underused device resources. With AI growing in popularity, Edge Computing will also become highly relevant for sustainable AI applications: AI workloads are very resource-intensive, and Edge AI – for example via an efficient local vector database – helps distribute these workloads in a resourceful manner, lowering overall resource use. ObjectBox amplifies these benefits with high performance on small devices and efficient data synchronization – making Edge Computing an even more sustainable solution.
- https://www.iea.org/energy-system/buildings/data-centres-and-data-transmission-networks
- https://8billiontrees.com/carbon-offsets-credits/carbon-ecological-footprint-calculators/carbon-footprint-of-data-centers/
- https://www.datacenterdynamics.com/en/opinions/future-data-centers-green/
- https://www.datacamp.com/blog/environmental-impact-data-digital-technology
- https://www.theguardian.com/technology/2024/sep/15/data-center-gas-emissions-tech
- https://medium.com/@jaychapel/aws-vs-azure-vs-google-cloud-market-share-2019-what-the-latest-data-shows-dc21f137ff1c
- https://thereader.mitpress.mit.edu/the-staggering-ecological-impacts-of-computation-and-the-cloud/
- https://link.springer.com/article/10.1007/s12053-019-09833-8#ref-CR2
- https://www.mckinsey.com/industries/technology-media-and-telecommunications/our-insights/investing-in-the-rising-data-center-economy
- https://www.researchgate.net/publication/320225452_Total_Consumer_Power_Consumption_Forecast