Car Tolling – A case for Edge Computing


Governments often face tight budgets for infrastructure development; car tolling is increasingly seen as the answer for raising funds¹, making it more and more prevalent. From 2008 to 2018 the total length of tolled roads in Europe increased by 23%² and tolling revenue in Europe increased by 37%³ to €31.3 bn. per year; similarly, from 2010 to 2015 the United States saw a 63% increase in transponders and 52% more tolling revenue, reaching $13.8 bn. in 2015. On top of that, despite car-sharing efforts, car ownership and traffic are still increasing in many countries, e.g. Germany, France, and India. Growing amounts of traffic, devices, and data points push current tolling solutions to their limits. Taking data to the edge in new and existing tolling solutions, by adding a data persistence layer and synchronizing parts of the data, can make tolling more efficient and reliable.

Setting the stage: a typical car tolling situation

A national infrastructure company has deployed several hundred car tolling stations all over the country. These stations automatically recognize passing cars either visually, by detecting license plates, or wirelessly, e.g. by receiving data from an RFID transponder in the car. To ensure that only eligible cars pass through the tolling station and violators are fined, the tolling station software needs to look up the gathered vehicle information, among millions of entries, as fast as possible. If the data look-up is not fast enough, or the data on the roadsides/tolling stations isn't up to date and in sync with the central data, the tolling station loses money.

“The importance of mobile apps is increasing for Kapsch TrafficCom so that we see ObjectBox’ edge computing database solution as an interesting future base technology for all types of mobility apps.”

Peter Ummenhofer

Executive VP Solution Management, Kapsch TrafficCom

Why edge computing and fast lookups are key to today’s car tolling systems

In general, modern nationwide tolling infrastructure consists of three systems: tolling stations operated by the respective agencies, open road tolling (also called mobile tolling) systems, and central transaction clearing houses. Within this infrastructure, all data related to violators and other operational information needs to be synchronized between these three systems in a consistent way, with as little delay as possible. If this is not the case, car tolling system operators face high monetary losses every day, on top of other problems.

Challenges of today’s car tolling systems

Today's car tolling systems are based on the fundamental idea that cars do not need to stop to be checked or charged. Thus, as cars move quickly through the scanning area, the main challenge is the amount of data that needs to be searched within a very short time frame. To be successful, the license plate needs to be read and looked up in a database in near real-time.

Near-realtime requirements

From a development perspective, this challenge is rooted in:

  • accessing data from a remote location (speed of communication, speed of network)
  • keeping data in synchronization with car tolling stations that are closer to the drivers and/or roadside units
  • database speed on remote servers
  • database speed on roadside units (car tolling edge devices)
  • limitations of existing hardware as some systems are quite old, and rolling out new hardware is expensive

Strict uptime guarantees

Furthermore, stations may shut down from time to time due to weather, power outages, vandalism, or simply technical failures. However, tolling providers generally need to provide strict uptime guarantees, and service level agreements thus often include penalty fees in case of excessive downtime. Such events cost the providers substantial amounts of money – and data loss, i.e. undetected violators, even more so.

Privacy and legal regulations

Adding to this, privacy and legal requirements differ from country to country and increase the complexity of the systems and their timing requirements. For example, in Austria the pictures and derived license plate information may only be used for checking; in case no violation was detected, they need to be removed in an unrecoverable manner.¹⁰ The data of potential violators, on the other hand, may be stored for the sole purpose of toll collection or prosecution, but only for a maximum of three years.
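
As an illustration only, a station could enforce such retention rules against its local store roughly like this (a Java sketch using an ObjectBox-style API; the PassageRecord entity, its fields, and the cut-off calculation are hypothetical and not taken from the regulation or any real tolling system):

```java
import io.objectbox.Box;
import io.objectbox.annotation.Entity;
import io.objectbox.annotation.Id;

import java.util.concurrent.TimeUnit;

// Hypothetical record of a single passage stored locally on a tolling station.
@Entity
class PassageRecord {
    @Id long id;
    String licensePlate;
    byte[] plateImage;   // picture taken at the station
    boolean violation;   // true if a toll violation was detected
    long timestampMs;    // time of the passage (epoch millis)
}

class RetentionPolicy {
    private final Box<PassageRecord> box;

    RetentionPolicy(Box<PassageRecord> box) {
        this.box = box;
    }

    // Remove picture and plate data right away when no violation was detected.
    void clearIfNoViolation(PassageRecord record) {
        if (!record.violation) {
            box.remove(record);
        }
    }

    // Purge violator records that exceed the maximum retention period (illustrative: three years).
    void purgeExpired(long nowMs) {
        long cutoffMs = nowMs - TimeUnit.DAYS.toMillis(3 * 365L);
        box.query()
           .less(PassageRecord_.timestampMs, cutoffMs) // PassageRecord_ is generated by ObjectBox
           .build()
           .remove();
    }
}
```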

Edge Computing 

Edge computing (local data storage and data sync) can help solve these challenges. Deploying local persistence on every type of tolling station, i.e. open and static stations, as well as on the central server makes it possible to meet the near real-time requirements, increase uptime (offline, flaky networks), and, last but not least, comply with privacy regulations. From a technical point of view, a solution that supports all platforms and operating systems is the most efficient approach to ensure edge persistence and data consistency across devices.

Edge database and Sync are the centerpiece of efficient car tolling solutions

There are a couple of edge databases out there, but out-of-the-box data synchronization solutions are very rare. A fast edge database that reliably persists the needed data and supports fast lookups is essential. Data synchronization guarantees that the vehicle data in the stations' internal memory is always up to date with the central server, so the station will make a decision based on the most accurate data every time. Additionally, the other systems involved in the tolling infrastructure consistently receive the most recent information with no further effort required.
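
As a rough illustration, here is a minimal sketch in Java of such a local lookup, assuming an ObjectBox-style embedded store; the VehicleRecord entity and its fields are invented for this example, and the exact query syntax varies between ObjectBox versions:

```java
import io.objectbox.Box;
import io.objectbox.BoxStore;
import io.objectbox.annotation.Entity;
import io.objectbox.annotation.Id;
import io.objectbox.annotation.Index;
import io.objectbox.query.QueryBuilder;

// Hypothetical entry in the locally synced vehicle list of a tolling station.
@Entity
class VehicleRecord {
    @Id long id;
    @Index String licensePlate; // indexed so a lookup stays fast among millions of entries
    boolean eligible;           // e.g. toll paid or a valid transponder on record
}

class PlateLookup {
    private final Box<VehicleRecord> box;

    PlateLookup(BoxStore store) {
        this.box = store.boxFor(VehicleRecord.class);
    }

    // Looks up a scanned plate in the local store; returns null if the plate is unknown.
    VehicleRecord find(String plate) {
        return box.query()
                  .equal(VehicleRecord_.licensePlate, plate, QueryBuilder.StringOrder.CASE_SENSITIVE)
                  .build()     // VehicleRecord_ is generated by ObjectBox
                  .findFirst();
    }
}
```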

Deploying such an edge persistence and data sync solution mitigates the losses from station shutdowns, and Internet connection issues are no longer a problem. The stations' operating company also no longer loses violator information for technical reasons.

Summary – Car tolling is moving to the edge

As this case study shows, edge computing is a perfect fit for modern infrastructure. In the context of car tolling, speed, reliable data storage, and synchronization are indispensable, making ObjectBox an effective solution for today's systems and future technological advancements.

If you are interested in learning more, feel free to get in touch with us! We appreciate any kind of feedback.

billiger.de Mobile App Case Study


Arne Jans


Software Developer, solute

Vivien: Hi Arne, great to talk with you today. Let’s get started by learning more about you and billiger.de.

Arne: Hi Vivien. I’ve been doing software development for more than 10 years, and API design for the last 5 years. I am currently responsible for mobile development for billiger.de, the most widely used and award-winning price comparison portal in Germany. We’re especially proud of our data security, which was just recently awarded too.

The company behind billiger.de is solute GmbH, which is based in Karlsruhe. They also have a few other brands: shopping.de, an online shopping platform for men and women, and friends communication, an online marketing agency. At billiger.de we’re about 150 employees.

Some of our stats:

  • 300,000+ active daily users on billiger.de
  • 500,000+ app downloads
  • 70 million prices in the database
  • 22,500 shops, comparing 1M products

So clearly, the database and its performance on the server side are very important. Companies update their prices all the time, and on top of that there are all kinds of vouchers that can be applied. All of these change frequently – and you never know who updates their prices when. So, you can see the challenges – from a technical standpoint but also for consumers. It is hard to get the best price.

V: Tell me more about the billiger.de app – why did you decide to go for a native app?

A: Well, to be honest there was an existing native app when I came into the company. But aside from that, it’s essential for UX. We also need some offline capability for features like the notepad function or when users are in the store without an Internet connection and scan barcodes. Once they are online again, the query goes to the cloud – and the user gets his result.

V: So are most of your users on the app? Or rather web?

A: We definitely still have more web users, but user numbers are shifting to mobile more and more. Also, our web users are often one-time users only. Our loyalty rate is much higher with app users, so we are trying to increase app installs. We're seeing that – even on the web – the majority of users are coming from mobile devices. Therefore, we relaunched the website a couple of years ago to be responsive and mobile-optimized. So we are focusing more and more on mobile, both on the website and through the app.

V: Why did you need to implement a local database? How is it implemented in your solution?

A: We need data persistence mainly for certain features. We’re still using SQLite, but it’s too much boilerplate code and too little fun. We have been using an ORM on top of SQLite until recently, but it didn’t work well in combination with Proguard on some Android versions anymore. So it resulted in lost data. We’re currently using ObjectBox in the billiger.de Pro version and in a fun new project called PricePretzel, which gives users the best price actively and tracks savings. In these projects, ObjectBox has proven its worth, so we want to migrate the billiger.de app too. 

V: Yes, SQLite with an ORM can get very messy. So, why did you choose ObjectBox as the alternative?

A: I looked at several SQLite alternatives and ObjectBox looked interesting. The main decision factors were: ease of integration, stability, and performance. But ease of use and integration were really the most important factors. Stability and enough performance were rather basic necessities. We found ObjectBox really easy to use – we did the migration and everything and because ObjectBox handles that automatically, it was really simple.



V: So did performance matter to you at all?

A: For our needs, performance was secondary. Obviously the performance needs to be good enough, but we do not have super high requirements regarding performance.

V: Do you do any sort of synchronization?

A: Synchronization obviously is a super interesting feature and we are keeping an eye on it once it is publicly released. From the setup we have, we would need to do it with a connector to our existing database. Currently the web data and app data are separated and we are working on integrating them. So, this needs synchronization. 

V: Which other tools do you use in your solution/are you excited about?

A: Retrofit from Square, a networking library, we recommend it to everyone and it works super well with ObjectBox. Both libraries work well together with our business objects. Retrofit fetches the fresh data from our servers and deserializes it into our business objects, which are then persisted with ObjectBox without any additional boilerplate code.
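
To illustrate the pattern Arne describes, here is a small hedged sketch in Java: Retrofit fetches data from a made-up endpoint and the deserialized business objects go straight into ObjectBox. The entity, service, and URL names are invented for illustration and are not billiger.de's actual code.

```java
import io.objectbox.Box;
import io.objectbox.annotation.Entity;
import io.objectbox.annotation.Id;

import java.io.IOException;
import java.util.List;

import retrofit2.Call;
import retrofit2.Retrofit;
import retrofit2.converter.gson.GsonConverterFactory;
import retrofit2.http.GET;

// Hypothetical business object shared by Retrofit (deserialization) and ObjectBox (persistence).
@Entity
class Product {
    @Id long id;      // left at 0 by the server response, assigned locally by ObjectBox
    String name;
    double price;
}

// Hypothetical Retrofit service definition.
interface ProductService {
    @GET("products")
    Call<List<Product>> latestProducts();
}

class ProductRepository {
    private final ProductService service;
    private final Box<Product> productBox;

    ProductRepository(Box<Product> productBox) {
        Retrofit retrofit = new Retrofit.Builder()
                .baseUrl("https://example.com/api/") // placeholder URL
                .addConverterFactory(GsonConverterFactory.create())
                .build();
        this.service = retrofit.create(ProductService.class);
        this.productBox = productBox;
    }

    // Fetch fresh data from the server and persist it locally; no extra mapping code needed.
    void refresh() throws IOException {
        List<Product> products = service.latestProducts().execute().body();
        if (products != null) {
            productBox.put(products);
        }
    }
}
```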

V: billiger.de has over 500,000 downloads and about 4 stars on average – how many daily users does the billiger.de app have? Do you have peak times?

A: Obviously holidays like Christmas and Easter are busier. During the day, early evenings get the most traffic – about 1000-2000 daily active users in the billiger.de app, 200 in our Pro-app, and iOS is similar. As I shared before, we get about 300k daily users on the website.

V: Thanks for sharing, and for talking with me today. Any last words?

A: Thank you for having me! I am looking forward to doing more with ObjectBox and am very excited about what comes next!

Offline-first – why Android app developers should care about Edge Computing


What is Edge Computing?

Today, over 90 percent of enterprise data is sent to the cloud. In the coming years, this number will drop to just 25 percent, according to Gartner. Where is the rest of the data going? It's not going anywhere. It is being stored and used locally, on the device it was created on. This is Edge Computing.

Mist, fog, edge, cloud – the terms

To shed some light on the terminology mess: the terms "mist" and "cloud" constitute the ends of a continuum.

Mist covers the computing that takes place on really tiny, widely distributed devices, e.g. humidity or temperature sensors. To make it a bit more tangible: these devices are generally too small to run an operating system locally. They just generate data and send it to the network.

As opposed to this, the cloud refers to huge centralized data centers.

The terms “fog” and “edge” fall within this continuum and – depending on whose definition you follow – can be used interchangeably.


Adapted from Peter Levine (Andreessen Horowitz)

Why is the edge cycle happening again just now?

The underlying megatrend enabling this shift is cheaper and more efficient hardware, as well as the emergence of edge databases. Edge databases are specifically designed to run on the edge and are therefore lean and efficient. Both the number of mobile and IoT devices and the data volumes that come with them are exploding at exponential rates. At the same time, computing capabilities on the edge level are advancing faster than those on the cloud level. So, the edge has increasingly more power – power that is currently often underused.

New applications and requirements drive the shift to the edge.


Advantages of setting your project up in the cloud

So, what about the cloud; do we still need it in this era of Edge Computing? Setting up your project in the cloud has some advantages: First of all, the setup itself is comparably easy, as the cloud servers are managed by another organization. This also means that you do not need to worry about the scalability of your servers or about data loss, i.e. the need for redundancy, which reduces overall downtime. Additionally, cloud systems are well tested, automatically updated, and often provide encryption mechanisms natively, which reduces administration effort. Generally, you can use the cloud rather quickly without worrying about lengthy and error-prone setup tasks. Using servers also centralizes the logic: clients just call a unified interface (e.g. Web/REST).

A typical IoT setup today is still often centralized, with device data sent to the cloud to be stored and processed.

 

Advantages of running your application on the edge

Running an application on the edge, e.g. on your Android phone, a smart home server, or in the car, has a couple of advantages:

  • The application works everywhere, all of the time (offline / online)
  • Great supersmooth User Experience (UX) as the app can respond in (near) realtime
  • Data stays where it was produced and belongs, the user maintains data ownership
  • Cloud / Connectivity costs go way down

Some tangible use cases for edge computing:

  • Many Mobile Games run on the edge. As a game developer you really want your users to be able to engage with the game whenever they feel like it and have the time. And as a gamer, you may well want to play when offline, for example when commuting on the underground. Also, gamers really care about the user experience with very smooth animations and high-fidelity visuals.
  • Autonomous driving as well as any human safety application needs to work independent from an Internet connection and in realtime. Imagine crashing because the car was trying to connect to the cloud or still waiting for the database to respond.
  • Smart home or smart health applications should work even when there is no connection, but moreover: Why should your personal health data leave your private space? You probably would want to own that data and keep it safe in the local environment. That way it is much less likely that someone will try to hack your individual data, as opposed to millions of centrally stored datasets.
  • Predictive maintenance apps usually need to process tons of data or high-fidelity data like video streams. Transferring all this data to the cloud usually means such high cloud costs that the project becomes unprofitable. Therefore, they are usually run on the edge and only aggregated data is transferred to a central server.
  • In Industry 4.0 / smart factory / Industrial IoT (IIoT) settings you often lack connectivity, so applications need to run on the edge.

Is the edge eating the cloud?

Unlikely. Often you want some data accessible from anywhere. Synchronizing parts of the data to the cloud (or an on-premise central server) allows you to combine many of the advantages from edge and cloud computing. Thus, the edge is a natural extension of the cloud that makes applications all the more powerful. We believe that future scenarios will often look like this:

 


This line of reasoning is supported by the fact that all major cloud companies, e.g. Amazon, Google, Microsoft, are pursuing an edge strategy. 

Why should Android developers care about Edge Computing?

Edge computing and improved speed – for Android itself, Android libraries, and related developer products – were two clear sub-themes at the 2019 Google I/O conference; obviously, low latency is a major competitive advantage.

Here is why designing your app to run on the edge will help you be successful on the Play Store: There are roughly 2.1 million Android apps to choose from in the Play Store today. To stand a chance in that market, you need to delight your users and get good app store ratings. Edge Computing delivers on the app traits users care about most: performance, security, and availability.

Users care about performance – a lot

Whenever an app responds to a query directly instead of taking a round-trip to the cloud and back, it should be faster. More importantly, you can measure and optimize more reliably, as the latency is independent from the network connection. This enables the fast high-quality digital experience consumers want.

  • Reliable performance was found to be the second most important trait for app users in a study by PacketZoom.
  • Most mobile users, namely 96%, say app performance, such as speed and responsiveness, is important for them.
  • A study by appdynamics found that more than eight out of ten respondents had deleted or uninstalled at least one mobile app because of performance issues.
  • The same study found that 44% of respondents closed the app when experiencing poor network performance and, even worse, 32% uninstalled the app altogether. They also found that the reverse is true for fast-performing and reliable apps, with usage increasing.

As an Android developer you also know that many consumers cannot tell, or do not care, why an app is not working. If the application is not working or responding very slowly, users are dissatisfied and annoyed. Therefore, as the developer you need to make sure your app always performs well.

“When your app depends on a network, latency is out of your control.”

Now, you might need to query data from the network. That's fine; if most of your app runs independently of a connection, there are tons of ways to optimize the user experience for connectivity loss and network latency.
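
One common pattern is sketched below in Java: serve reads from the local store immediately and refresh it from the network in the background, so the user never waits on a connection. The store and API interfaces here are hypothetical placeholders, not a specific library's API.

```java
import java.util.List;
import java.util.concurrent.Executor;
import java.util.concurrent.Executors;

// Hypothetical abstractions: a local store (e.g. an embedded database) and a remote API.
interface LocalArticleStore {
    List<String> getAll();
    void replaceAll(List<String> articles);
}

interface RemoteArticleApi {
    List<String> fetchAll() throws Exception; // may fail when offline or on a flaky network
}

// Cache-first read path: the UI always gets an immediate answer from local storage;
// the network is only used in the background to refresh that cache.
class ArticleLoader {
    private final LocalArticleStore local;
    private final RemoteArticleApi remote;
    private final Executor io = Executors.newSingleThreadExecutor();

    ArticleLoader(LocalArticleStore local, RemoteArticleApi remote) {
        this.local = local;
        this.remote = remote;
    }

    List<String> load() {
        List<String> cached = local.getAll(); // latency independent of the network
        io.execute(() -> {
            try {
                local.replaceAll(remote.fetchAll()); // refresh whenever a connection is available
            } catch (Exception e) {
                // offline or slow network: keep serving the cached data and retry next time
            }
        });
        return cached;
    }
}
```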

Security is a hot topic and can be a USP

Users care about security, and this is a trend that will – in the face of the loss of huge amounts of personal data by tech giants – only continue to grow. When you leave data at the edge, on the device of the user, data security is much easier to provide. On top of that, data ownership is clear, which makes data privacy easier to handle.

Data is much more secure stored in one place than when transferred over the network – possibly again and again and again. Android provides a good basis for keeping internally stored data safe. If data security and privacy are important for you, your app, or your users, think about keeping data locally and only synchronizing the data you really need accessible from anywhere. Last but not least, while an individual phone may be hacked, this is less likely to occur, and only an individual dataset is compromised (as opposed to millions of datasets).

 

Offline first – deliver an always-on-feeling

Users do not care about connectivity, they simply want to use the application when they want to – at home, in a department store, in a train, on a flight, on vacation. And when they can't access an app, their user experience is bad. Even today in a highly connected world, there are lots of times when people have no connectivity or need to switch data off to save battery. The app that supports them in these times ;) is the one they will love and use.

The most important advantage of an offline-first app is availability. Google Translate is a great example of an app that you want to be offline-capable: chances are that when you need it most, you are in a place where you do not have an (affordable) connection. But you might also appreciate being able to read and search through your mails on a plane, type WhatsApp messages that go out when you are connected again, or just enjoy a round of Subway Surfers.

Offline-first apps make it possible to move content off the server and onto the phone. If an app only has to go to the server when it needs to, rather than all the time, it will be faster and more reliable. This is particularly significant where content doesn’t change often, but users require fast access.

The improvements to Android's Neural Networks API (NNAPI) in Android Q and the reduced model sizes mean that Android phones are Edge AI ready, opening tons of new possibilities for fast apps running on the edge.

With the recent changes in the Play Store rating, boosting the technical performance of your app will have an even greater impact than before.

But what about 5G?

First of all, 5G rollouts and uptake will still take some time – but if you are reading this, you are building a business now. Secondly, while 5G will bring a faster network connection to many areas, there are caveats to a central cloud-based application that won't change:

  1. When you are mobile, at some point you will be offline or your connection flaky.
  2. Storing and transferring data to the cloud is costly.
  3. Storing and transferring data unnecessarily to the cloud is wasteful.
  4. Storing data centrally yields higher security risks for your users' data; transferring data is an additional security risk. Any data you can just keep locally is safest.
  5. If you leave the data with the user, data ownership is clear and you do not need to worry about privacy.

How do I bring my Android app to the edge?

As an Android developer, chances are, you are already doing Edge Computing in many of the apps you are developing.

First of all, offline-first does not work with typical web pages. Usually, you would go for a native app or, alternatively, a progressive web app (PWA) or similar technology. Apart from multiple UX benefits and speed, most users prefer native apps, and they still spend 80% of their mobile usage time in apps.

Secondly, for an offline-first architecture you need local storage as the primary source of data, e.g. a database. Changes to data are made in this layer. Applications can and usually do also have networking components to synchronize data to a server. However, this connection to the backend is mainly used in the background to synchronize the local database.
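
Below is a minimal sketch of such an architecture in Java, assuming ObjectBox as the local primary store; the Note entity, the NoteBackend interface, and the hand-rolled sync loop are illustrative only (a real setup might use ObjectBox Sync or another sync mechanism instead).

```java
import io.objectbox.Box;
import io.objectbox.annotation.Entity;
import io.objectbox.annotation.Id;

import java.util.List;

// Illustrative entity: a note taken by the user, possibly while offline.
@Entity
class Note {
    @Id long id;
    String text;
    boolean synced; // false until the change has been pushed to the backend
}

// Hypothetical backend interface; the real one could be Retrofit, gRPC, etc.
interface NoteBackend {
    void upload(Note note) throws Exception;
}

class NoteRepository {
    private final Box<Note> noteBox;
    private final NoteBackend backend;

    NoteRepository(Box<Note> noteBox, NoteBackend backend) {
        this.noteBox = noteBox;
        this.backend = backend;
    }

    // Writes always go to the local database first, so the app works offline.
    void addNote(String text) {
        Note note = new Note();
        note.text = text;
        noteBox.put(note);
    }

    // Called in the background whenever connectivity is available.
    void syncPending() {
        List<Note> pending = noteBox.query()
                .equal(Note_.synced, false) // Note_ is generated by ObjectBox
                .build()
                .find();
        for (Note note : pending) {
            try {
                backend.upload(note);
                note.synced = true;
                noteBox.put(note);
            } catch (Exception e) {
                break; // connection lost again: keep the note and retry on the next sync run
            }
        }
    }
}
```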

So, why doesn't everybody do it all the time? Well, there are use cases where it does not make sense to go the extra mile for the limited functionality that edge computing would bring, e.g. in a parking app. Obviously, the most relevant data points, like your car and the availability of parking spots, are changing all the time. So, you really are dependent upon a constant connection, or rather several constant connections: from the spots to the cloud and from the cloud to the app. However, there is another reason: offline-capable apps are hard. That is why we developed ObjectBox Sync.

 

Does any app need to run at the edge? ⚖️

As always, there are some cases where the cloud makes perfect sense, indeed is the only option – and the same is true for the edge. You need to assess what you want to achieve and where the value lies for your application and users.

Sustainable Computing: Why the edge is saving the world

If you do not need to push all the data to the cloud, where large chunks of it might not even be used, you might want to take a step back and consider the broader picture: What do all these billions of mobile and IoT devices (that are quite capable) do while they wait for the cloud to respond? Nothing.

Sending data to the cloud unnecessarily is wasteful in two respects: it consumes bandwidth and server capacity (which comes down to infrastructure, electricity, and physical space), and it leaves quite capable edge resources underused.

ObjectBox on Azure Sphere: Efficient IoT data persistence


Listening to our IoT users, we implemented ObjectBox support for Microsoft’s Azure Sphere. With this extension, you can use ObjectBox on tiny devices now. But let us explain a bit more…


What is Edge computing?

Centralized computing entails a central computer storing and processing all data with multiple machines (clients) accessing it. Decentralized computing has no central instance and data is stored and processed on the machine it is used on. The currently predominant computing paradigm, namely cloud computing, is centralized.

The Internet of Things (IoT) is pushing the industry once again towards a distributed computing paradigm. In this context it is called Edge Computing. Edge Computing aims to store and process data on end devices (so-called edge devices or nodes) like smartphones, routers, and IoT end devices. We view Edge Computing as an extension of the cloud, adding value and functionality at the edge of the network.

Note: Fog Computing and Edge Computing definitions vary and overlap widely. This is just the definition we use.

What is the Azure Sphere?

Azure Sphere is first and foremost an operating system for "small chips", or more exactly, Internet-connected microcontroller units (MCUs). It was developed by Microsoft for Internet of Things (IoT) applications and comes with integrated cloud security services. As of today, it runs on the MT3620 MCU produced by MediaTek in collaboration with Microsoft.

Microsoft Azure, Microsoft's cloud solution, is closely related to Azure Sphere: security and user management, configuration, and deployment can be analyzed and modified through its web interface.

ObjectBox on Azure Sphere 

There were a couple of reasons why we at ObjectBox decided to support Azure Sphere.

Among other things, we looked at the lifetime costs: Firstly, we chose Azure Sphere because it can save maintenance costs. Secondly, because there is one unified interface to the platform, the platform itself may be used for any task imaginable (e.g. facility management, real-time inventory, etc.). Thirdly, Microsoft's security solution provides over-the-air (OTA) updates and therefore takes care of keeping the operating system up to date for you.

Azure Sphere use cases

These use cases exemplify a key consequence of using the Internet of Things in everyday devices: they may not only read and analyze sensor data, but also control the machine they are attached to, even autonomously. In connection with intelligent algorithms, these devices are able to make far-reaching decisions and thus maximize overall efficiency.

Benefits of ObjectBox on Azure Sphere

ObjectBox can greatly simplify the process of data collection, transmission, and processing. Let’s now see how ObjectBox is able to solve common problems encountered when integrating IoT into any kind of environment.


Scalability, i.e. integrating new devices into a fleet of existing ones, can be challenging because of the gigantic amount of data that is generated and must be transferred to a higher-level entity. ObjectBox's speed advantage provides a solution to this: confirmed by 3rd-party reviewers, ObjectBox outperforms alternatives in all areas and thus offers higher rates for data transmission, storage, and retrieval.

ObjectBox is created by developers for developers. Because ObjectBox's programming interfaces are exceptionally easy to use, development time can be minimized and first prototypes can be delivered after a very short time.

Additionally, it is necessary to make sure data is always up-to-date and prevent unintentionally storing redundant or meaningless data. Our synchronization feature will solve that out-of-the-box for you.

Find the full technical description and download on GitHub.

Let us know what you think

Last but not least, we are always happy to hear from you. Post any questions you may have on Stack Overflow tagged ObjectBox. Please share your thoughts on ObjectBox on Azure Sphere with us via Twitter, Facebook, or mail (contact [at] objectbox . io).

EdgeX Interview: Why open source is key for IoT and Edge Computing


EdgeX Foundry is a free, open-source IoT platform that has seen wild success in the last two years. Indeed, within six months its adoption grew from 1 million to 5 million instances in use by early 2020. By the end of August 2020, EdgeX Foundry had hit more than 7 million container downloads and is supported and used by many of the relevant IoT players, e.g. Dell, Ubuntu, and HP. The platform plans to release EdgeX 2.0 (called "EdgeX Ireland") in the spring of 2021 with an extensive set of new features.

Two years ago, when EdgeX was still a rather unknown open source initiative, we spoke to the initiators, Michael Hall, Brett Preston (The Linux Foundation), and Jim White (Dell), about their ideas and vision of the future. While the platform has since grown a lot, it still endorses these original ideas and the spirit of openness. Read here how it all started:

Jim White


Dell, EdgeX

Jim is Vice Chair of the Technical Steering Committee of EdgeX. He is also Team Lead of the IoT Platform Development and IoT Solutions Division at Dell.

Michael Hall

The Linux Foundation for EdgeX Foundry

Michael is Developer Advocate and Community Manager at the Linux Foundation.

 

Brett Preston


The Linux Foundation for EdgeX Foundry

Brett is Developer Advocate and Community Manager at The Linux Foundation for EdgeX Foundry.

 

Vivien: What is EdgeX and where are you at with it?

Michael: EdgeX Foundry is a vendor neutral open source platform containing a collection of micro services that take care of different aspects of what you’re going to need to have an edge computing platform. If you’re making IoT devices, you don’t want to reinvent that layer of the stack. Having that common platform for IoT is something that is going to benefit everybody. The Linux Foundation is a neutral umbrella over EdgeX. Inside the project are all the member companies who are actually funding the development, putting developers and marketing resources in, to make it an actual, usable product for everybody. That’s the model of the project and the actual code itself. The main goal of EdgeX is processing and transporting data between IoT devices and sensors and things in the cloud and on the backend. The focus is on being able to respond locally as much as you can, so that you don’t have the latency of going on the cloud and back. And also, being able to continue working if you lose that connection.

Jim: EdgeX is an open source platform containing a collection of micro services that take care of different aspects of what you’re gonna have to do to have an edge computing platform. All of those are semi-independent: you can replace anything you need to replace. For the status of the project: We just had a year of fast-paced growth and we have rewritten everything in Go, so all of our processes are a lot smaller and more efficient now.

Vivien: How are you currently tackling local on-device data persistence?

Jim: We currently use MongoDB as the persistence engine, although we could support almost any kind of persistence store at the edge as long as it was small enough. We have also used SQLite in the past for a couple of customers. However, MongoDB is the largest element in our portfolio of services. There are a couple of reasons why we are probably going to offer an alternative to Mongo with our next big release in spring 2019: footprint, licensing, and lack of support for ARM32.

Michael: As we are a collection of microservices, you can always swap out individual pieces depending on what your needs are.

Vivien: Is EdgeX and its components restricted to certain licenses?

EdgeX builds on Apache 2

Jim: EdgeX is an Apache 2 license open source project, so we prefer Apache 2 level or at least a compatible license, because we want to be very business friendly. We want people to take the application and use it in all sorts of settings, including actually embedding it in gateways. We also want to be very decoupled at discrete points.

For example, if I’m a company like Dell and I use EdgeX. If some of my customers have an absolute demand that a certain database be at the heart, then I want to be able to choose the database, depending upon the customers, the use cases, and the environment that they find themselves in. EdgeX is all about the flexibility. So, for this example, we offer what we call a reference implementation database. Customers or users could take EdgeX and replace elements with their own technology, which may not be open source even.

Michael: You can take what’s open source and add proprietary file systems or hardware depending on what your specific needs are. EdgeX tries to be that common open source base. It provides all of the functionality in an open source license but still lets you replace bits as needed with whatever it is that you want to run.

Vivien: Can you give us an example of how and where EdgeX is currently used?

Jim: There are over 70 companies now that are part of the EdgeX community and each group is using it differently. There are some that are serving EdgeX following the Red Hat model: they are providing distribution, services and support behind EdgeX. A company like mine, Dell, we’re trying to find a platform that actually goes on our gateway. So we’re going to build a commercial version of EdgeX for our own platform. There will be pieces that we will replace based on better performing mechanisms and some of our cloud-based products. Then you have other groups out there that are providing particular services for EdgeX, for example edge analytics. There are lots of different service capabilities where we see potential replacements. Then there are companies like Samsung that use EdgeX on their factory floor to help run their automation. So, they are users, but they also want to make sure EdgeX meets their needs. Our community is made of snowflakes, they are all very special *laughs* – common goals but different use cases for almost everybody that is part of the organization.

 

Vivien: That sounds really cool. In your opinion, moving the data to the edge, what is the edge, where do you see the data ending up, for example more on the sensor level or gateway level?

Latency concerns, cost of shipping up the data, and the ability to actuate locally are key reasons why you have to have edge software.
Jim White

Vice Chair of the Technical Steering Committee, EdgeX

Jim: We absolutely believe EdgeX is a mechanism for the edge. While you could run pieces of EdgeX in the cloud, we do not believe that is what the future holds. There are gonna be certain use cases where that works, but latency concerns, the cost of shipping up the data, and the ability to actuate locally are all key ingredients and reasons why you have to have edge software and edge platforms. Now, these are gonna get smaller. At Dell, we are manufacturing gateways of different sizes, because we know that certain use cases are gonna dictate a larger box and others are going to dictate something like a Raspberry Pi or even smaller. We have companies in our foundry that are looking at running parts of EdgeX in things like PLCs, to help address their realtime needs. So, we absolutely believe that the edge is very much going to be a part of our IoT environments. There are going to be use cases that dictate different levels of compute all the way up from sensors to cloud.

 

Michael: And all of our member companies see a need for that platform, but that platform is not going to be their product or their service. So everybody wants it to exist, so everybody is gonna work together to make it exist, so that they can build their own value-add on top of that or below their device level. Everyone agrees that this is an important thing, that we have to have a solution there for all the innovation that people see on the horizon.
Markus: So, small device level versus gateway – would you say your current focus is on the gateway?


Levels of Edge Computing

Jim: I would say that it really isn’t that one or the other is more important. You’re gonna have situations, as we know from a Dell perspective, where what we call a brownfield device (e.g. a 1979 modbus engine) needs a gateway, because it doesn’t have the ability to communicate into any kind of network otherwise. So there has to be a gateway that provides that first level of compute. There are other things that are evolving in the industry: think of, say, windmill generators, where there is lots of capability right there at the device level, there is a lot of compute built right into those systems. So things will run at that level, and then you have everything in between. Even something like BLE or Zigbee type environments where there is WiFi and the ability to connect directly to a network. Typically, we’re finding organizations are reluctant to allow those kinds of things to connect into their major networks without some security apparatus and analytics to see what’s going on, so as not to create problems in their larger networks. So even there, a gateway may be necessary, not because of hardwiring or physical connections, but because you want some assurances in place at the edge before that data leaks on up to your enterprise.

It’s the worst way to build our product except for all others.
Jim White

Vice Chair of the Technical Steering Committee, EdgeX

Vivien: What’s the worst thing about open source that you’ve experienced?

Jim: *laughs* Now you are going to make me say some things in front of Brett and Michael as members of the Linux foundation… There is a quote by Winston Churchill that talks about democracy, saying it’s the worst form of government except all others. I kind of feel the same way about open source development. It’s the worst way to build our product except for all others. Because it does take time. It’s a community effort and anything done by a community automatically seeks a ground where it’s going to be the best and brightest product. So you get the best input from everybody, but it takes time. It’s easier for say something like Dell to go marching off and build a software solution that they think is the best. It will get there faster but it’s not necessarily going to get there in a way that the world and communities accept more easily. So anything built by many hands is going to take a little bit more time and a little bit more process. But it ends up getting a lot better results I think in the end.

Michael: Whenever you have a community building something, you can’t just come in and say “This is what you’re gonna build”, because they don’t have to do what you say. And that’s true even with EdgeX. Everybody who is working on it is working for a company invested in it, but there is no one person who can say this is what you’re all going to do. So it’s not enough to say just what you want done. You have to explain and justify why and get people to buy into that. And that takes more effort, but you have to know that what you’re proposing is the right solution, that it’s going to work. If you can’t explain that, if you can’t communicate that to the community, then it’s not going to get done. As Jim said, it takes time but in the end the product is going to be better.
Jim: In this case with IoT, I will tell you that no one company will be able to provide it all. As Dell, we would love to be the company providing it all… (laughter) We have learnt the hard way that in an IoT landscape there are going to be certain things in the company that you can’t touch, and IoT has to touch everything. Maybe it’s the network, hardware or operating systems, particular sensors and protocols. You can help to persuade customers to do some things in your way, but you’re never going to be able to get them to do everything in your way, and that’s why IoT takes an ecosystem. Which is why we think the second part of EdgeX is so important; our product is important, but just as vital is the ecosystem. We have a collection of companies all trying to work together to provide for interoperability. That is just as important as the actual end product we develop.
No one company will be able to provide it all.
Jim White

Vice Chair of the Technical Steering Committee, EdgeX

Vendor lock-in is not going to work in IoT.
Michael Hall

Developer Advocate, Linux Foundation

Michael: Vendor lock-in is not going to work in IoT. There is no way any company is gonna be able to provide all the needs of somebody. So having an equal playing field for everybody, having that common ground that anybody can come to and interact with anybody else, is what’s going to allow us to fulfill the promise of IoT in general.

ObjectBox DB is an embedded database solution, smaller than 1 MB and 10x faster than any alternative. ObjectBox EdgeX combines ObjectBox’ speed and ease of use with the EdgeX Foundry™ IoT Edge Platform. EdgeX Foundry users can now compute millions of data points on the edge with minimal latency.