📺 Short of time? Jump to the Screencast:
The rise of the Internet of Things (IoT) has extended the cloud computing paradigm by distributing computing and processing activities away from the centralized cloud. With the explosion of smart devices and their growing processing capabilities, technology providers and the organizations behind them are focused on reducing latency by bringing compute, storage, and analytics applications as close as possible to the devices.
This is critical to reducing latency and achieving near real-time responsiveness, which is key to many emerging technologies. Smart devices distributed across the globe cannot effectively depend on a centralized cloud to perform the cumulative analytical workload at the “speed of thought”.
A number of technologies have emerged in the last few years that provide a platform for edge nodes and devices to act on local data and to distribute analytical workloads and compute power across a widely disseminated infrastructure, even with only intermittent connectivity to a central cloud. Analytics and machine learning can thus run in close proximity to the devices generating the raw data, distributing the workload and vastly reducing latency.
One key obstacle is that, for many organizations, it is neither practical nor financially feasible to host their own edge resources across the globe, carefully placed near deployment areas, just to keep their IoT solutions running as close as possible to end users.
OpenNebula’s ONEedge solution will provide organizations with the tools to create their own private distributed cloud, in which they can easily deploy and manage edge nodes—on demand, and leased based on usage—in geographical locations close to their IoT devices and applications. When a client plans an IoT deployment in a new region, they will be able to use ONEedge to determine where edge resources are needed to best serve those devices or applications, and then allocate, deploy, and control edge nodes on demand based on current load at those specific locations.
Now, thanks to our recent integration of AWS Firecracker as a new virtualization technology officially supported by OpenNebula, organizations can base the distributed infrastructure for their containerized IoT applications on fast, secure microVMs that can be quickly deployed on demand on edge resources from public cloud and bare-metal providers.
OpenNebula 5.12 now comes with seamless integration with the Docker Hub marketplace, which makes it even easier to deploy IoT solutions at the edge using Firecracker microVMs for your containerized applications.
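To give a feel for what this looks like in practice, here is a minimal sketch of an OpenNebula VM template for a Firecracker microVM backed by an image exported from the Docker Hub marketplace. The attribute names follow OpenNebula's template syntax; the image name, kernel reference, and sizing values are illustrative assumptions, not a verbatim recipe—check the OpenNebula 5.12 documentation for the exact attributes required by your setup:

```
# Sketch of a VM template for a Firecracker microVM (values are assumptions)
NAME   = "iot-edge-app"
CPU    = 1
VCPU   = 1
MEMORY = 1024

# Disk built from a container image exported from the Docker Hub marketplace
DISK   = [ IMAGE = "my-dockerhub-app" ]

# Firecracker boots an uncompressed Linux kernel registered in OpenNebula
OS     = [ KERNEL = "vmlinux" ]

NIC    = [ NETWORK = "edge-network" ]
```

Once a template along these lines is registered, it can be instantiated on any edge host configured with the Firecracker driver, just like a regular OpenNebula VM.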
Watch the following screencast to learn how you can use OpenNebula to deploy an IoT solution based on ThingsBoard, an open source IoT platform for collecting, managing, storing, and visualizing data from multiple IoT devices. In this use case, edge nodes are provisioned from third-party bare-metal providers, with the edge applications configured and deployed as containers running inside Firecracker microVMs:
We have prepared a step-by-step guide based on our evaluation tool miniONE for everyone interested in reproducing this very same use case: just visit OpenNebula’s Customer Portal.
Cloud Technical Evangelist at OpenNebula