Edge AI in a 5G world

Alex Cattle

on 6 February 2020


Deploying AI/ML solutions in latency-sensitive use cases requires a new approach to solution architecture for many businesses.

Fast computational units (e.g. GPUs) and low-latency connections (e.g. 5G) allow AI/ML models to be executed outside the sensors and actuators themselves (e.g. cameras and robotic arms). This reduces costs through lower hardware complexity on each device and through sharing compute resources across the IoT fleet.
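The cost-sharing argument can be sketched with a toy model. All numbers, function names, and the devices-per-GPU ratio below are illustrative assumptions, not figures from the webinar:

```python
import math

def embedded_fleet_cost(n_devices: int, on_device_accelerator_cost: float) -> float:
    """Cost when every sensor/actuator carries its own inference hardware."""
    return n_devices * on_device_accelerator_cost

def shared_edge_cost(n_devices: int, edge_gpu_cost: float,
                     devices_per_gpu: int) -> float:
    """Cost when co-located GPUs are time-shared across the IoT fleet."""
    return math.ceil(n_devices / devices_per_gpu) * edge_gpu_cost

# Illustrative: 100 cameras, a $400 embedded accelerator per camera,
# versus $5,000 edge GPUs each serving 50 cameras over a low-latency link.
print(embedded_fleet_cost(100, 400))    # prints 40000
print(shared_edge_cost(100, 5000, 50))  # prints 10000
```

The crossover depends entirely on how many devices one shared GPU can serve within the latency budget, which is exactly what low-latency connectivity improves.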

Strict AI responsiveness requirements that previously forced models to be embedded in the IoT devices themselves can now be met with GPUs co-located with the sensors and actuators (e.g. in the same factory building). One example is the robot ‘dummification’ trend currently observed in factory robotics, which aims to reduce robot unit costs and simplify fleet management.
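The feasibility question above reduces to a simple latency budget: offloading works when the network round trip plus inference time on the shared GPU still fits the control loop's deadline. A minimal sketch, with purely illustrative numbers (the specific budgets are assumptions, not claims from the webinar):

```python
def meets_latency_budget(network_rtt_ms: float,
                         inference_ms: float,
                         budget_ms: float) -> bool:
    """Offloaded inference is viable if the end-to-end response
    (network round trip + model execution) fits the deadline."""
    return network_rtt_ms + inference_ms <= budget_ms

# Illustrative: a 5G-class 5 ms round trip plus 10 ms of GPU inference
# against a 20 ms control-loop budget.
print(meets_latency_budget(5.0, 10.0, 20.0))   # prints True

# The same budget is unreachable over a 50 ms wide-area round trip,
# which is why co-location matters.
print(meets_latency_budget(50.0, 10.0, 20.0))  # prints False
```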

In this webinar we will explore real-life scenarios in which GPUs and low-latency connectivity unlock solutions that were previously prohibitively expensive, enabling businesses to put them in place and lead the fourth industrial revolution.

Watch the webinar

