Accelerate your AI/ML projects with an MLOps workshop

Build your tailored MLOps architecture in just 5 days. Our workshop will help you design AI infrastructure for any use case – from the data center to the edge. Accelerate and scale your AI/ML initiatives, optimize resource usage and elevate your in-house expertise.


Why join an MLOps workshop?


Define your MLOps architecture

We will work with you to design high-level and low-level architectures for your AI/ML projects, built on your existing infrastructure and using only open source tooling.

You can expect:

  • An end-to-end proposal, covering everything from the operating system to the data science and ML platform
  • A cloud-agnostic design
  • A cost-efficient solution
  • A scalable and secure architecture that runs from AI workstations to data centers to edge devices, depending on your AI/ML maturity

Learn from experts in the industry

Spend 5 days on site with Canonical experts who will help upskill your team and solve the most pressing problems related to MLOps and AI/ML.

  • Get guided, hands-on experience with the latest open source ML tooling, adjusted to your use case and needs
  • Join live Q&A sessions with AI and MLOps experts to help you accelerate time-to-market and unblock your projects
  • Access a library of written content to continue upskilling after the workshop

Make the best use of existing infrastructure

Optimize your existing infrastructure to maximize efficiency and performance for your AI/ML workloads. Learn about:

  • Data collection improvement opportunities
  • GPU optimization techniques and strategies
  • Model packaging and distribution best practices

MLOps workshop structure

Pick one of our four workshops, each tailored to a different use case, and Canonical will help you design your architecture and long-term AI strategy.

Open source migrations

Build a migration plan to move your infrastructure to open source solutions and enable a long-term strategy.


MLOps architecture design

Discover infrastructure options, MLOps processes and tools to make informed decisions about your stack.


Resource optimization

Stop underutilizing your compute power and learn how to optimize your resources across all layers of the stack.


Edge AI

Develop an Edge AI architecture to efficiently run and maintain your models on a variety of devices.


You can also customize these workshops to your needs, or build a bespoke agenda by selecting the topics that are most valuable to your organization.


Meet our experts

Your MLOps workshop is delivered by Canonical's MLOps field engineering team: experts trained to architect, design and deploy AI infrastructure at any scale, across industries.

From GenAI use cases to sovereign AI clouds, the team has experience building solutions in partnership with our customers.


Learn more