AI on Ubuntu: the path from development to production
The best data science and MLOps tools across your infra
Develop AI models on high-end Ubuntu workstations. Train on racks of bare metal or on public clouds with hardware acceleration. Deploy to cloud, edge and IoT. All on Ubuntu.
Leaders in artificial intelligence choose Ubuntu
Best-of-breed data science tools.
The most productive tools for AI/ML development, with guides and resources available.
Multi-framework model serving.
Effective model deployment across a mesh of devices, with low-latency inference serving.
Infrastructure for production data science.
Centralised or multi-cloud training infrastructure for better resource allocation and data governance.
Analyse epic amounts of data, wherever it is.
Build large-scale data lakes optimised for machine learning on bare metal, virtual or cloud infrastructure with open source.
Drivers, storage, networking, CPU, GPU, DPU.
Enjoy full control over your firmware, in a safe environment, tested by millions.
Portability at scale
Give your workloads consistency everywhere.
Portable from the desktop to vast multi-cloud estates. Deploy fast on every major public cloud with GPU acceleration.
Kubeflow AI and MLOps at any scale
Enterprise-ready Charmed Kubeflow, the fully supported MLOps platform for any cloud
- A complete solution for sophisticated data science labs: upgrades and security updates, all supported in the free, open source distribution.
- Get up and running fast with Charmed Kubeflow on MicroK8s, from lab to large scale. Highly available with 3+ nodes.
- Or deploy on Azure, Amazon AWS and Google Cloud Kubernetes services, with full support available from Canonical, the experts behind Charmed Kubeflow.
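The lab-to-scale path above can be sketched in a few commands. This is a minimal sketch of deploying Charmed Kubeflow on MicroK8s with Juju; the channel versions and the MetalLB address range are illustrative assumptions, so check the current Charmed Kubeflow documentation for the recommended values on your cluster.

```shell
# Install MicroK8s and the add-ons Kubeflow needs (channel is illustrative)
sudo snap install microk8s --classic --channel=1.29/stable
sudo microk8s enable dns hostpath-storage metallb:10.64.140.43-10.64.140.49

# Install Juju and register the MicroK8s cluster with it
sudo snap install juju --classic
sudo microk8s config | juju add-k8s my-microk8s --client
juju bootstrap my-microk8s

# Deploy the Charmed Kubeflow bundle into its own model
juju add-model kubeflow
juju deploy kubeflow --trust --channel=1.8/stable

# Watch the deployment converge
juju status --watch 5s
```

The same `juju deploy kubeflow` step works against AKS, EKS or GKE once that cluster is registered with `juju add-k8s`, which is what makes the bundle portable across clouds.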
Canonical AI services
Enterprise support, deployment, training and fully managed Kubeflow
- Rely on 24/7 enterprise support with guaranteed SLAs.
- Provide specialised MLOps training to your sysadmins, devops engineers and data scientists, tailored to your infrastructure, level of expertise and data.
- Off-load the complexity of Kubeflow deployment and management to Canonical engineers.
MLOps for devices and micro clouds
Industrial grade data pipelines from cloud to edge
- High-throughput inference at the edge with fast model updates.
- Build your AI/ML on top of the most reliable edge infra.
- Ubuntu Core for IoT, MicroK8s for zero-ops Kubernetes with high availability, and Kubeflow for inference and distributed training.
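The high-availability edge layer of this stack can be sketched as follows. This is a minimal sketch, assuming three edge nodes running Ubuntu with snap support; the IP address and token in the join command are placeholders printed by `add-node`, not values to copy literally.

```shell
# On every edge node: install MicroK8s as a snap
sudo snap install microk8s --classic

# On the first node: generate a one-time join invitation
sudo microk8s add-node
# This prints a command of the form:
#   microk8s join <first-node-ip>:25000/<token>

# On each remaining node: run the printed join command, e.g.
sudo microk8s join 10.0.0.1:25000/<token>

# With three or more nodes, MicroK8s forms a highly available
# cluster automatically; verify with:
microk8s status
```

Once the cluster reports high availability, the same Juju-driven Kubeflow deployment used in the datacentre can target these edge nodes for inference serving.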