Building a secure software supply chain is a key challenge facing developers of tomorrow’s applications. Deep learning models in frameworks like PyTorch and TensorFlow rely on complex pipelines composed from many sources: models are trained, tested, optimised, and deployed in OCI images with OpenVINO™. This webinar focuses on the container deployment phase, building on the work done by Intel and Canonical to provide container images that are not only developer-friendly but also secure and stable.
By combining Ubuntu containers with the OpenVINO™ Model Server, developers can deploy and manage inference-as-a-service on a range of Intel® platforms using Kubernetes – from the edge to the cloud.
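As a minimal sketch of the workflow described above (assuming Docker is installed and a model has been exported locally in a format the Model Server accepts, laid out in numbered version subdirectories such as `./model/1/`), the publicly available `openvino/model_server` image can serve a model like this; the flags and layout follow the Model Server's documented conventions:

```shell
# Pull the OpenVINO Model Server image (published on Docker Hub).
docker pull openvino/model_server:latest

# Serve the local ./model directory as "my_model" over gRPC on port 9000.
# The mounted directory is assumed to contain version subdirectories
# (e.g. ./model/1/) holding the model files.
docker run -d --rm \
  -v "$(pwd)/model:/models/my_model" \
  -p 9000:9000 \
  openvino/model_server:latest \
  --model_name my_model --model_path /models/my_model --port 9000
```

In a Kubernetes cluster, the same container image would typically be wrapped in a Deployment and exposed via a Service, which is what enables the edge-to-cloud inference-as-a-service pattern mentioned above.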
- Miłosz Żeglarski, Deep Learning Software Engineer, Intel
- Ryan Loney, OpenVINO Product Manager, Intel
- Valentin Viennot, Product Manager, Canonical