
Accelerate AI development with Ubuntu and NVIDIA AI Workbench

Fig 1. NVIDIA AI Workbench

Canonical is expanding its collaboration with NVIDIA through NVIDIA AI Workbench, which is supported across workstations, data centres, and cloud deployments.

NVIDIA AI Workbench is an easy-to-use toolkit that allows developers to create, test, and customise AI and machine learning models on their PC or workstation and scale them to the data centre or public cloud. It simplifies interactive development workflows while automating technical tasks that halt beginners and derail experts. Collaborative AI and ML development is now possible on any platform – and for any skill level.

Ubuntu is the preferred OS for data science, artificial intelligence, and machine learning, and both Ubuntu and Canonical play an integral role in AI Workbench's capabilities:

  • On Windows, Ubuntu powers AI Workbench via WSL2. 
  • In the cloud, Ubuntu 22.04 LTS is the only target OS supported for AI Workbench remote machines. 
  • For AI application deployments from the data centre to the cloud to the edge, Ubuntu-based containers are included as a key part of AI Workbench.

This seamless end user experience is made possible thanks to the partnership between Canonical and NVIDIA.
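To make those points concrete, here is a minimal sketch (not part of AI Workbench; the helper functions are illustrative) of how a developer might confirm they are on the Ubuntu-on-WSL2 or Ubuntu 22.04 LTS environment that AI Workbench targets:

```python
# Minimal environment check: is this Ubuntu, and is it running under WSL2?
# Heuristics only; not part of AI Workbench. Function names are illustrative.
import platform
from pathlib import Path


def is_wsl2() -> bool:
    """WSL2 kernels typically report 'microsoft' in their release string."""
    return "microsoft" in platform.uname().release.lower()


def ubuntu_release() -> str | None:
    """Return the Ubuntu VERSION_ID (e.g. '22.04') from /etc/os-release, if present."""
    os_release = Path("/etc/os-release")
    if not os_release.exists():
        return None
    info = dict(
        line.split("=", 1)
        for line in os_release.read_text().splitlines()
        if "=" in line
    )
    if info.get("ID", "").strip('"') != "ubuntu":
        return None
    return info.get("VERSION_ID", "").strip('"')


if __name__ == "__main__":
    print(f"Running under WSL2: {is_wsl2()}")
    print(f"Ubuntu release:     {ubuntu_release() or 'not Ubuntu'}")
```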

Define your AI journey, start local and scale globally

Create, collaborate, and reproduce generative AI and data science projects with ease. Develop and execute while NVIDIA AI Workbench handles the rest:

  • Streamlined setup: easy installation and configuration of containerized development environments for GPU-accelerated hardware.
  • Laptop to cloud: start locally on an RTX PC or workstation and scale out to the data centre or cloud in just a few clicks.
  • Automated workflow management: simplified management of project resources, versioning, and dependency tracking.

Fig 2. Environment Window in AI Workbench Desktop App
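Once a GPU-accelerated project environment is up, a quick sanity check that the GPU is visible might look like the sketch below. This assumes PyTorch is installed in the project container; it is not an AI Workbench API.

```python
# Sanity check inside a GPU-enabled project container.
# Assumes PyTorch is installed; not an AI Workbench API.
import torch


def describe_gpu() -> str:
    if not torch.cuda.is_available():
        return "No CUDA-capable GPU is visible to this environment."
    device = torch.cuda.current_device()
    name = torch.cuda.get_device_name(device)
    total_gb = torch.cuda.get_device_properties(device).total_memory / 1e9
    return f"GPU {device}: {name} ({total_gb:.1f} GB)"


if __name__ == "__main__":
    print(describe_gpu())
```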

Ubuntu and NVIDIA AI Workbench improve the end user experience for Generative AI workloads on client machines

As the established OS for data science, Ubuntu is now commonly used for AI/ML development and deployment, including the development, processing, and iteration of Generative AI (GenAI) workloads. GenAI on both smaller devices and GPUs is increasingly important with the growth of edge AI applications and devices: use cases such as smart cities add more edge devices, such as cameras and sensors, and therefore need more data to be processed at the edge. Ubuntu containers are often preferred for bare metal deployments because they make it easier for end users to deploy and customise workloads, and NVIDIA AI Workbench offers Ubuntu container options that are well integrated and well suited to GenAI use cases.
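As a rough illustration of that container-based pattern, the sketch below starts a stock Ubuntu 22.04-based CUDA container with GPU access using the docker Python SDK directly rather than AI Workbench. The image tag and setup are assumptions: it requires Docker, the `docker` Python package, and the NVIDIA Container Toolkit on the host.

```python
# Hypothetical sketch: run nvidia-smi inside an Ubuntu 22.04-based CUDA container
# with GPU access, using the docker Python SDK rather than AI Workbench.
# Assumes Docker and the NVIDIA Container Toolkit are installed on the host.
import docker

client = docker.from_env()

output = client.containers.run(
    image="nvidia/cuda:12.2.0-base-ubuntu22.04",  # assumed Ubuntu-based CUDA image
    command="nvidia-smi",
    device_requests=[
        # Expose all host GPUs to the container.
        docker.types.DeviceRequest(count=-1, capabilities=[["gpu"]])
    ],
    remove=True,  # remove the container once the command exits
)

print(output.decode())
```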

Fig 3. AI Workbench Development Workflow

Peace of mind with Ubuntu LTS

With Ubuntu, developers benefit from Canonical’s 20-year track record of Long Term Support (LTS) releases, which deliver security updates and patching for 5 years. With Ubuntu Pro, organisations can extend that security maintenance commitment to 10 years, offloading security and compliance from their teams so they can focus on building great models. Together, Canonical and Ubuntu provide an optimised and secure environment for AI innovators wherever they are.

Getting started is easy (and free).

Get started with Canonical Open Source AI Solutions


