Nvidia announces AI Workbench dev tool



Fresh off its record $1 trillion valuation and rumors of a graphics processing unit (GPU) shortage, Nvidia today announced a new product that allows developers to build their own generative AI models from scratch on a PC or workstation.

Called AI Workbench, the new platform, announced today at the annual SIGGRAPH (Special Interest Group on Computer Graphics and Interactive Techniques) conference in Los Angeles, provides a simple user interface that runs on the developer’s machine and connects to Hugging Face, GitHub, Nvidia’s own enterprise web portal NGC, and other popular repositories of open-source or commercially available AI code. This lets a developer access those repositories easily without having to open different browser windows. Developers can then import the model code and customize it to their liking.

“You can work with these [AI models] and customize these right on your workstation, even your laptop,” said Erik Pounds, a marketing and product professional at Nvidia, in a call with VentureBeat. “That’s a huge thing: allowing … developers [to] work on these large language models locally.”

AI Workbench “removes the complexity of getting started with an enterprise AI project,” according to Nvidia’s press release.


Buy-in from big partners

High-profile AI infrastructure providers, including Dell Technologies, Hewlett Packard Enterprise, HP Inc., Lambda, Lenovo and Supermicro, have already embraced AI Workbench, according to Nvidia, and see its potential to boost their latest generations of multi-GPU-capable desktop workstations, high-end mobile workstations and virtual workstations.

Moreover, developers with Windows- or Linux-based RTX PCs or workstations can now test and tweak enterprise-grade generative AI projects on their local RTX systems and access data center and cloud computing resources as needed.

“Workbench helps you shift from development on a single PC off into larger-scale environments, and even as the project becomes more mature, it will also help you shift your project into production,” said Pounds. “All the software remains the same.”

More in store

Alongside Workbench, Nvidia announced the latest version of its enterprise software platform, Nvidia AI Enterprise 4.0, which aims to give businesses tools for securely integrating and deploying generative AI models in their operations, with stable API connections.

Among the features of AI Enterprise 4.0 are Nvidia NeMo, a cloud-native framework that provides end-to-end support for creating and customizing large language model (LLM) applications, and the Nvidia Triton Management Service, which automates and optimizes production deployments. The platform also includes Nvidia Base Command Manager Essentials cluster management software, which helps businesses maximize performance and utilization of AI servers across data center, multicloud and hybrid-cloud environments.

ServiceNow, Snowflake and Dell Technologies are also announcing collaborations with Nvidia on new AI products.
