In the fast-evolving landscape of artificial intelligence and data processing, NVIDIA has once again stepped into the spotlight with its groundbreaking platform, NVIDIA NIM (NVIDIA Inference Microservices).
NVIDIA NIM is designed to optimize the deployment and management of AI models in data centers, addressing key challenges enterprises face when scaling AI applications. At its core, NIM focuses on enhancing the inference process (where pre-trained AI models make predictions on new data) by streamlining operations and maximizing hardware efficiency.
NIM leverages NVIDIA’s expertise in GPU acceleration to deliver high throughput and low latency for AI inference tasks. This helps promote the adoption of AI in real-world applications across industries such as healthcare, automotive, finance, and retail.
As AI continues to permeate every aspect of modern technology, the demand for efficient and scalable inference solutions will only grow. NVIDIA NIM is well-positioned to meet these demands.
AgilePoint, on the other hand, has a fast-growing ecosystem of AI integrations that provides tooling for operationalizing AI within your organization through its composable Metadata IT Abstraction Framework business process layer. The NVIDIA AgilePoint integration makes NIM-hosted models available inside AgilePoint Business Processes, letting you quickly add pre-trained AI inference capabilities to the processes that drive your organization's core business.
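To give a sense of what the connector is abstracting away: NIM-hosted models expose an OpenAI-compatible HTTP API, so a direct call would send a chat-completions style JSON body to the model's endpoint. The sketch below builds such a request body; the base URL, model name, and parameter values are illustrative placeholders, not values from the AgilePoint connector.

```python
import json

# Hypothetical endpoint and model id, for illustration only.
NIM_BASE_URL = "https://example-nim-host/v1"
MODEL_NAME = "meta/llama3-8b-instruct"

def build_chat_request(prompt, model=MODEL_NAME, max_tokens=128):
    """Build the JSON body for an OpenAI-compatible chat-completions
    call to a NIM-hosted model."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

body = build_chat_request("Summarize this customer ticket in one sentence.")
print(json.dumps(body, indent=2))
```

In a raw integration you would POST this body to `{NIM_BASE_URL}/chat/completions` with an authorization header; the AgilePoint connector presumably handles that plumbing so a workflow activity only needs the prompt and model configuration.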
Let's look at the NVIDIA NIM connector in action.
Please note that this video was recorded while this feature was in limited-preview beta. Some of the labels and icons might change by the time it goes GA, but the general approach to configuring the workflow activities will remain the same.