NVIDIA NIM MCP

Run NVIDIA NIM microservices for optimized LLM, vision, and speech inference.

Categories: AI & ML Tools · Cloud · AI
Source: nvidia
Author: nvidia
Repository: https://github.com/nvidia/nim-mcp

Installation

Set the NVIDIA_API_KEY environment variable to an API key generated at build.nvidia.com, then select the model endpoint you want the server to use.
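As a sketch of what the configured key and endpoint are used for: NIM microservices on build.nvidia.com expose an OpenAI-compatible chat-completions API. The snippet below builds (but does not send) such a request; the endpoint URL and model id are illustrative assumptions, so check build.nvidia.com for the model you selected.

```python
import json
import os
import urllib.request

# Assumption: the NVIDIA API catalog serves NIM models through an
# OpenAI-compatible endpoint; the model id below is an example only.
API_URL = "https://integrate.api.nvidia.com/v1/chat/completions"
MODEL = "meta/llama-3.1-8b-instruct"  # illustrative; pick yours on build.nvidia.com


def build_request(prompt: str) -> urllib.request.Request:
    """Build a chat-completion request authorized with NVIDIA_API_KEY."""
    api_key = os.environ.get("NVIDIA_API_KEY", "")
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 64,
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )


req = build_request("Hello from NIM")
# urllib.request.urlopen(req) would perform the actual call (needs a valid key)
```

Sending the request with `urllib.request.urlopen(req)` returns a standard OpenAI-style JSON response; the MCP server wraps this kind of call behind MCP tools so clients never handle the HTTP layer directly.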

Use Cases

  • LLM inference
  • Vision models
  • Speech-to-text

Tags

nvidia · nim · inference · gpu

Need Implementation Help?

We can integrate NVIDIA NIM MCP into your production stack, wire auth and policies, and ship a maintainable MCP setup.
