
nvidia-nim

NVIDIA NIM (NVIDIA Inference Microservices) for deploying and managing AI models. Use for NIM microservices, model inference, API integration, and building AI applications with NVIDIA's inference infrastructure.
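As a rough illustration of the API integration this skill targets: NIM microservices expose an OpenAI-compatible HTTP API. The sketch below is a minimal, hedged example; the endpoint URL, port, and model id are placeholders for whatever your own NIM deployment serves, not values taken from this repository.

```python
import json
import urllib.request

# Hypothetical local NIM endpoint. NIM containers typically expose an
# OpenAI-compatible API; URL, port, and model id are assumptions here.
NIM_URL = "http://localhost:8000/v1/chat/completions"
MODEL = "meta/llama-3.1-8b-instruct"  # example id; adjust to your deployment

def build_request(prompt: str, model: str = MODEL) -> dict:
    """Build an OpenAI-style chat-completion payload for a NIM server."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 128,
        "temperature": 0.2,
    }

def query_nim(prompt: str) -> str:
    """POST the payload to the NIM endpoint and return the reply text."""
    body = json.dumps(build_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        NIM_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(query_nim("Summarize what NVIDIA NIM does in one sentence."))
```

Because the server side is just the standard chat-completions contract, the same payload works with the official `openai` Python client pointed at the NIM base URL.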

Packaged view

This page reorganizes the original catalog entry around fit, installability, and workflow context first. The original raw source lives below.

Stars: 0
Hot score: 74
Updated: March 20, 2026
Overall rating: C
Composite score: 1.1
Best-practice grade: N/A

Install command

npx @skill-hub/cli install rish2jain-paperresearchagent-nvidia-nim
Tags: api, deployment, gpu, ai-models, infrastructure

Repository

rish2jain/paperresearchagent

Skill path: .claude/skills/nvidia-nim


Open repository

Best for

Primary workflow: Analyze Data & AI.

Technical facets: Full Stack, Backend, DevOps, Data / AI, Integration.

Target audience: development teams looking for install-ready agent workflows.

License: Unknown.

Original source

Catalog source: SkillHub Club.

Repository owner: rish2jain.

This is still a mirrored public skill entry. Review the repository before installing into production workflows.

What it helps with

  • Install nvidia-nim into Claude Code, Codex CLI, Gemini CLI, or OpenCode workflows
  • Review https://github.com/rish2jain/paperresearchagent before adding nvidia-nim to shared team environments
  • Use nvidia-nim for development workflows

Works across

Claude Code, Codex CLI, Gemini CLI, OpenCode

Favorites: 0.

Sub-skills: 0.

Aggregator: No.
