
benchmark-kernel

Guide for benchmarking FlashInfer kernels with CUPTI timing

Packaged view

This page reorganizes the original catalog entry around fit, installability, and workflow context first. The original raw source lives below.

Stars: 5,173
Hot score: 99
Updated: March 20, 2026
Overall rating: A (8.3)
Composite score: 6.9
Best-practice grade: B (77.6)

Install command

npx @skill-hub/cli install flashinfer-ai-flashinfer-benchmark-kernel

Repository

flashinfer-ai/flashinfer

Skill path: .claude/skills/benchmark-kernel

Open repository

Best for

Primary workflow: Ship Full Stack.

Technical facets: Full Stack.

Target audience: Development teams looking for install-ready agent workflows.

License: Unknown.

Original source

Catalog source: SkillHub Club.

Repository owner: flashinfer-ai.

This is a mirrored public skill entry. Review the repository before installing it into production workflows.

What it helps with

  • Install benchmark-kernel into Claude Code, Codex CLI, Gemini CLI, or OpenCode workflows
  • Review https://github.com/flashinfer-ai/flashinfer before adding benchmark-kernel to shared team environments
  • Use benchmark-kernel for development workflows
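The skill itself guides CUPTI-based, device-side timing of FlashInfer kernels. As a rough illustration of what a kernel benchmark loop looks like, here is a minimal host-side sketch; the function name and workload are illustrative, and host wall-clock timing overstates short GPU kernel launches compared with CUPTI's device timestamps.

```python
import time

def benchmark(fn, *args, warmup=3, iters=20):
    """Average wall-clock latency of fn(*args) in milliseconds.

    Illustrative only: benchmark-kernel uses CUPTI for device-side
    GPU timing; host-side timing like this includes launch overhead.
    """
    for _ in range(warmup):   # warm caches / lazy init before timing
        fn(*args)
    start = time.perf_counter()
    for _ in range(iters):
        fn(*args)
    return (time.perf_counter() - start) / iters * 1e3

# Trivial stand-in workload; a real run would call a FlashInfer kernel.
latency_ms = benchmark(sum, range(10_000))
print(f"{latency_ms:.3f} ms")
```

The warmup iterations matter for GPU kernels in particular, since the first call often pays one-time JIT-compilation and allocation costs that should not be averaged into the reported latency.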

Works across

Claude Code · Codex CLI · Gemini CLI · OpenCode

Favorites: 0.

Sub-skills: 0.

Aggregator: No.