pytorch-fsdp2
Adds PyTorch FSDP2 (fully_shard) to training scripts with correct init, sharding, mixed precision/offload config, and distributed checkpointing. Use when models exceed single-GPU memory or when you need DTensor-based sharding with DeviceMesh.
Packaged view
This page reorganizes the original catalog entry around fit, installability, and workflow context first. The original raw source lives below.
Install command
npx @skill-hub/cli install zechenzhangagi-ai-research-skills-pytorch-fsdp2
Repository
Skill path: 08-distributed-training/pytorch-fsdp2
Best for

Primary workflow: Analyze Data & AI.
Technical facets: Full Stack, Data / AI.
Target audience: Development teams looking for install-ready agent workflows.
License: Unknown.
Original source
Catalog source: SkillHub Club.
Repository owner: zechenzhangAGI.
This is still a mirrored public skill entry. Review the repository before installing into production workflows.
What it helps with
- Install pytorch-fsdp2 into Claude Code, Codex CLI, Gemini CLI, or OpenCode workflows
- Review https://github.com/zechenzhangAGI/AI-research-SKILLs before adding pytorch-fsdp2 to shared team environments
- Use pytorch-fsdp2 for development workflows
Works across
Favorites: 0.
Sub-skills: 0.
Aggregator: No.