
migration-test-and-push

Unified database migration skill. Use when user says "test migration", "push migration", "deploy migration", "reset local database", or "db push". Promotion pipeline local → staging → production with approval gates at each stage.

Packaged view

This page reorganizes the original catalog entry around fit, installability, and workflow context first. The original raw source lives below.

Stars: 30
Hot score: 89
Updated: March 20, 2026
Overall rating: C (composite score 2.5)
Best-practice grade: B (73.6)

Install command

npx @skill-hub/cli install enbyaugust-zacs-claude-skills-migration-test-and-push

Repository

enbyaugust/zacs-claude-skills

Skill path: skills/migration-test-and-push


Open repository

Best for

Primary workflow: Run DevOps.

Technical facets: Full Stack, Backend, DevOps, Testing.

Target audience: everyone.

License: Unknown.

Original source

Catalog source: SkillHub Club.

Repository owner: enbyaugust.

This is a mirrored public skill entry. Review the repository before installing it into production workflows.

What it helps with

  • Install migration-test-and-push into Claude Code, Codex CLI, Gemini CLI, or OpenCode workflows
  • Review https://github.com/enbyaugust/zacs-claude-skills before adding migration-test-and-push to shared team environments
  • Use migration-test-and-push for development workflows

Works across

Claude Code, Codex CLI, Gemini CLI, OpenCode

Favorites: 0.

Sub-skills: 0.

Aggregator: No.

Original source / Raw SKILL.md

---
name: migration-test-and-push
description: Unified database migration skill. Use when user says "test migration", "push migration", "deploy migration", "reset local database", or "db push". Promotion pipeline local → staging → production with approval gates at each stage.
allowed-tools: Read, Edit, Bash, AskUserQuestion, Skill
version: 4.0.0
---

# Migration Test & Push

> Promotion pipeline for database migrations: Local → Staging → Production with gates at each stage.

<environments>

## Environments

| Environment | Project ID             | Host                                  | Purpose          |
| ----------- | ---------------------- | ------------------------------------- | ---------------- |
| Local       | -                      | `127.0.0.1:54321`                     | Development/test |
| Staging     | `<staging-project-id>` | `aws-0-us-west-2.pooler.supabase.com` | Vercel E2E tests |
| Production  | `<production-project-id>` | `aws-0-us-west-1.pooler.supabase.com` | Live users       |

</environments>

<mode_selection>

## Step 0: Choose Mode (Always First)

Use AskUserQuestion to determine workflow:

```typescript
{
  questions: [
    {
      question: "What do you want to do with this migration?",
      header: "Mode",
      options: [
        {
          label: "Test locally only",
          description:
            "Reset local DB, run pgTAP, setup dev. No commit, no push.",
        },
        {
          label: "Push to staging only",
          description:
            "Test locally first, then push to staging for E2E testing.",
        },
        {
          label: "Full deployment",
          description:
            "Complete pipeline: local → staging → production with all safety gates.",
        },
      ],
      multiSelect: false,
    },
  ];
}
```

</mode_selection>

---

<local_workflow>

## LOCAL MODE (6 Steps)

Quick local testing - no commits, no remote changes.

| Step | Action                  | Details                                               |
| ---- | ----------------------- | ----------------------------------------------------- |
| 1    | Verify migration date   | Ask user to confirm timestamp                         |
| 2    | Reset local database    | `npx supabase db reset` (runs seed.sql automatically) |
| 3    | Run pgTAP tests         | `npm run test:sql`                                    |
| 4    | Check for missing tests | Ask if new tests needed                               |
| 5    | Restart worker          | Kill and restart worker process                       |
| 6    | Generate types locally  | `npx supabase gen types typescript --local`           |

**Result**: Ready for testing at localhost:8080. No commit, no push.

For detailed steps: [references/local-mode.md](references/local-mode.md)
</local_workflow>

---

<staging_workflow>

## STAGING MODE (8 Steps)

Push to staging for Vercel E2E testing. Includes local testing first.

| Step | Action                  | Gate | Details                                                |
| ---- | ----------------------- | ---- | ------------------------------------------------------ |
| 1    | Verify migration date   | Ask  | Confirm timestamp is correct                           |
| 2    | Reset local database    |      | `npx supabase db reset`                                |
| 3    | Run pgTAP tests         |      | `npm run test:sql`                                     |
| 4    | Check for missing tests | Ask  | Create tests for new tables?                           |
| 5    | Link to staging         |      | `npx supabase link --project-ref <staging-project-id>` |
| 6    | Dry-run staging         |      | `npx supabase db push --dry-run`                       |
| 7    | Push to staging         | Ask  | `npx supabase db push`                                 |
| 8    | Verify migration        |      | Check `schema_migrations` table                        |

**Result**: Staging database updated. Ready for Vercel E2E tests.

**Note**: No backup needed for staging (test data only). No types commit (staging types not used).

</staging_workflow>

---

<production_workflow>

## FULL DEPLOYMENT MODE (21 Steps)

Complete promotion pipeline: Local → Staging → Production.

| Step | Action                      | Gate | Phase      |
| ---- | --------------------------- | ---- | ---------- |
| 1    | Verify migration date       | Ask  | Pre-flight |
| 2    | Reset local database        |      | Local      |
| 3    | Run pgTAP tests             |      | Local      |
| 4    | Check for missing tests     | Ask  | Local      |
| 5    | Link to staging             |      | Staging    |
| 6    | Dry-run staging             |      | Staging    |
| 7    | Push to staging             | Ask  | Staging    |
| 8    | Verify staging migration    |      | Staging    |
| 9    | Commit migration to Git     |      | Pre-prod   |
| 10   | Analyze rollback strategy   |      | Pre-prod   |
| 11   | Check locking impact        |      | Pre-prod   |
| 12   | Link to production          |      | Production |
| 13   | Request db push permission  | Ask  | Production |
| 14   | Dry-run production          |      | Production |
| 15   | Get final approval          | Ask  | Production |
| 16   | Create production backup    |      | Production |
| 17   | Push to production          |      | Production |
| 18   | Verify production migration |      | Production |
| 19   | Revoke permission           | Auto | Production |
| 20   | Generate types from prod    |      | Post-prod  |
| 21   | Commit types                |      | Post-prod  |

**Critical**: Each stage must pass before proceeding to the next.

For detailed steps: [references/production-mode.md](references/production-mode.md)
</production_workflow>

---

<approval_gates>

## Approval Gates

| Gate            | Phase | Question                                    |
| --------------- | ----- | ------------------------------------------- |
| Mode            | 0     | "Local, staging, or full deployment?"       |
| Date            | 1     | "Migration dated YYYY-MM-DD correct?"       |
| Tests           | 4     | "Create pgTAP tests for new tables?"        |
| Staging Push    | 7     | "Push to staging?"                          |
| Permission      | 13    | "Enable db push temporarily?"               |
| Production Push | 15    | "Push to production? (shows rollback plan)" |

All gates use AskUserQuestion tool. Stop execution if user declines.
</approval_gates>

---

<safety_rules>

## Safety Rules

### Rule 1: Sequential Promotion

```
Local → Staging → Production
```

Never skip environments. Each must pass before proceeding.

### Rule 2: Never Push Without Approval

Use AskUserQuestion before any `npx supabase db push` - required for safety.

### Rule 3: Follow Exact Workflow Order

Never skip or reorder steps within a phase.

### Rule 4: Stop on Any Error

- Local test fails → STOP, fix migration
- Staging push fails → STOP, investigate
- User denies → STOP, cancel deployment
- Production push fails → STOP, show rollback guidance

### Rule 5: Permission is Temporary

Add pattern to `bash_validator.py` before production push, remove immediately after.

### Rule 6: Backup Before Production Only

Staging uses test data - no backup needed. Production requires backup.
</safety_rules>

---

<quick_reference>

## Quick Reference

### Local Testing

```bash
npx supabase db reset
npm run test:sql
npx supabase gen types typescript --local > src/integrations/supabase/types.ts
```

### Staging Push

```bash
npx supabase link --project-ref <staging-project-id>
npx supabase db push --dry-run
npx supabase db push
```

### Production Push

```bash
npx supabase link --project-ref <production-project-id>
npx supabase db push --dry-run
npx supabase db push
npx supabase gen types typescript --project-id <production-project-id> > src/integrations/supabase/types.ts
```

### Verification (both staging and production)

```sql
SELECT version FROM supabase_migrations.schema_migrations ORDER BY version DESC LIMIT 1;
```

### Production Backup

```bash
PGPASSWORD="<password>" pg_dump -h aws-0-us-west-1.pooler.supabase.com -p 6543 \
  -U postgres.<production-project-id> -d postgres --schema=public -Fc \
  > supabase/backups/pre_migration_$(date +%Y%m%d_%H%M%S).dump
```

</quick_reference>

---

<when_to_use>

## When to Use

**Triggers for this skill:**

- "test migration"
- "push migration"
- "deploy migration"
- "reset local database"
- "apply migration"
- "db push"
- "test and push"
- "push to staging"
- "push to production"

**Ambiguous requests** → Ask which mode first
</when_to_use>

---

<references>

## Reference Files

Detailed documentation:

- [Local Mode Steps](references/local-mode.md)
- [Production Mode Steps](references/production-mode.md)
- [Rollback Patterns](references/rollback-patterns.md)
- [Error Recovery](references/error-recovery.md)
- [Command Reference](references/commands.md)

</references>

---

<version_history>

## Version History

- **v4.0.0** (2026-01-19): Add staging environment to promotion pipeline
  - New 3-tier workflow: Local → Staging → Production
  - Added staging project ID (`<staging-project-id>`)
  - New "Push to staging only" mode for E2E test prep
  - Full deployment now includes staging verification before production
  - Updated approval gates for staging push

- **v3.1.0** (2026-01-18): AI optimization updates
  - Add blockquote summary after title
  - Soften directive language (NON-NEGOTIABLE → required for production safety)

- **v3.0.0** (2025-12-28): Rewrite following Anthropic best practices
  - Progressive disclosure: Core in SKILL.md, details in references/
  - XML tags for structure
  - Imperative form throughout
  - Reduced from 1,071 lines to ~200 lines
- **v2.0.0** (2025-12-28): Merged with migration-test-local skill
  - Mode selection (local test vs production push)
  - Local mode: 7 steps, no commits
  - Production mode: 16 steps, full safety
- **v1.2.0** (2025-12-28): Industry best practices update
  - Test BEFORE commit workflow order
  - Added rollback analysis, locking check, backup, verification
- **v1.0.0** (2025-10-26): Initial release

</version_history>


---

## Referenced Files

> The following files are referenced in this skill and included for context.

### references/local-mode.md

```markdown
# Local Mode - Detailed Steps

Local testing workflow for database migrations. 6 steps, no commits, no production changes.

## Step 1: Verify Migration Date

Check the migration filename and confirm the timestamp:

```bash
ls -la supabase/migrations/*.sql | tail -5
```

Parse the timestamp from filename format: `YYYYMMDDHHMMSS_description.sql`
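The parsing step can be sketched in Python (a hypothetical helper for illustration, not part of the skill itself):

```python
import re
from datetime import datetime

def parse_migration_filename(filename):
    """Split a migration filename into its timestamp and description parts."""
    m = re.match(r"(\d{14})_(.+)\.sql$", filename)
    if not m:
        raise ValueError(f"Unexpected migration filename: {filename}")
    # strptime rejects impossible dates such as month 13 or day 32
    return datetime.strptime(m.group(1), "%Y%m%d%H%M%S"), m.group(2)

ts, desc = parse_migration_filename("20251224000000_salesforce_connections.sql")
print(ts.date(), desc)  # 2025-12-24 salesforce_connections
```

The parsed date is what gets shown back to the user in the confirmation question below.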

Use AskUserQuestion to confirm:

```typescript
{
  questions: [
    {
      question:
        "Found migration '20251224000000_salesforce_connections.sql' dated 2025-12-24. Is this date correct?",
      header: "Date Check",
      options: [
        {
          label: "Yes, date is correct",
          description: "Proceed with this timestamp",
        },
        {
          label: "No, wrong date",
          description: "Stop - rename migration file first",
        },
      ],
      multiSelect: false,
    },
  ];
}
```

## Step 2: Reset Local Database

Run the database reset to apply all migrations:

```bash
npx supabase db reset
```

Expected output:

```
Resetting local database...
Applying migrations...
  20251003165051_bootstrap_admin.sql
  20251025030217_change_duration_days.sql
  20251224000000_salesforce_connections.sql  ← NEW
Seeding data...
Local database ready.
```

If this fails, see [error-recovery.md](error-recovery.md#local-test-fails).

## Step 3: Run pgTAP Tests

Verify database health with pgTAP tests:

```bash
npm run test:sql
```

This runs all tests in `supabase/tests/pgtap/`:

- `00-schema/` - Schema validation
- `20-rls/` - RLS policy tests

If tests fail, stop and show the failure. Fix before proceeding.

## Step 4: Check for Missing Tests

Analyze the migration for new tables/policies that need tests:

1. Read migration file, identify:
   - `CREATE TABLE` statements
   - `CREATE POLICY` statements
   - New functions/triggers

2. Check existing tests in `supabase/tests/pgtap/`

3. If new tables found, ask user:

```typescript
{
  questions: [
    {
      question:
        "Migration creates new table 'salesforce_connections'. Create pgTAP tests?",
      header: "Tests",
      options: [
        {
          label: "Yes, create tests first",
          description: "Create schema and RLS tests",
        },
        { label: "No, skip tests", description: "Proceed without tests" },
      ],
      multiSelect: false,
    },
  ];
}
```

Test file naming:

- Schema: `supabase/tests/pgtap/00-schema/XX-table-name.test.sql`
- RLS: `supabase/tests/pgtap/20-rls/XX-table-name-rls.test.sql`
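The scan described in this step could look roughly like the following Python sketch (the regex and the hyphenated naming convention are assumptions, not the skill's actual implementation):

```python
import re

def tables_missing_tests(migration_sql, existing_test_files):
    """Return CREATE TABLE targets that no existing pgTAP test file mentions."""
    tables = re.findall(
        r"CREATE\s+TABLE\s+(?:IF\s+NOT\s+EXISTS\s+)?(?:public\.)?(\w+)",
        migration_sql,
        re.IGNORECASE,
    )
    # test files use hyphens where table names use underscores
    return [t for t in tables
            if not any(t.replace("_", "-") in f for f in existing_test_files)]

sql = "CREATE TABLE public.salesforce_connections (id UUID PRIMARY KEY);"
print(tables_missing_tests(sql, ["00-schema/01-accounts.test.sql"]))
```

Any table name returned here would trigger the AskUserQuestion prompt above.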

## Step 5: Restart Worker

**Critical**: When database resets, the worker's PostgreSQL connection becomes invalid but doesn't auto-reconnect. Worker appears running but stops processing events.

Find and kill the worker:

```bash
# Windows: Find worker PID
wmic process where "commandline like '%worker%' and name='node.exe'" get processid

# Kill stuck worker
taskkill //F //PID [PID]

# Restart worker
npm run worker:dev
```

Verify worker is processing:

```
[AutomationWorker] Starting automation worker...
[AutomationWorker] Listening for automation_outbox notifications
[AutomationWorker] Started polling every 2000ms
```

## Step 6: Generate Types Locally

Generate TypeScript types from the LOCAL database (not production):

```bash
npx supabase gen types typescript --local > src/integrations/supabase/types.ts
```

**Note**: These types are NOT committed in local mode - they're for local development only.

## Final Output

After successful completion:

```
Local Migration Test COMPLETE!

Migration: 20251224000000_salesforce_connections.sql
Date: 2025-12-24

Results:
  Database reset successful (seed.sql creates test users + analytics data)
  pgTAP tests passed
  Worker restarted
  Types generated (local)

Test Accounts:
  Admin: <test-admin-email> / <test-password>
  User: <test-user-email> / <test-password>

Next Steps:
  1. Test in browser at http://localhost:8080
  2. Log in with a test account
  3. Verify new features work
  4. When ready to deploy, say "push migration to production"

Migration NOT committed to Git yet
Production database NOT changed
```

```

### references/production-mode.md

```markdown
# Production Mode - Detailed Steps

Full 16-step production deployment with all safety checks.

## Steps 1-4: Pre-Commit Testing

Same as [local-mode.md](local-mode.md) steps 1-4:
1. Verify migration date
2. Reset local database (`npx supabase db reset`)
3. Run pgTAP tests (`npm run test:sql`)
4. Check for missing tests

**Critical**: Complete ALL testing before committing!

## Step 5: Commit to Git

Stage and commit migration files:

```bash
git add supabase/migrations/
git commit -m "feat(db): add [migration description]"
```

Pre-commit hook runs automatically:
- Validates SQL syntax with sqlfluff
- Checks for common anti-patterns
- Fails if validation errors found

If commit fails, see [error-recovery.md](error-recovery.md#commit-fails).

## Step 6: Analyze Rollback Strategy

Read the migration file and categorize statements:

| Type | Operations | Rollback |
|------|------------|----------|
| Reversible | `CREATE TABLE`, `ADD COLUMN`, `CREATE INDEX` | `DROP` |
| Partial | `ALTER COLUMN` | May lose precision |
| Irreversible | `DROP TABLE`, `DROP COLUMN`, `TRUNCATE` | Backup restore only |
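The categorization above can be sketched with simple keyword matching (a deliberate simplification; real SQL parsing would need to be more robust):

```python
REVERSIBLE = ("CREATE TABLE", "ADD COLUMN", "CREATE INDEX", "CREATE POLICY")
PARTIAL = ("ALTER COLUMN",)
IRREVERSIBLE = ("DROP TABLE", "DROP COLUMN", "TRUNCATE")

def classify_statement(stmt):
    """Bucket a single SQL statement by how safely it can be rolled back."""
    upper = " ".join(stmt.upper().split())  # normalize case and whitespace
    # check the most dangerous category first so mixed statements escalate
    if any(k in upper for k in IRREVERSIBLE):
        return "irreversible"
    if any(k in upper for k in PARTIAL):
        return "partial"
    if any(k in upper for k in REVERSIBLE):
        return "reversible"
    return "unknown"

print(classify_statement("CREATE TABLE salesforce_connections (id uuid)"))
```

Checking the irreversible keywords first means a statement that both creates and drops something is treated as irreversible.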

Generate rollback SQL:
```sql
-- Example for CREATE TABLE salesforce_connections
DROP TABLE IF EXISTS public.salesforce_connections CASCADE;
```

Show rollback analysis to user:
```
Rollback Analysis:

Migration: 20251224000000_salesforce_connections.sql

Reversible operations:
  CREATE TABLE salesforce_connections → DROP TABLE
  CREATE POLICY ... → DROP POLICY

Irreversible operations:
  None detected

Rollback SQL (save this!):
  DROP TABLE IF EXISTS public.salesforce_connections CASCADE;
```

For irreversible changes, warn explicitly:
```
WARNING: This migration contains IRREVERSIBLE changes!

  - DROP COLUMN users.legacy_email
  - TRUNCATE TABLE temp_imports

Backup REQUIRED before proceeding.
```

See [rollback-patterns.md](rollback-patterns.md) for detailed patterns.

## Step 7: Check Locking Impact

Scan migration for blocking operations:

| Pattern | Impact | Fix |
|---------|--------|-----|
| `ALTER TABLE` without timeout | May lock table | Add `SET lock_timeout = '30s'` |
| `CREATE INDEX` without CONCURRENTLY | Blocks writes | Use `CREATE INDEX CONCURRENTLY` |
| `VACUUM FULL` | Exclusive lock | Schedule during maintenance |
| Large `UPDATE`/`DELETE` | Row locks | Batch operations |

If blocking operations found, warn:
```
Locking Impact Warning:

  Line 15: ALTER TABLE accounts ADD COLUMN ...
    May lock table during operation
    Consider: SET lock_timeout = '30s';

  Line 42: CREATE INDEX idx_sessions_date ...
    Will block writes during creation
    Consider: CREATE INDEX CONCURRENTLY ...
```

Suggested pattern:
```sql
SET statement_timeout = '300s';
SET lock_timeout = '30s';
```
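A rough Python sketch of the lock scan described above (the pattern list and advice strings are assumptions for illustration):

```python
import re

BLOCKING_PATTERNS = [
    (r"\bALTER\s+TABLE\b", "may lock table; consider SET lock_timeout = '30s'"),
    (r"\bCREATE\s+INDEX\b(?!\s+CONCURRENTLY)", "blocks writes; consider CREATE INDEX CONCURRENTLY"),
    (r"\bVACUUM\s+FULL\b", "takes an exclusive lock; schedule during maintenance"),
]

def locking_warnings(sql):
    """Return (line number, advice) pairs for statements that may block traffic."""
    warnings = []
    for lineno, line in enumerate(sql.splitlines(), start=1):
        for pattern, advice in BLOCKING_PATTERNS:
            if re.search(pattern, line, re.IGNORECASE):
                warnings.append((lineno, advice))
    return warnings

sql = "ALTER TABLE accounts ADD COLUMN plan TEXT;\nCREATE INDEX CONCURRENTLY idx_plan ON accounts (plan);"
print(locking_warnings(sql))
```

The negative lookahead skips indexes already built with `CONCURRENTLY`, so only the genuinely blocking form is flagged.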

## Step 8: Request Permission

The `npx supabase db push` command is blocked by `bash_validator.py` for safety.

Use AskUserQuestion:
```typescript
{
  questions: [{
    question: "To push migrations, I need to temporarily enable 'supabase db push'. Allow this?",
    header: "Permission",
    options: [
      { label: "Yes, enable temporarily", description: "Auto-revoked after push" },
      { label: "No, cancel", description: "Run command manually" }
    ],
    multiSelect: false
  }]
}
```

If approved, modify `.claude/hooks/bash_validator.py`:

Add to `ALLOWED_PATTERNS` array:
```python
r"supabase\s+db\s+push",  # TEMPORARY: migration-test-and-push skill
```
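The add-then-revoke cycle could be automated along these lines (a hypothetical helper; the real hook file layout may differ):

```python
TEMP_PATTERN = 'r"supabase\\s+db\\s+push",  # TEMPORARY: migration-test-and-push skill'

def toggle_push_permission(lines, enable):
    """Insert or remove the temporary allow-pattern line, idempotently."""
    # always strip any existing copy first so repeated calls never duplicate it
    lines = [l for l in lines if TEMP_PATTERN not in l]
    if enable:
        # insert just after the ALLOWED_PATTERNS opening line
        for i, l in enumerate(lines):
            if "ALLOWED_PATTERNS" in l:
                lines.insert(i + 1, "    " + TEMP_PATTERN)
                break
    return lines

hook = ["ALLOWED_PATTERNS = [", "]"]
print(toggle_push_permission(hook, enable=True))
```

Running it again with `enable=False` restores the original file, which is the automatic revocation in Step 14.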

## Step 9: Dry-Run Preview

Show what will be pushed:

```bash
npx supabase db push --dry-run
```

Output:
```
Would apply the following migrations:
  20251224000000_salesforce_connections.sql

No changes will be made to the database.
```

**Note**: Dry-run does NOT validate SQL syntax - only shows which files would apply.

## Step 10: Get Final Approval

Use AskUserQuestion (REQUIRED):
```typescript
{
  questions: [{
    question: "Ready to push migration to PRODUCTION database?",
    header: "Production Push",
    options: [
      { label: "Yes, push now", description: "Apply to live database (irreversible)" },
      { label: "No, cancel", description: "Keep migration local only" }
    ],
    multiSelect: false
  }]
}
```

Only proceed if user explicitly approves.

## Step 11: Create Production Backup

Before pushing, create a backup:

```bash
mkdir -p supabase/backups

PGPASSWORD="<db_password>" pg_dump \
  -h aws-0-us-west-1.pooler.supabase.com \
  -p 6543 \
  -U postgres.<production-project-id> \
  -d postgres \
  --schema=public \
  -Fc \
  > supabase/backups/pre_migration_$(date +%Y%m%d_%H%M%S).dump
```

The password is in `.env.local` as `SUPABASE_DB_PASSWORD`.

Show confirmation:
```
Production Backup Created

  File: supabase/backups/pre_migration_20251228_143052.dump
  Size: 2.4 MB
  Schema: public

  To restore:
    pg_restore -h <host> -U <user> -d postgres --clean <file>
```

## Step 12: Push to Production

Execute the push:

```bash
npx supabase db push
```

Expected output:
```
Applying migrations to remote database...
  20251224000000_salesforce_connections.sql
Migrated supabase/migrations/20251224000000_salesforce_connections.sql
```

If push fails, see [error-recovery.md](error-recovery.md#production-push-fails).

## Step 13: Verify Migration Applied

Query the schema_migrations table:

```bash
PGPASSWORD="<password>" psql \
  -h aws-0-us-west-1.pooler.supabase.com \
  -p 6543 \
  -U postgres.<production-project-id> \
  -d postgres \
  -c "SELECT version, statements_applied_at FROM supabase_migrations.schema_migrations ORDER BY version DESC LIMIT 3;"
```

Expected output:
```
      version       |    statements_applied_at
--------------------+----------------------------
 20251224000000     | 2025-12-28 14:31:05.123456  ← NEW
 20251220123456     | 2025-12-20 12:35:00.000000
```

Confirm:
```
Migration Verified

  Version: 20251224000000_salesforce_connections
  Applied at: 2025-12-28 14:31:05 UTC

  Migration is now LIVE in production!
```

## Step 14: Revoke Permission (Automatic)

Remove the temporary pattern from `.claude/hooks/bash_validator.py`:

Remove this line from `ALLOWED_PATTERNS`:
```python
r"supabase\s+db\s+push",  # TEMPORARY: migration-test-and-push skill
```

No user approval needed - this is security restoration.

## Step 15: Generate Types

Generate TypeScript types from production:

```bash
npx supabase gen types typescript --project-id <production-project-id> > src/integrations/supabase/types.ts
```

If this fails, see [error-recovery.md](error-recovery.md#type-generation-fails).

## Step 16: Commit Types

Stage and commit the updated types:

```bash
git add src/integrations/supabase/types.ts
git commit -m "chore: update Supabase types after migration"
```

## Final Output

```
Migration Test & Push COMPLETE!

Timeline:
  1. Committed: 20251224000000_salesforce_connections.sql
  2. Local test: PASSED
  3. User approved production push
  4. Backup created: pre_migration_20251228_143052.dump
  5. Pushed to production
  6. Verified: Migration applied at 14:31:05 UTC
  7. Types generated and committed

Your migration is now LIVE in production!

Next steps:
  - Test the feature in production UI
  - Monitor error logs for issues
  - If problems occur, create rollback migration
```

```

### references/rollback-patterns.md

```markdown
# Rollback Patterns

Analysis patterns for migration rollback planning.

## Operation Categories

### Reversible Operations

These can be rolled back with a simple DROP statement:

| Operation | Rollback |
|-----------|----------|
| `CREATE TABLE table_name` | `DROP TABLE IF EXISTS table_name CASCADE` |
| `CREATE INDEX idx_name` | `DROP INDEX IF EXISTS idx_name` |
| `CREATE POLICY policy_name` | `DROP POLICY IF EXISTS policy_name ON table_name` |
| `CREATE FUNCTION func_name` | `DROP FUNCTION IF EXISTS func_name` |
| `CREATE TRIGGER trigger_name` | `DROP TRIGGER IF EXISTS trigger_name ON table_name` |
| `ADD COLUMN col_name` | `ALTER TABLE t DROP COLUMN IF EXISTS col_name` |
| `ADD CONSTRAINT` | `ALTER TABLE t DROP CONSTRAINT IF EXISTS name` |

Example rollback script:
```sql
-- Rollback for 20251224000000_salesforce_connections.sql
BEGIN;

DROP POLICY IF EXISTS "Users can view own connections" ON salesforce_connections;
DROP POLICY IF EXISTS "Users can manage own connections" ON salesforce_connections;
DROP TABLE IF EXISTS public.salesforce_connections CASCADE;

COMMIT;
```
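Generating DROP statements for the reversible cases can be sketched as follows (a naive regex mapping, for illustration only):

```python
import re

def rollback_for(stmt):
    """Produce a rollback statement for a reversible forward operation, or None."""
    m = re.match(r"CREATE\s+TABLE\s+(?:IF\s+NOT\s+EXISTS\s+)?([\w.]+)", stmt, re.IGNORECASE)
    if m:
        return f"DROP TABLE IF EXISTS {m.group(1)} CASCADE;"
    m = re.match(r"CREATE\s+INDEX\s+(\w+)", stmt, re.IGNORECASE)
    if m:
        return f"DROP INDEX IF EXISTS {m.group(1)};"
    m = re.match(r'CREATE\s+POLICY\s+("[^"]+")\s+ON\s+([\w.]+)', stmt, re.IGNORECASE)
    if m:
        return f"DROP POLICY IF EXISTS {m.group(1)} ON {m.group(2)};"
    return None  # partially reversible or irreversible: handle manually

print(rollback_for("CREATE TABLE public.salesforce_connections (id uuid)"))
```

When assembling the full rollback script, emit the generated statements in the opposite order of the forward migration.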

### Partially Reversible Operations

These can be reversed but may lose data:

| Operation | Impact | Rollback |
|-----------|--------|----------|
| `ALTER COLUMN type` (expanding) | Safe | Reverse ALTER |
| `ALTER COLUMN type` (shrinking) | Data truncation | May fail |
| `ALTER COLUMN SET DEFAULT` | Previous default lost | Set old default |
| `ALTER COLUMN SET NOT NULL` | Nulls filled with default | DROP NOT NULL |

Example:
```sql
-- Forward: Expand varchar
ALTER TABLE users ALTER COLUMN name TYPE VARCHAR(500);

-- Rollback: Shrink (may fail if data > 255)
ALTER TABLE users ALTER COLUMN name TYPE VARCHAR(255);
```

### Irreversible Operations

These CANNOT be rolled back without a backup:

| Operation | Data Loss | Recovery |
|-----------|-----------|----------|
| `DROP TABLE` | Complete | Restore from backup |
| `DROP COLUMN` | Column data | Restore from backup |
| `TRUNCATE` | All rows | Restore from backup |
| `DELETE` (mass) | Deleted rows | Restore from backup |
| Data migration | Original values | Restore from backup |

**Always create backup before these operations!**

## Rollback Script Generation

For each migration, generate rollback SQL:

### Template
```sql
-- Rollback for [migration_file]
-- WARNING: Run only if forward migration needs to be undone
-- Created: [date]

BEGIN;

-- Reverse operations in OPPOSITE order of migration

-- [rollback statements here]

COMMIT;
```

### Example: New Table with Policies
```sql
-- Forward migration
CREATE TABLE salesforce_connections (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  user_id UUID REFERENCES auth.users(id),
  instance_url TEXT NOT NULL,
  created_at TIMESTAMPTZ DEFAULT now()
);

ALTER TABLE salesforce_connections ENABLE ROW LEVEL SECURITY;

CREATE POLICY "Users can view own" ON salesforce_connections
  FOR SELECT USING (auth.uid() = user_id);
```

```sql
-- Rollback script
BEGIN;

DROP POLICY IF EXISTS "Users can view own" ON salesforce_connections;
DROP TABLE IF EXISTS public.salesforce_connections CASCADE;

COMMIT;
```

### Example: Adding Columns
```sql
-- Forward migration
ALTER TABLE sessions ADD COLUMN sf_opportunity_id TEXT;
ALTER TABLE sessions ADD COLUMN sf_last_sync TIMESTAMPTZ;
```

```sql
-- Rollback script
BEGIN;

ALTER TABLE sessions DROP COLUMN IF EXISTS sf_last_sync;
ALTER TABLE sessions DROP COLUMN IF EXISTS sf_opportunity_id;

COMMIT;
```

## Rollback Decision Matrix

| Situation | Action |
|-----------|--------|
| Migration failed mid-execution | Assess damage, may need manual cleanup |
| Feature needs to be reverted | Run rollback script if reversible |
| Data corruption detected | Restore from backup |
| Performance issues | Analyze, may need index adjustment |

## Pre-Migration Checklist

Before any migration with destructive operations:

- [ ] Backup created (pg_dump)
- [ ] Rollback script prepared
- [ ] Rollback script tested in dev
- [ ] Team notified of potential rollback
- [ ] Monitoring alerts configured
- [ ] Runbook for rollback procedure documented

```

### references/error-recovery.md

```markdown
# Error Recovery

Common error scenarios and their fixes.

## Commit Fails (sqlfluff errors) {#commit-fails}

**Problem**: SQL syntax errors or anti-patterns detected by pre-commit hook.

**Symptoms**:
```
sqlfluff validation failed!
  Line 15: Missing semicolon
  Line 32: Column "email" does not exist
```

**Fix**:
1. Read error messages carefully
2. Fix SQL in migration file
3. Run skill again (re-attempts commit)

**No database changes made** - safe to retry.

## Local Test Fails {#local-test-fails}

**Problem**: Migration breaks on `npx supabase db reset`.

**Common causes**:
- Column already exists
- Table already exists
- Constraint violation
- Missing dependency

**Symptoms**:
```
ERROR: column "email" already exists
LINE 15: ALTER TABLE users ADD COLUMN email VARCHAR(255);
```

**Fix - Make idempotent**:
```sql
-- BAD: Fails if column exists
ALTER TABLE users ADD COLUMN email VARCHAR(255);

-- GOOD: Idempotent
DO $$
BEGIN
  IF NOT EXISTS (
    SELECT 1 FROM information_schema.columns
    WHERE table_name='users' AND column_name='email'
  ) THEN
    ALTER TABLE users ADD COLUMN email VARCHAR(255);
  END IF;
END $$;
```

**Recovery**:
1. Fix migration SQL
2. Run `npx supabase db reset`
3. Run skill again

**No production changes made** - safe to retry.

## Production Push Fails {#production-push-fails}

**Problem**: Migration works locally but fails in production.

**Common causes**:
- Production has different data than local
- Migration assumes empty tables
- Constraint violations with real data
- Foreign key references

**Symptoms**:
```
ERROR: relation "user_preferences" does not exist
LINE 42: INSERT INTO user_preferences...
```

**Important**:
- **DO NOT** try pushing same migration again
- Production is UNCHANGED (still safe)
- Local is now OUT OF SYNC with production

**Fix**:
1. Analyze why production differs from local
2. Create NEW corrective migration
3. Test with realistic data
4. Deploy new migration

**To resync local with production**:
```bash
# Delete failed migration locally
rm supabase/migrations/20251224000000_failing_migration.sql

# Reset local
npx supabase db reset
```

## Type Generation Fails {#type-generation-fails}

**Problem**: Cannot connect to Supabase or generate types.

**Symptoms**:
```
Error connecting to Supabase project
```

**Fix - Run manually**:
```bash
npx supabase gen types typescript --project-id <production-project-id> > src/integrations/supabase/types.ts
git add src/integrations/supabase/types.ts
git commit -m "chore: update Supabase types after migration"
```

**Note**: Migration is still successful - types are just documentation.

## Permission Denied (BLOCKED by hook) {#blocked-by-hook}

**Problem**: Bash validator blocks `db push` command.

**Symptoms**:
```
BLOCKED: supabase db push to remote (use --local)
```

**Fix**:
This is expected! The skill handles permission management automatically.

If running manually:
1. Edit `.claude/hooks/bash_validator.py`
2. Add `r"supabase\s+db\s+push"` to `ALLOWED_PATTERNS`
3. Run the push command
4. Remove the pattern after push

## Project Not Linked {#project-not-linked}

**Problem**: Supabase CLI doesn't know which project to push to.

**Symptoms**:
```
Cannot find project ref. Have you run supabase link?
```

**Fix**:
```bash
npx supabase link --project-ref <production-project-id>
```

## Migration Files Out of Order {#out-of-order}

**Problem**: Local migrations have timestamps before last remote migration.

**Symptoms**:
```
Found local migration files to be inserted before the last migration on remote database
```

**Fix**:
```bash
# Force apply out-of-order migrations
npx supabase db push --include-all
```

**Prevention**: Always use current timestamp when creating migrations.
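A quick pre-push check for this condition might look like the following sketch (it assumes you already have the latest remote version string from `schema_migrations`):

```python
def out_of_order(local_versions, latest_remote_version):
    """Return local migration versions that sort before the newest remote one."""
    # 14-digit YYYYMMDDHHMMSS strings compare correctly as plain strings
    return sorted(v for v in local_versions if v < latest_remote_version)

local = ["20251220123456", "20251224000000", "20251218090000"]
print(out_of_order(local, "20251222000000"))
```

An empty result means a plain `db push` is safe; anything returned would need `--include-all` or renaming.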

## pgTAP Tests Fail {#pgtap-fails}

**Problem**: Database tests fail after reset.

**Symptoms**:
```
not ok 1 - Table 'salesforce_connections' should exist
```

**Common causes**:
- Migration has syntax error
- Table name mismatch (test vs migration)
- RLS policy name mismatch

**Fix**:
1. Check migration file for typos
2. Verify test assertions match migration
3. Run `npx supabase db reset` to retry

## Worker Not Processing {#worker-stuck}

**Problem**: After db reset, worker stops processing events.

**Symptoms**:
- Automation events not processed
- No worker log output
- Worker appears running but idle

**Cause**: The database reset invalidates the worker's PostgreSQL connection, and the worker does not reconnect automatically.

**Fix**:
```bash
# Find and kill the worker (Windows; the doubled slashes escape flags for Git Bash)
wmic process where "commandline like '%worker%' and name='node.exe'" get processid
taskkill //F //PID <PID>

# Restart
npm run worker:dev
```
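The `wmic`/`taskkill` pair above is Windows-specific. On macOS or Linux, a sketch of the same restart, assuming the worker's command line contains the string `worker`:

```shell
# Kill any node process whose command line mentions "worker", then restart.
# pkill -f matches against the full command line; || true ignores "no process found".
restart_worker() {
  pkill -f "node.*worker" 2>/dev/null || true
  npm run worker:dev
}
```

Call `restart_worker` after each `db reset` that leaves the worker idle.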

## Backup Restore Needed

**Problem**: A migration corrupted data and the database must be restored from backup.

**Restore command**:
```bash
PGPASSWORD="<password>" pg_restore \
  -h aws-0-us-west-1.pooler.supabase.com \
  -p 6543 \
  -U postgres.<production-project-id> \
  -d postgres \
  --clean \
  supabase/backups/pre_migration_YYYYMMDD_HHMMSS.dump
```

**Warning**: This OVERWRITES current production data with backup state.
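To avoid typing the timestamp by hand, the newest dump can be selected by modification time. A sketch, assuming backups live under `supabase/backups/` as above:

```shell
# Pick the most recently written backup dump, if any exist.
LATEST=$(ls -t supabase/backups/pre_migration_*.dump 2>/dev/null | head -n 1)
if [ -z "$LATEST" ]; then
  echo "no backups found in supabase/backups/" >&2
else
  echo "latest backup: $LATEST"   # pass this path to pg_restore
fi
```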

```

### references/commands.md

```markdown
# Command Reference

All commands used by the migration skill.

## Project Constants

| Environment | Project ID                | Pooler Host                           | Pooler Port | Username                           |
| ----------- | ------------------------- | ------------------------------------- | ----------- | ---------------------------------- |
| Staging     | `<staging-project-id>`    | `aws-0-us-west-2.pooler.supabase.com` | `5432`      | `postgres.<staging-project-id>`    |
| Production  | `<production-project-id>` | `aws-0-us-west-1.pooler.supabase.com` | `6543`      | `postgres.<production-project-id>` |

## Pre-Flight Checks

```bash
# List migration files
ls -la supabase/migrations/*.sql

# Check git status
git status

# Check Supabase status
npx supabase status

# Link to staging
npx supabase link --project-ref <staging-project-id>

# Link to production
npx supabase link --project-ref <production-project-id>
```
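The checks above can be folded into a single fail-fast helper. A sketch using only the commands already listed; it just reports, so it is safe to run repeatedly:

```shell
# Run all pre-flight checks; return non-zero on the first failure.
preflight() {
  ls supabase/migrations/*.sql >/dev/null 2>&1 \
    || { echo "no migration files found" >&2; return 1; }
  [ -z "$(git status --porcelain)" ] \
    || { echo "git working tree is not clean" >&2; return 1; }
  npx supabase status >/dev/null \
    || { echo "local Supabase is not running" >&2; return 1; }
  echo "pre-flight OK"
}
```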

## Local Testing

```bash
# Reset local database (applies all migrations)
npx supabase db reset

# Run pgTAP tests
npm run test:sql

# Generate types from local database
npx supabase gen types typescript --local > src/integrations/supabase/types.ts
```

## Worker Management

```bash
# Find worker PID (Windows; the doubled slashes escape flags for Git Bash)
wmic process where "commandline like '%worker%' and name='node.exe'" get processid

# Kill worker
taskkill //F //PID <PID>

# Restart worker
npm run worker:dev
```

## Staging Push

```bash
# Link to staging first
npx supabase link --project-ref <staging-project-id>

# Dry-run (show what would be pushed)
npx supabase db push --dry-run

# Push to staging
npx supabase db push

# Verify staging migration
PGPASSWORD="<password>" psql \
  -h aws-0-us-west-2.pooler.supabase.com \
  -p 5432 \
  -U postgres.<staging-project-id> \
  -d postgres \
  -c "SELECT version FROM supabase_migrations.schema_migrations ORDER BY version DESC LIMIT 1;"
```

## Production Push

```bash
# Link to production first
npx supabase link --project-ref <production-project-id>

# Dry-run (show what would be pushed)
npx supabase db push --dry-run

# Push to production (requires permission)
npx supabase db push

# Force push out-of-order migrations
npx supabase db push --include-all
```

## Backup

```bash
# Create backup directory
mkdir -p supabase/backups

# Backup production database
PGPASSWORD="<password>" pg_dump \
  -h aws-0-us-west-1.pooler.supabase.com \
  -p 6543 \
  -U postgres.<production-project-id> \
  -d postgres \
  --schema=public \
  -Fc \
  > supabase/backups/pre_migration_$(date +%Y%m%d_%H%M%S).dump

# Restore from backup
PGPASSWORD="<password>" pg_restore \
  -h aws-0-us-west-1.pooler.supabase.com \
  -p 6543 \
  -U postgres.<production-project-id> \
  -d postgres \
  --clean \
  supabase/backups/<backup_file>.dump
```
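Dumps accumulate quickly. A small retention sketch (the keep-five policy is an assumption; adjust to taste):

```shell
# Delete all but the five newest pre-migration dumps.
prune_backups() {
  ls -t supabase/backups/pre_migration_*.dump 2>/dev/null \
    | tail -n +6 \
    | while read -r f; do rm -- "$f"; done
}
```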

## Type Generation

```bash
# From production (after push)
npx supabase gen types typescript --project-id <production-project-id> > src/integrations/supabase/types.ts

# From local (during testing)
npx supabase gen types typescript --local > src/integrations/supabase/types.ts
```
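A quick drift check makes it obvious when the committed types no longer match the local schema. A sketch, assuming the paths used above:

```shell
# Regenerate types to a scratch file and compare with the committed copy.
types_in_sync() {
  npx supabase gen types typescript --local > /tmp/types_check.ts || return 2
  diff -q /tmp/types_check.ts src/integrations/supabase/types.ts >/dev/null
}
```

`types_in_sync || echo "types are stale"` fits naturally into a pre-commit hook.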

## Verification

```bash
# Check migration status in staging
PGPASSWORD="<password>" psql \
  -h aws-0-us-west-2.pooler.supabase.com \
  -p 5432 \
  -U postgres.<staging-project-id> \
  -d postgres \
  -c "SELECT version, statements_applied_at FROM supabase_migrations.schema_migrations ORDER BY version DESC LIMIT 5;"

# Check migration status in production
PGPASSWORD="<password>" psql \
  -h aws-0-us-west-1.pooler.supabase.com \
  -p 6543 \
  -U postgres.<production-project-id> \
  -d postgres \
  -c "SELECT version, statements_applied_at FROM supabase_migrations.schema_migrations ORDER BY version DESC LIMIT 5;"
```

## Git Operations

```bash
# Stage migration files
git add supabase/migrations/

# Commit migration
git commit -m "feat(db): add [description]"

# Stage types file
git add src/integrations/supabase/types.ts

# Commit types
git commit -m "chore: update Supabase types after migration"
```

## Environment Variables

| Variable                 | Location     | Purpose                      |
| ------------------------ | ------------ | ---------------------------- |
| `SUPABASE_DB_PASSWORD`   | `.env.local` | Production database password |
| `VITE_SUPABASE_URL`      | `.env.local` | Supabase project URL         |
| `VITE_SUPABASE_ANON_KEY` | `.env.local` | Supabase anon key            |

## Pre-Flight Checklist

Before starting:

- [ ] Migration file exists in `supabase/migrations/`
- [ ] Migration uses idempotent patterns (IF NOT EXISTS)
- [ ] RLS policies included for new tables
- [ ] Git working directory is clean
- [ ] Local Supabase is running
- [ ] Time available to monitor production after push
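The idempotency item above can be spot-checked mechanically. A heuristic sketch (the guard list is an assumption; it only flags files containing none of the common idempotent patterns):

```shell
# Warn when a migration file contains no idempotent guard at all.
check_idempotent() {
  grep -qiE 'IF NOT EXISTS|CREATE OR REPLACE|IF EXISTS' "$1" \
    || echo "warning: $1 has no idempotent guard"
}
```

Run it across all migrations with `for f in supabase/migrations/*.sql; do check_idempotent "$f"; done`.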

```
