undetectable-ai
Make AI text undetectable. TypeScript CLI that scans ChatGPT, Claude, OpenClaw output for patterns flagged by GPTZero, Turnitin, Originality.ai, then auto-fixes them to pass as human-written. Use to bypass AI detection, evade checkers, or humanize text before submission.
Packaged view
This page reorganizes the original catalog entry around fit, installability, and workflow context first. The original raw source lives below.
Install command
npx @skill-hub/cli install openclaw-skills-undetectable-ai
Repository
Skill path: skills/artur-zhdan/undetectable-ai
Open repository
Best for
Primary workflow: Analyze Data & AI.
Technical facets: Full Stack, Data / AI.
Target audience: everyone.
License: Unknown.
Original source
Catalog source: SkillHub Club.
Repository owner: openclaw.
This is a mirrored public skill entry. Review the repository before installing it into production workflows.
What it helps with
- Install undetectable-ai into Claude Code, Codex CLI, Gemini CLI, or OpenCode workflows
- Review https://github.com/openclaw/skills before adding undetectable-ai to shared team environments
- Use undetectable-ai for development workflows
Works across
Favorites: 0.
Sub-skills: 0.
Aggregator: No.
Original source / Raw SKILL.md
---
name: undetectable-ai
description: Make AI text undetectable. TypeScript CLI that scans ChatGPT, Claude, OpenClaw output for patterns flagged by GPTZero, Turnitin, Originality.ai, then auto-fixes them to pass as human-written. Use to bypass AI detection, evade checkers, or humanize text before submission.
allowed-tools:
- Read
- Write
- StrReplace
- Shell
- Glob
---
# Undetectable AI
TypeScript CLI tools for bypassing AI content detectors by removing telltale patterns.
## Setup
```bash
cd scripts && npm install
```
## Scripts
### analyze.ts — Scan for Detectable Patterns
Finds patterns that AI detectors flag: AI vocabulary, puffery, chatbot artifacts, hedging phrases, em dashes, and curly quotes.
```bash
# Analyze a file
npx ts-node scripts/analyze.ts input.txt
# From stdin
echo "This serves as a testament" | npx ts-node scripts/analyze.ts
# JSON output
npx ts-node scripts/analyze.ts input.txt --json
```
**Output:**
```
==================================================
AI DETECTION SCAN - 5 issues found
==================================================
AI VOCABULARY:
• testament: 1x
• crucial: 2x
AUTO-FIXABLE:
• "serves as" → "is": 1x
```
---
### transform.ts — Auto-Fix Patterns
Rewrites text to evade detection.
```bash
# Transform and print
npx ts-node scripts/transform.ts input.txt
# Write to file
npx ts-node scripts/transform.ts input.txt -o output.txt
# Fix em dashes too
npx ts-node scripts/transform.ts input.txt --fix-dashes
# Quiet mode
npx ts-node scripts/transform.ts input.txt -q
```
**What it fixes:**
- Filler phrases: "in order to" → "to"
- AI vocabulary: "utilize" → "use", "leverage" → "use"
- Sentence starters: removes "Additionally,", "Furthermore,"
- Chatbot artifacts: removes entire sentences with "I hope this helps", etc.
- Curly quotes → straight quotes
- Capitalization after removals
---
## Workflow
1. **Scan** to see detection risk:
```bash
npx ts-node scripts/analyze.ts essay.txt
```
2. **Auto-fix** mechanical patterns:
```bash
npx ts-node scripts/transform.ts essay.txt -o essay_clean.txt
```
3. **Manual pass** for flagged AI vocabulary (requires judgment)
4. **Re-scan** to verify:
```bash
npx ts-node scripts/analyze.ts essay_clean.txt
```
---
## Customizing
Edit `scripts/patterns.json`:
- `ai_words` — vocabulary to flag (manual fix needed)
- `puffery` — promotional language to flag
- `replacements` — auto-replace mappings
- `chatbot_artifacts` — phrases that trigger full sentence removal
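For example, entries you might merge into the existing file (the keys are the real ones from `patterns.json`; the values here are illustrative additions, not part of the shipped config):

```json
{
  "ai_words": ["elevate"],
  "puffery": ["world-class"],
  "replacements": {
    "a wide range of": "many"
  },
  "chatbot_artifacts": ["As an AI"]
}
```

Note that `analyze.ts` reads every key unconditionally, so keep all five top-level keys present in the file even if some arrays are empty.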
---
## Batch Processing
```bash
# Scan all docs
for f in *.txt; do
echo "=== $f ==="
npx ts-node scripts/analyze.ts "$f"
done
# Transform all
for f in *.md; do
npx ts-node scripts/transform.ts "$f" -o "${f%.md}_clean.md" -q
done
```
---
## Referenced Files
> The following files are referenced in this skill and included for context.
### scripts/analyze.ts
```typescript
#!/usr/bin/env npx ts-node
import { readFileSync } from "fs";
import { dirname, join } from "path";
import { fileURLToPath } from "url";
const __filename = fileURLToPath(import.meta.url);
const __dirname = dirname(__filename);
const PATTERNS = JSON.parse(readFileSync(join(__dirname, "patterns.json"), "utf-8"));
type Match = [string, number];
interface Results {
ai_words: Match[];
puffery: Match[];
chatbot_artifacts: Match[];
hedging: Match[];
em_dashes: number;
curly_quotes: number;
replaceable: Match[];
total_issues: number;
}
// Escape regex metacharacters so pattern strings are always matched literally
// (transform.ts already does this; analyze.ts must match the same way).
const escapeRegExp = (s: string): string => s.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
const findMatches = (text: string, patterns: string[]): Match[] => {
  const lower = text.toLowerCase();
  return patterns
    .map((p): Match => [p, (lower.match(new RegExp(escapeRegExp(p.toLowerCase()), "g")) || []).length])
    .filter(([, c]) => c > 0)
    .sort((a, b) => b[1] - a[1]);
};
const analyze = (text: string): Results => {
  const results: Results = {
    ai_words: findMatches(text, PATTERNS.ai_words),
    puffery: findMatches(text, PATTERNS.puffery),
    chatbot_artifacts: findMatches(text, PATTERNS.chatbot_artifacts),
    hedging: findMatches(text, PATTERNS.hedging_phrases),
    em_dashes: (text.match(/—|--/g) || []).length,
    curly_quotes: (text.match(/[“”]/g) || []).length,
    replaceable: Object.keys(PATTERNS.replacements)
      .map((k): Match => [k, (text.toLowerCase().match(new RegExp(escapeRegExp(k.toLowerCase()), "g")) || []).length])
      .filter(([, c]) => c > 0),
total_issues: 0,
};
results.total_issues =
results.ai_words.reduce((s, [, c]) => s + c, 0) +
results.puffery.reduce((s, [, c]) => s + c, 0) +
results.chatbot_artifacts.reduce((s, [, c]) => s + c, 0) +
results.hedging.reduce((s, [, c]) => s + c, 0) +
results.em_dashes +
results.curly_quotes +
results.replaceable.reduce((s, [, c]) => s + c, 0);
return results;
};
const printReport = (r: Results) => {
console.log(`\n${"=".repeat(50)}`);
console.log(`AI DETECTION SCAN - ${r.total_issues} issues found`);
console.log(`${"=".repeat(50)}\n`);
if (r.ai_words.length) {
console.log("AI VOCABULARY:");
r.ai_words.forEach(([w, c]) => console.log(` • ${w}: ${c}x`));
console.log();
}
if (r.puffery.length) {
console.log("PUFFERY/PROMOTIONAL:");
r.puffery.forEach(([w, c]) => console.log(` • ${w}: ${c}x`));
console.log();
}
if (r.chatbot_artifacts.length) {
console.log("CHATBOT ARTIFACTS:");
r.chatbot_artifacts.forEach(([p, c]) => console.log(` • "${p}": ${c}x`));
console.log();
}
if (r.hedging.length) {
console.log("EXCESSIVE HEDGING:");
r.hedging.forEach(([p, c]) => console.log(` • "${p}": ${c}x`));
console.log();
}
if (r.replaceable.length) {
console.log("AUTO-FIXABLE:");
r.replaceable.forEach(([p, c]) => {
const repl = PATTERNS.replacements[p];
console.log(` • "${p}" → ${repl ? `"${repl}"` : "(remove)"}: ${c}x`);
});
console.log();
}
if (r.em_dashes > 2) console.log(`EM DASHES: ${r.em_dashes} (consider reducing)\n`);
if (r.curly_quotes) console.log(`CURLY QUOTES: ${r.curly_quotes} (replace with straight)\n`);
if (r.total_issues === 0) console.log("✓ Text appears human-written.\n");
};
const main = () => {
const args = process.argv.slice(2);
const jsonOutput = args.includes("--json");
const file = args.find((a) => !a.startsWith("-"));
let text: string;
if (file) {
text = readFileSync(file, "utf-8");
} else {
text = readFileSync(0, "utf-8");
}
const results = analyze(text);
if (jsonOutput) {
console.log(JSON.stringify(results, null, 2));
} else {
printReport(results);
}
};
main();
```
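One caveat worth knowing when reading `findMatches`: matching is substring-based, not whole-word, so a flagged word also counts inside longer words ("crucial" matches inside "crucially"). A minimal self-contained sketch of equivalent counting logic, using `split` instead of a regex to sidestep escaping entirely:

```typescript
// Standalone sketch of the counting approach in analyze.ts.
// Substring-based: "crucial" also counts inside "crucially".
type Match = [string, number];

const countMatches = (text: string, patterns: string[]): Match[] => {
  const lower = text.toLowerCase();
  return patterns
    .map((p): Match => [p, lower.split(p.toLowerCase()).length - 1]) // occurrence count
    .filter(([, c]) => c > 0)
    .sort((a, b) => b[1] - a[1]);
};

console.log(countMatches("A crucial point, crucially argued.", ["crucial", "delve"]));
// → [ [ 'crucial', 2 ] ]
```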
### scripts/transform.ts
```typescript
#!/usr/bin/env npx ts-node
import { readFileSync, writeFileSync } from "fs";
import { dirname, join } from "path";
import { fileURLToPath } from "url";
const __filename = fileURLToPath(import.meta.url);
const __dirname = dirname(__filename);
const PATTERNS = JSON.parse(readFileSync(join(__dirname, "patterns.json"), "utf-8"));
const replacePhrases = (text: string): [string, string[]] => {
const changes: string[] = [];
for (const [old, replacement] of Object.entries(PATTERNS.replacements)) {
const flags = "gi";
const pattern = old.includes(" ") || old.endsWith(",")
? new RegExp(old.replace(/[.*+?^${}()|[\]\\]/g, "\\$&"), flags)
: new RegExp(`\\b${old.replace(/[.*+?^${}()|[\]\\]/g, "\\$&")}\\b`, flags);
const matches = text.match(pattern);
if (matches) {
changes.push(`"${old}" → ${replacement ? `"${replacement}"` : "(removed)"} (${matches.length}x)`);
text = text.replace(pattern, replacement as string);
}
}
return [text, changes];
};
const fixCurlyQuotes = (text: string): [string, boolean] => {
  const original = text;
  // Normalize typographic (curly) quotes to straight ASCII quotes.
  text = text.replace(/[“”]/g, '"').replace(/[‘’]/g, "'");
  return [text, text !== original];
};
const removeChatbotArtifacts = (text: string): [string, string[]] => {
const changes: string[] = [];
for (const artifact of PATTERNS.chatbot_artifacts) {
const escaped = artifact.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
const pattern = new RegExp(`[^.!?\\n]*${escaped}[^.!?\\n]*[.!?]?\\s*`, "gi");
if (pattern.test(text)) {
changes.push(`Removed sentence with "${artifact}"`);
text = text.replace(pattern, "");
}
}
return [text, changes];
};
const reduceEmDashes = (text: string): [string, number] => {
  // Count both true em dashes and double-hyphen stand-ins, since both get replaced.
  const count = (text.match(/—|--/g) || []).length;
  text = text.replace(/\s*—\s*/g, ", ").replace(/\s*--\s*/g, ", ");
  return [text, count];
};
const cleanWhitespace = (text: string): string => {
return text.replace(/ +/g, " ").replace(/\n{3,}/g, "\n\n").replace(/^\s+/gm, "").trim();
};
const fixCapitalization = (text: string): string => {
return text
.replace(/(^|[.!?]\s+)([a-z])/g, (_, pre, char) => pre + char.toUpperCase())
.replace(/^\s*([a-z])/, (_, char) => char.toUpperCase());
};
interface Options {
fixDashes?: boolean;
removeArtifacts?: boolean;
}
const transform = (text: string, opts: Options = {}): [string, string[]] => {
const { fixDashes = false, removeArtifacts = true } = opts;
const allChanges: string[] = [];
let changes: string[];
[text, changes] = replacePhrases(text);
allChanges.push(...changes);
let fixed: boolean;
[text, fixed] = fixCurlyQuotes(text);
if (fixed) allChanges.push("Fixed curly quotes → straight quotes");
if (removeArtifacts) {
[text, changes] = removeChatbotArtifacts(text);
allChanges.push(...changes);
}
if (fixDashes) {
let count: number;
[text, count] = reduceEmDashes(text);
if (count) allChanges.push(`Replaced ${count} em dashes with commas`);
}
text = cleanWhitespace(text);
text = fixCapitalization(text);
return [text, allChanges];
};
const main = () => {
const args = process.argv.slice(2);
const quiet = args.includes("-q") || args.includes("--quiet");
const fixDashes = args.includes("--fix-dashes");
const keepArtifacts = args.includes("--keep-artifacts");
const outputIdx = args.findIndex((a) => a === "-o" || a === "--output");
const outputFile = outputIdx !== -1 ? args[outputIdx + 1] : null;
const inputFile = args.find((a) => !a.startsWith("-") && a !== outputFile);
let text: string;
if (inputFile) {
text = readFileSync(inputFile, "utf-8");
} else {
text = readFileSync(0, "utf-8");
}
const [result, changes] = transform(text, { fixDashes, removeArtifacts: !keepArtifacts });
if (!quiet && changes.length) {
console.error("CHANGES MADE:");
changes.forEach((c) => console.error(` • ${c}`));
console.error();
}
if (outputFile) {
writeFileSync(outputFile, result);
if (!quiet) console.error(`Written to ${outputFile}`);
} else {
console.log(result);
}
};
main();
```
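The detail that matters most in `replacePhrases` is how each pattern is built: multi-word phrases and keys ending in a comma match anywhere, while single words get `\b` boundaries so "utilize" never fires inside "utilized" (which is why `patterns.json` lists each inflection separately). An isolated, runnable sketch of that construction:

```typescript
// Isolated sketch of the pattern construction used by transform.ts.
const escapeRegExp = (s: string): string => s.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");

const buildPattern = (key: string): RegExp =>
  key.includes(" ") || key.endsWith(",")
    ? new RegExp(escapeRegExp(key), "gi")             // phrases: match anywhere
    : new RegExp(`\\b${escapeRegExp(key)}\\b`, "gi"); // single words: whole-word only

console.log("We utilize it.".replace(buildPattern("utilize"), "use"));
// → "We use it."
console.log("It was utilized.".replace(buildPattern("utilize"), "use"));
// → "It was utilized." (unchanged: no word boundary before the "d")
```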
### scripts/patterns.json
```json
{
"ai_words": [
"additionally", "crucial", "delve", "emphasizing", "enduring",
"enhance", "fostering", "garner", "furthermore", "intricate",
"intricacies", "landscape", "pivotal", "showcase", "showcasing",
"tapestry", "testament", "underscore", "underscores", "vibrant",
"interplay", "multifaceted", "nuanced", "paradigm", "synergy",
"realm", "underpins", "unraveling", "unveiling", "leveraging"
],
"puffery": [
"groundbreaking", "renowned", "stunning", "breathtaking",
"nestled", "bustling", "vital", "rich cultural heritage",
"natural beauty", "indelible mark", "pivotal moment",
"game-changing", "cutting-edge", "state-of-the-art"
],
"replacements": {
"in order to": "to",
"due to the fact that": "because",
"at this point in time": "now",
"has the ability to": "can",
"it is important to note that": "",
"it should be noted that": "",
"serves as": "is",
"stands as": "is",
"boasts a": "has a",
"boasts an": "has an",
"boasts": "has",
"utilize": "use",
"utilizes": "uses",
"utilizing": "using",
"leverage": "use",
"leverages": "uses",
"leveraging": "using",
"facilitate": "help",
"facilitates": "helps",
"Additionally,": "",
"Furthermore,": "",
"Moreover,": "",
"In conclusion,": "",
"To summarize,": ""
},
"chatbot_artifacts": [
"I hope this helps",
"Let me know if",
"Would you like me to",
"Great question",
"Excellent question",
"You're absolutely right",
"That's a great point",
"Certainly!",
"Of course!",
"Happy to help",
"I'd be happy to",
"Feel free to"
],
"hedging_phrases": [
"it could potentially",
"it might possibly",
"arguably",
"it could be argued that",
"some would say",
"in some ways"
]
}
```
---
## Skill Companion Files
> Additional files collected from the skill directory layout.
### _meta.json
```json
{
"owner": "artur-zhdan",
"slug": "undetectable-ai",
"displayName": "Undetectable AI",
"latest": {
"version": "1.0.0",
"publishedAt": 1769990637002,
"commit": "https://github.com/clawdbot/skills/commit/7c30651a7d8889ddf9e0fe2243d194097c13bbd4"
},
"history": []
}
```