
seo-dataforseo

SEO keyword research using the DataForSEO API. Perform keyword analysis, YouTube keyword research, competitor analysis, SERP analysis, and trend tracking. Use when the user asks to: research keywords, analyze search volume/CPC/competition, find keyword suggestions, check keyword difficulty, analyze competitors, get trending topics, do YouTube SEO research, or optimize landing page keywords. Requires a DataForSEO API account and credentials in .env file.

Packaged view

This page reorganizes the original catalog entry to put fit, installability, and workflow context first. The original raw source appears below.

Stars: 0
Hot score: 74
Updated: March 20, 2026
Overall rating: C (4.0)
Composite score: 4.0
Best-practice grade: B (75.6)

Install command

npx @skill-hub/cli install openclaw-skills-seo-dataforseo

Repository

openclaw/skills

Skill path: skills/adamkristopher/seo-dataforseo



Best for

Primary workflow: Grow & Distribute.

Technical facets: Full Stack, Backend, Tech Writer.

Target audience: everyone.

License: Unknown.

Original source

Catalog source: SkillHub Club.

Repository owner: openclaw.

This is a mirrored public skill entry: review the repository before installing it into production workflows.

What it helps with

  • Install seo-dataforseo into Claude Code, Codex CLI, Gemini CLI, or OpenCode workflows
  • Review https://github.com/openclaw/skills before adding seo-dataforseo to shared team environments
  • Use seo-dataforseo for SEO keyword research and content-planning workflows

Works across

Claude Code, Codex CLI, Gemini CLI, OpenCode

Favorites: 0.

Sub-skills: 0.

Aggregator: No.

Original source / Raw SKILL.md

---
name: seo-dataforseo
description: "SEO keyword research using the DataForSEO API. Perform keyword analysis, YouTube keyword research, competitor analysis, SERP analysis, and trend tracking. Use when the user asks to: research keywords, analyze search volume/CPC/competition, find keyword suggestions, check keyword difficulty, analyze competitors, get trending topics, do YouTube SEO research, or optimize landing page keywords. Requires a DataForSEO API account and credentials in .env file."
---

# SEO Keyword Research (DataForSEO)

## Setup

Install dependencies:

```bash
pip install -r scripts/requirements.txt
```

Configure credentials by creating a `.env` file in the project root:

```
[email protected]
DATAFORSEO_PASSWORD=your_api_password
```

Get credentials from: https://app.dataforseo.com/api-access
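
Under the hood, the scripts load these variables with python-dotenv. As a rough illustration of what that loading step does (a minimal stdlib sketch, not the library itself):

```python
import os

def load_env(path=".env"):
    """Minimal sketch of what python-dotenv's load_dotenv() does:
    read KEY=VALUE lines from the file and export them into os.environ."""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue  # skip blank lines, comments, and malformed lines
            key, _, value = line.partition("=")
            # like python-dotenv's default, do not overwrite existing variables
            os.environ.setdefault(key.strip(), value.strip())
```

In practice you never need this yourself; the skill's scripts read `DATAFORSEO_LOGIN` and `DATAFORSEO_PASSWORD` from the environment once the `.env` file is in place.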

## Quick Start

| User says | Function to call |
|-----------|-----------------|
| "Research keywords for [topic]" | `keyword_research("topic")` |
| "YouTube keyword data for [idea]" | `youtube_keyword_research("idea")` |
| "Analyze competitor [domain.com]" | `competitor_analysis("domain.com")` |
| "What's trending?" | `trending_topics()` |
| "Keyword analysis for [list]" | `full_keyword_analysis(["kw1", "kw2"])` |
| "Landing page keywords for [topic]" | `landing_page_keyword_research(["kw1"], "competitor.com")` |

Execute functions by importing from `scripts/main.py`:

```python
import sys
from pathlib import Path
sys.path.insert(0, str(Path("scripts")))
from main import *

result = keyword_research("AI website builders")
```

## Workflow Pattern

Every research task follows three phases:

### 1. Research
Run API functions. Each function call hits the DataForSEO API and returns structured data.

### 2. Auto-Save
All results automatically save as timestamped JSON files to `results/{category}/`. File naming pattern: `YYYYMMDD_HHMMSS__operation__keyword__extra_info.json`
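
For scripting against saved results, the naming pattern can be built and parsed with stdlib `datetime`. This is a sketch of the documented pattern only; how the toolkit itself sanitizes keywords (e.g. spaces) may differ:

```python
from datetime import datetime

def result_filename(operation, keyword, extra_info="", now=None):
    """Build a filename following the documented pattern:
    YYYYMMDD_HHMMSS__operation__keyword__extra_info.json"""
    now = now or datetime.now()
    stamp = now.strftime("%Y%m%d_%H%M%S")
    parts = [stamp, operation, keyword.replace(" ", "_")]
    if extra_info:
        parts.append(extra_info)
    return "__".join(parts) + ".json"

def parse_result_filename(name):
    """Split a saved filename back into (timestamp, operation, keyword, extra)."""
    stem = name[: -len(".json")]
    stamp, operation, keyword, *rest = stem.split("__")
    ts = datetime.strptime(stamp, "%Y%m%d_%H%M%S")
    return ts, operation, keyword, (rest[0] if rest else "")
```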

### 3. Summarize
After research, read the saved JSON files and create a markdown summary in `results/summary/` with data tables, ranked opportunities, and strategic recommendations.

## High-Level Functions

These are the primary functions in `scripts/main.py`. Each orchestrates multiple API calls for a complete research workflow.

| Function | Purpose | What it gathers |
|----------|---------|----------------|
| `keyword_research(keyword)` | Single keyword deep-dive | Overview, suggestions, related keywords, difficulty |
| `youtube_keyword_research(keyword)` | YouTube content research | Overview, suggestions, YouTube SERP rankings, YouTube trends |
| `landing_page_keyword_research(keywords, competitor_domain)` | Landing page SEO | Overview, intent, difficulty, SERP analysis, competitor keywords |
| `full_keyword_analysis(keywords)` | Strategic content planning | Overview, difficulty, intent, keyword ideas, historical volume, Google Trends |
| `competitor_analysis(domain, keywords)` | Competitor intelligence | Domain keywords, Google Ads keywords, competitor domains |
| `trending_topics(location_name)` | Current trends | Currently trending searches |

### Parameters

All functions accept an optional `location_name` parameter (default: "United States"). Most functions also have boolean flags to skip specific sub-analyses (e.g., `include_suggestions=False`).

### Individual API Functions

For granular control, import specific functions from the API modules. See [references/api-reference.md](references/api-reference.md) for the complete list of 25 API functions with parameters, limits, and examples.

## Results Storage

Results auto-save to `results/` with this structure:

```
results/
├── keywords_data/    # Search volume, CPC, competition
├── labs/             # Suggestions, difficulty, intent
├── serp/             # Google/YouTube rankings
├── trends/           # Google Trends data
└── summary/          # Human-readable markdown summaries
```
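
If you want the layout in place before the first save, a small sketch (assuming the category names above; the toolkit's storage code creates directories itself as it saves, so this is only a convenience for inspecting the layout up front):

```python
from pathlib import Path

RESULT_CATEGORIES = ["keywords_data", "labs", "serp", "trends", "summary"]

def ensure_results_layout(root="results"):
    """Create the documented results/ directory layout if it is missing."""
    root = Path(root)
    for category in RESULT_CATEGORIES:
        (root / category).mkdir(parents=True, exist_ok=True)
    return root
```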

### Managing Results

```python
from core.storage import list_results, load_result, get_latest_result

# List recent results
files = list_results(category="labs", limit=10)

# Load a specific result
data = load_result(files[0])

# Get most recent result for an operation
latest = get_latest_result(category="labs", operation="keyword_suggestions")
```

### Utility Functions

```python
from main import get_recent_results, load_latest

# List recent files across all categories
files = get_recent_results(limit=10)

# Load latest result for a category
data = load_latest("labs", "keyword_suggestions")
```

## Creating Summaries

After running research, create a markdown summary document in `results/summary/`. Include:

- **Data tables** with volumes, CPC, competition, difficulty
- **Ranked lists** of opportunities (sorted by volume or opportunity score)
- **SERP analysis** showing what currently ranks
- **Recommendations** for content strategy, titles, tags

Name the summary file descriptively (e.g., `results/summary/ai-tools-keyword-research.md`).
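
The data-table section of such a summary can be generated directly from loaded results. The dict keys below (`keyword`, `volume`, `cpc`, `difficulty`) are illustrative; map them to whatever fields the saved JSON actually uses:

```python
def summary_table(rows):
    """Render keyword metrics as a markdown table for a results/summary/ doc.
    Rows are sorted by volume descending so the biggest opportunities lead."""
    lines = [
        "| Keyword | Volume | CPC | Difficulty |",
        "|---------|--------|-----|------------|",
    ]
    for row in sorted(rows, key=lambda r: r["volume"], reverse=True):
        lines.append(
            f"| {row['keyword']} | {row['volume']:,} | ${row['cpc']:.2f} | {row['difficulty']} |"
        )
    return "\n".join(lines)
```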

## Tips

1. **Be specific** — "Get keyword suggestions for 'AI website builders'" works better than "research AI stuff"
2. **Request summaries** — Always create a summary document after research, named specifically
3. **Batch related keywords** — Pass multiple related keywords at once for comparison
4. **Specify the goal** — "for a YouTube video" vs "for a landing page" changes which data matters most
5. **Ask for competition analysis** — "Show me what videos are ranking" helps identify content gaps

## Defaults

- **Location**: United States (code 2840)
- **Language**: English
- **API Limits**: 700 keywords for volume/overview, 1000 for difficulty/intent, 5 for trends, 200 for keyword ideas
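
When a keyword list exceeds one of these limits, it has to be split across calls. A minimal batching sketch (worth checking the API modules in `scripts/` before assuming they do not batch for you):

```python
def chunked(keywords, limit):
    """Split a keyword list into batches that respect a per-call limit,
    e.g. 700 for volume/overview calls or 1000 for difficulty/intent."""
    return [keywords[i : i + limit] for i in range(0, len(keywords), limit)]
```

For example, 1,500 keywords against the 700-keyword overview limit would become three calls of 700, 700, and 100.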


---

## Referenced Files

> The following files are referenced in this skill and included for context.

### references/api-reference.md

```markdown
# API Reference

Complete function reference for all DataForSEO API modules.

## Table of Contents

- [Keywords Data API](#keywords-data-api) (4 functions)
- [Labs API](#labs-api) (9 functions)
- [SERP API](#serp-api) (6 functions)
- [Trends API](#trends-api) (6 functions)

---

## Keywords Data API

Import: `from api.keywords_data import ...`

### `get_search_volume(keywords, location_name, language_name, save)`

Get search volume, CPC, and competition data for keywords.

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `keywords` | `List[str]` | required | Keywords to analyze (max 700) |
| `location_name` | `str` | "United States" | Target location |
| `language_name` | `str` | "English" | Target language |
| `save` | `bool` | `True` | Save results to JSON |

**Returns:** Dict with search volume, CPC, and competition for each keyword.

### `get_keywords_for_site(target_domain, location_name, language_name, save)`

Get keywords associated with a specific domain.

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `target_domain` | `str` | required | Domain to analyze (e.g., "example.com") |
| `location_name` | `str` | "United States" | Target location |
| `language_name` | `str` | "English" | Target language |
| `save` | `bool` | `True` | Save results to JSON |

**Returns:** Dict with keywords relevant to the domain.

### `get_ad_traffic_by_keywords(keywords, location_name, language_name, bid, save)`

Estimate advertising traffic potential for keywords at a given CPC bid.

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `keywords` | `List[str]` | required | Keywords to analyze |
| `location_name` | `str` | "United States" | Target location |
| `language_name` | `str` | "English" | Target language |
| `bid` | `float` | `2.0` | Maximum CPC bid for estimation |
| `save` | `bool` | `True` | Save results to JSON |

**Returns:** Dict with traffic estimates for the given bid.

### `get_keywords_for_keywords(keywords, location_name, language_name, save)`

Get keyword expansion ideas from Google Ads Keyword Planner.

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `keywords` | `List[str]` | required | Seed keywords (max 20) |
| `location_name` | `str` | "United States" | Target location |
| `language_name` | `str` | "English" | Target language |
| `save` | `bool` | `True` | Save results to JSON |

**Returns:** Dict with expanded keyword ideas.

---

## Labs API

Import: `from api.labs import ...`

### `get_keyword_overview(keywords, location_name, language_name, include_serp_info, save)`

Comprehensive keyword data: search volume, CPC, competition, and search intent.

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `keywords` | `List[str]` | required | Keywords to analyze (max 700) |
| `location_name` | `str` | "United States" | Target location |
| `language_name` | `str` | "English" | Target language |
| `include_serp_info` | `bool` | `False` | Include SERP features data |
| `save` | `bool` | `True` | Save results to JSON |

**Returns:** Dict with comprehensive keyword metrics.

### `get_keyword_suggestions(keyword, location_name, language_name, include_seed_keyword, include_serp_info, limit, save)`

Get long-tail keyword suggestions based on a seed keyword.

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `keyword` | `str` | required | Seed keyword (min 3 characters) |
| `location_name` | `str` | "United States" | Target location |
| `language_name` | `str` | "English" | Target language |
| `include_seed_keyword` | `bool` | `True` | Include seed keyword metrics |
| `include_serp_info` | `bool` | `False` | Include SERP data per keyword |
| `limit` | `int` | `100` | Max results (max 1000) |
| `save` | `bool` | `True` | Save results to JSON |

**Returns:** Dict with keyword suggestions and metrics.

### `get_keyword_ideas(keywords, location_name, language_name, include_serp_info, closely_variants, limit, save)`

Get keyword ideas from the same category as seed keywords. Goes beyond semantic similarity.

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `keywords` | `List[str]` | required | Seed keywords (max 200) |
| `location_name` | `str` | "United States" | Target location |
| `language_name` | `str` | "English" | Target language |
| `include_serp_info` | `bool` | `False` | Include SERP data |
| `closely_variants` | `bool` | `False` | Phrase-match (True) vs broad-match (False) |
| `limit` | `int` | `700` | Max results (max 1000) |
| `save` | `bool` | `True` | Save results to JSON |

**Returns:** Dict with keyword ideas and metrics.

### `get_related_keywords(keyword, location_name, language_name, depth, include_seed_keyword, include_serp_info, limit, save)`

Get keywords from Google's "searches related to" feature using depth-first search.

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `keyword` | `str` | required | Seed keyword |
| `location_name` | `str` | "United States" | Target location |
| `language_name` | `str` | "English" | Target language |
| `depth` | `int` | `2` | Search depth 0-4 (4 = max ~4680 results) |
| `include_seed_keyword` | `bool` | `True` | Include seed keyword metrics |
| `include_serp_info` | `bool` | `False` | Include SERP data |
| `limit` | `int` | `100` | Max results (max 1000) |
| `save` | `bool` | `True` | Save results to JSON |

**Returns:** Dict with related keywords and metrics.

### `get_bulk_keyword_difficulty(keywords, location_name, language_name, save)`

Get keyword difficulty scores (0-100) indicating how hard it is to rank in the top 10.

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `keywords` | `List[str]` | required | Keywords to analyze (max 1000) |
| `location_name` | `str` | "United States" | Target location |
| `language_name` | `str` | "English" | Target language |
| `save` | `bool` | `True` | Save results to JSON |

**Returns:** Dict with difficulty scores for each keyword.

### `get_historical_search_volume(keywords, location_name, language_name, include_serp_info, save)`

Get monthly search volume data since 2019.

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `keywords` | `List[str]` | required | Keywords to analyze (max 700) |
| `location_name` | `str` | "United States" | Target location |
| `language_name` | `str` | "English" | Target language |
| `include_serp_info` | `bool` | `False` | Include SERP features |
| `save` | `bool` | `True` | Save results to JSON |

**Returns:** Dict with historical search volume and monthly breakdowns.

### `get_search_intent(keywords, location_name, language_name, save)`

Classify keywords as informational, navigational, transactional, or commercial.

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `keywords` | `List[str]` | required | Keywords to classify (max 1000) |
| `location_name` | `str` | "United States" | Target location |
| `language_name` | `str` | "English" | Target language |
| `save` | `bool` | `True` | Save results to JSON |

**Returns:** Dict with intent classifications for each keyword.

### `get_domain_keywords(target_domain, location_name, language_name, limit, save)`

Get keywords that a domain ranks for in organic search.

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `target_domain` | `str` | required | Domain to analyze (e.g., "example.com") |
| `location_name` | `str` | "United States" | Target location |
| `language_name` | `str` | "English" | Target language |
| `limit` | `int` | `100` | Max results |
| `save` | `bool` | `True` | Save results to JSON |

**Returns:** Dict with keywords the domain ranks for.

### `get_competitors(keywords, location_name, language_name, limit, save)`

Find domains that compete for the same keywords.

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `keywords` | `List[str]` | required | Keywords to find competitors for |
| `location_name` | `str` | "United States" | Target location |
| `language_name` | `str` | "English" | Target language |
| `limit` | `int` | `20` | Max competitors to return |
| `save` | `bool` | `True` | Save results to JSON |

**Returns:** Dict with competitor domains and their metrics.

---

## SERP API

Import: `from api.serp import ...`

### `get_google_serp(keyword, location_name, language_name, depth, device, save)`

Get Google organic search results for a keyword.

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `keyword` | `str` | required | Search query |
| `location_name` | `str` | "United States" | Target location |
| `language_name` | `str` | "English" | Target language |
| `depth` | `int` | `100` | Number of results (max 700) |
| `device` | `str` | `"desktop"` | `"desktop"` or `"mobile"` |
| `save` | `bool` | `True` | Save results to JSON |

**Returns:** Dict with SERP data including rankings, URLs, titles, and SERP features.

### `get_youtube_serp(keyword, location_name, language_name, depth, device, save)`

Get YouTube organic search results for a keyword.

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `keyword` | `str` | required | Search query (max 700 chars) |
| `location_name` | `str` | "United States" | Target location |
| `language_name` | `str` | "English" | Target language |
| `depth` | `int` | `20` | Number of results (max 700, billed per 20) |
| `device` | `str` | `"desktop"` | `"desktop"` or `"mobile"` |
| `save` | `bool` | `True` | Save results to JSON |

**Returns:** Dict with YouTube video rankings, titles, channels, views.

### `get_google_maps_serp(keyword, location_name, language_name, depth, save)`

Get Google Maps/Local search results.

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `keyword` | `str` | required | Search query (e.g., "restaurants near me") |
| `location_name` | `str` | "United States" | Target location |
| `language_name` | `str` | "English" | Target language |
| `depth` | `int` | `20` | Number of results |
| `save` | `bool` | `True` | Save results to JSON |

**Returns:** Dict with local business listings.

### `get_google_news_serp(keyword, location_name, language_name, depth, save)`

Get Google News search results.

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `keyword` | `str` | required | Search query |
| `location_name` | `str` | "United States" | Target location |
| `language_name` | `str` | "English" | Target language |
| `depth` | `int` | `100` | Number of results |
| `save` | `bool` | `True` | Save results to JSON |

**Returns:** Dict with news articles and rankings.

### `get_google_images_serp(keyword, location_name, language_name, depth, save)`

Get Google Images search results.

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `keyword` | `str` | required | Search query |
| `location_name` | `str` | "United States" | Target location |
| `language_name` | `str` | "English" | Target language |
| `depth` | `int` | `100` | Number of results |
| `save` | `bool` | `True` | Save results to JSON |

**Returns:** Dict with image results including URLs, titles, sources.

### `get_featured_snippet(keyword, location_name, language_name, save)`

Get Google SERP focused on featured snippets and SERP features. Returns top 10 results on desktop.

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `keyword` | `str` | required | Search query (ideally a question) |
| `location_name` | `str` | "United States" | Target location |
| `language_name` | `str` | "English" | Target language |
| `save` | `bool` | `True` | Save results to JSON |

**Returns:** Dict with SERP data including featured snippet details.

---

## Trends API

Import: `from api.trends import ...`

### `get_trends_explore(keywords, location_name, search_type, time_range, date_from, date_to, category_code, save)`

Get Google Trends data for keywords.

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `keywords` | `List[str]` | required | Keywords to compare (max 5) |
| `location_name` | `str` | "United States" | Target location |
| `search_type` | `str` | `"web"` | `"web"`, `"news"`, `"youtube"`, `"images"`, `"froogle"` (shopping) |
| `time_range` | `str` | `"past_12_months"` | `"past_hour"`, `"past_4_hours"`, `"past_day"`, `"past_7_days"`, `"past_month"`, `"past_3_months"`, `"past_12_months"`, `"past_5_years"` |
| `date_from` | `str` | `None` | Custom start date (yyyy-mm-dd), overrides time_range |
| `date_to` | `str` | `None` | Custom end date (yyyy-mm-dd) |
| `category_code` | `int` | `None` | Google Trends category filter |
| `save` | `bool` | `True` | Save results to JSON |

**Returns:** Dict with trend graphs, regional interest, related topics and queries.

### `get_youtube_trends(keywords, location_name, time_range, save)`

YouTube-specific trend data. Convenience wrapper for `get_trends_explore` with `search_type="youtube"`.

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `keywords` | `List[str]` | required | Keywords to compare (max 5) |
| `location_name` | `str` | "United States" | Target location |
| `time_range` | `str` | `"past_12_months"` | Time range |
| `save` | `bool` | `True` | Save results to JSON |

**Returns:** Dict with YouTube trend data.

### `get_news_trends(keywords, location_name, time_range, save)`

Google News trend data. Convenience wrapper for `get_trends_explore` with `search_type="news"`.

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `keywords` | `List[str]` | required | Keywords to compare (max 5) |
| `location_name` | `str` | "United States" | Target location |
| `time_range` | `str` | `"past_12_months"` | Time range |
| `save` | `bool` | `True` | Save results to JSON |

**Returns:** Dict with news trend data.

### `get_shopping_trends(keywords, location_name, time_range, save)`

Google Shopping trend data. Convenience wrapper for `get_trends_explore` with `search_type="froogle"`.

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `keywords` | `List[str]` | required | Keywords to compare (max 5) |
| `location_name` | `str` | "United States" | Target location |
| `time_range` | `str` | `"past_12_months"` | Time range |
| `save` | `bool` | `True` | Save results to JSON |

**Returns:** Dict with shopping/e-commerce trend data.

### `compare_keyword_trends(keywords, location_name, search_types, time_range, save)`

Compare keyword trends across multiple search platforms.

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `keywords` | `List[str]` | required | Keywords to compare (max 5) |
| `location_name` | `str` | "United States" | Target location |
| `search_types` | `List[str]` | `["web", "youtube"]` | Platforms to compare |
| `time_range` | `str` | `"past_12_months"` | Time range |
| `save` | `bool` | `True` | Save individual results |

**Returns:** Dict with search_type keys and trend data values.

### `get_trending_now(location_name, save)`

Get currently trending searches in real-time.

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `location_name` | `str` | "United States" | Target location |
| `save` | `bool` | `True` | Save results to JSON |

**Returns:** Dict with currently trending searches.

```

### scripts/requirements.txt

```text
dataforseo-client>=1.0.34
python-dotenv>=1.0.0

```

### scripts/main.py

```python
"""
DataForSEO API Toolkit - Main Entry Point

Simple interface for keyword research across YouTube, landing pages, and site pages.
All results are automatically saved to the /results directory with timestamps.

Usage:
    from main import *

    # Quick keyword research
    result = keyword_research("python tutorial")

    # YouTube-specific research
    result = youtube_keyword_research("video editing tips")

    # Full analysis for content planning
    result = full_keyword_analysis(["seo tools", "keyword research"])
"""
import sys
from pathlib import Path
from typing import List, Dict, Any, Optional

# Add current directory to path for imports
sys.path.insert(0, str(Path(__file__).parent))

# Import all API modules
from api.keywords_data import (
    get_search_volume,
    get_keywords_for_site,
    get_ad_traffic_by_keywords,
    get_keywords_for_keywords
)
from api.labs import (
    get_keyword_overview,
    get_keyword_suggestions,
    get_keyword_ideas,
    get_related_keywords,
    get_bulk_keyword_difficulty,
    get_historical_search_volume,
    get_search_intent,
    get_domain_keywords,
    get_competitors
)
from api.serp import (
    get_google_serp,
    get_youtube_serp,
    get_google_maps_serp,
    get_google_news_serp,
    get_google_images_serp,
    get_featured_snippet
)
from api.trends import (
    get_trends_explore,
    get_youtube_trends,
    get_news_trends,
    get_shopping_trends,
    compare_keyword_trends,
    get_trending_now
)
from core.storage import list_results, load_result, get_latest_result


# ============================================================================
# HIGH-LEVEL CONVENIENCE FUNCTIONS
# ============================================================================

def keyword_research(
    keyword: str,
    location_name: str = None,
    include_suggestions: bool = True,
    include_related: bool = True,
    include_difficulty: bool = True
) -> Dict[str, Any]:
    """
    Comprehensive keyword research for a single keyword.

    Performs multiple API calls to gather:
    - Keyword overview (search volume, CPC, competition, search intent)
    - Keyword suggestions (optional)
    - Related keywords (optional)
    - Keyword difficulty (optional)

    Args:
        keyword: The seed keyword to research
        location_name: Target location (default: United States)
        include_suggestions: Include keyword suggestions
        include_related: Include related keywords
        include_difficulty: Include difficulty score

    Returns:
        Dict with keys: overview, suggestions, related, difficulty

    Example:
        >>> result = keyword_research("python programming")
    """
    print(f"\n🔍 Researching keyword: {keyword}")
    results = {}

    # Always get overview
    print("  → Getting keyword overview...")
    results["overview"] = get_keyword_overview(
        keywords=[keyword],
        location_name=location_name
    )

    if include_suggestions:
        print("  → Getting keyword suggestions...")
        results["suggestions"] = get_keyword_suggestions(
            keyword=keyword,
            location_name=location_name,
            limit=50
        )

    if include_related:
        print("  → Getting related keywords...")
        results["related"] = get_related_keywords(
            keyword=keyword,
            location_name=location_name,
            depth=2,
            limit=50
        )

    if include_difficulty:
        print("  → Getting keyword difficulty...")
        results["difficulty"] = get_bulk_keyword_difficulty(
            keywords=[keyword],
            location_name=location_name
        )

    print(f"✅ Research complete for: {keyword}\n")
    return results


def youtube_keyword_research(
    keyword: str,
    location_name: str = None,
    include_serp: bool = True,
    include_trends: bool = True
) -> Dict[str, Any]:
    """
    YouTube-focused keyword research.

    Gathers data specifically useful for YouTube content:
    - Keyword overview with search intent
    - YouTube SERP results (current rankings)
    - YouTube trend data
    - Keyword suggestions

    Args:
        keyword: The keyword to research for YouTube
        location_name: Target location
        include_serp: Include current YouTube rankings
        include_trends: Include YouTube trend data

    Returns:
        Dict with keys: overview, serp, trends, suggestions

    Example:
        >>> result = youtube_keyword_research("video editing tutorial")
    """
    print(f"\n🎬 YouTube keyword research: {keyword}")
    results = {}

    # Keyword overview
    print("  → Getting keyword overview...")
    results["overview"] = get_keyword_overview(
        keywords=[keyword],
        location_name=location_name,
        include_serp_info=True
    )

    # Keyword suggestions
    print("  → Getting keyword suggestions...")
    results["suggestions"] = get_keyword_suggestions(
        keyword=keyword,
        location_name=location_name,
        limit=50
    )

    if include_serp:
        print("  → Getting YouTube rankings...")
        results["youtube_serp"] = get_youtube_serp(
            keyword=keyword,
            location_name=location_name,
            depth=20
        )

    if include_trends:
        print("  → Getting YouTube trends...")
        results["youtube_trends"] = get_youtube_trends(
            keywords=[keyword],
            location_name=location_name
        )

    print(f"✅ YouTube research complete for: {keyword}\n")
    return results


def landing_page_keyword_research(
    keywords: List[str],
    competitor_domain: str = None,
    location_name: str = None
) -> Dict[str, Any]:
    """
    Keyword research for landing page optimization.

    Gathers data useful for landing page SEO:
    - Keyword overview for target keywords
    - Search intent classification
    - Keyword difficulty
    - Google SERP analysis
    - Competitor keywords (if domain provided)

    Args:
        keywords: Target keywords for the landing page
        competitor_domain: Optional competitor domain to analyze
        location_name: Target location

    Returns:
        Dict with comprehensive landing page keyword data

    Example:
        >>> result = landing_page_keyword_research(
        ...     ["best crm software", "crm for small business"],
        ...     competitor_domain="hubspot.com"
        ... )
    """
    print(f"\n📄 Landing page keyword research: {keywords}")
    results = {}

    # Keyword overview
    print("  → Getting keyword overview...")
    results["overview"] = get_keyword_overview(
        keywords=keywords,
        location_name=location_name,
        include_serp_info=True
    )

    # Search intent
    print("  → Getting search intent...")
    results["search_intent"] = get_search_intent(
        keywords=keywords,
        location_name=location_name
    )

    # Difficulty scores
    print("  → Getting keyword difficulty...")
    results["difficulty"] = get_bulk_keyword_difficulty(
        keywords=keywords,
        location_name=location_name
    )

    # SERP analysis for primary keyword
    print("  → Getting SERP analysis...")
    results["serp"] = get_google_serp(
        keyword=keywords[0],
        location_name=location_name
    )

    # Competitor analysis
    if competitor_domain:
        print(f"  → Analyzing competitor: {competitor_domain}...")
        results["competitor_keywords"] = get_keywords_for_site(
            target_domain=competitor_domain,
            location_name=location_name
        )

    print(f"✅ Landing page research complete\n")
    return results


def full_keyword_analysis(
    keywords: List[str],
    location_name: str = None,
    include_historical: bool = True,
    include_trends: bool = True
) -> Dict[str, Any]:
    """
    Full keyword analysis for content strategy.

    Comprehensive analysis including:
    - Keyword overview
    - Historical search volume trends
    - Keyword difficulty
    - Search intent
    - Keyword ideas (expansion)
    - Google Trends data

    Args:
        keywords: Keywords to analyze
        location_name: Target location
        include_historical: Include historical search volume
        include_trends: Include Google Trends data

    Returns:
        Dict with comprehensive keyword analysis

    Example:
        >>> result = full_keyword_analysis(["ai writing tools", "chatgpt alternatives"])
    """
    print(f"\n📊 Full keyword analysis: {keywords}")
    results = {}

    print("  → Getting keyword overview...")
    results["overview"] = get_keyword_overview(
        keywords=keywords,
        location_name=location_name,
        include_serp_info=True
    )

    print("  → Getting keyword difficulty...")
    results["difficulty"] = get_bulk_keyword_difficulty(
        keywords=keywords,
        location_name=location_name
    )

    print("  → Getting search intent...")
    results["search_intent"] = get_search_intent(
        keywords=keywords,
        location_name=location_name
    )

    print("  → Getting keyword ideas...")
    results["keyword_ideas"] = get_keyword_ideas(
        keywords=keywords,
        location_name=location_name,
        limit=100
    )

    if include_historical:
        print("  → Getting historical search volume...")
        results["historical"] = get_historical_search_volume(
            keywords=keywords,
            location_name=location_name
        )

    if include_trends:
        print("  → Getting Google Trends data...")
        results["trends"] = get_trends_explore(
            keywords=keywords[:5],
            location_name=location_name
        )

    print("✅ Full analysis complete\n")
    return results


def competitor_analysis(
    domain: str,
    keywords: List[str] = None,
    location_name: str = None
) -> Dict[str, Any]:
    """
    Analyze a competitor's keyword strategy.

    Args:
        domain: Competitor domain to analyze
        keywords: Optional keywords to find competitors for
        location_name: Target location

    Returns:
        Dict with competitor analysis data

    Example:
        >>> result = competitor_analysis("competitor.com")
    """
    print(f"\n🎯 Competitor analysis: {domain}")
    results = {}

    print("  → Getting domain keywords...")
    results["domain_keywords"] = get_domain_keywords(
        target_domain=domain,
        location_name=location_name,
        limit=100
    )

    print("  → Getting keywords from Google Ads data...")
    results["ads_keywords"] = get_keywords_for_site(
        target_domain=domain,
        location_name=location_name
    )

    if keywords:
        print("  → Finding other competitors...")
        results["other_competitors"] = get_competitors(
            keywords=keywords,
            location_name=location_name
        )

    print("✅ Competitor analysis complete\n")
    return results


def trending_topics(
    location_name: str = None
) -> Dict[str, Any]:
    """
    Get currently trending topics and searches.

    Args:
        location_name: Target location

    Returns:
        Dict with trending data

    Example:
        >>> result = trending_topics()
    """
    print("\n📈 Getting trending topics...")
    result = get_trending_now(location_name=location_name)
    print("✅ Trending topics retrieved\n")
    return result


# ============================================================================
# UTILITY FUNCTIONS
# ============================================================================

def get_recent_results(category: str = None, limit: int = 10) -> List[Path]:
    """
    Get recently saved results.

    Args:
        category: Filter by category (keywords_data, labs, serp, trends)
        limit: Maximum results to return

    Returns:
        List of result file paths
    """
    return list_results(category=category, limit=limit)


def load_latest(category: str, operation: str = None) -> Optional[Dict]:
    """
    Load the most recent result for a category/operation.

    Args:
        category: Result category
        operation: Specific operation (optional)

    Returns:
        The loaded result data or None
    """
    return get_latest_result(category=category, operation=operation)


# ============================================================================
# QUICK ACCESS - Direct API function exports
# ============================================================================

# For direct access to individual API functions, import from respective modules:
# from api.keywords_data import get_search_volume, get_keywords_for_site
# from api.labs import get_keyword_suggestions, get_bulk_keyword_difficulty
# from api.serp import get_google_serp, get_youtube_serp
# from api.trends import get_trends_explore, get_youtube_trends


if __name__ == "__main__":
    print("""
DataForSEO API Toolkit
======================

High-level functions:
  - keyword_research(keyword)
  - youtube_keyword_research(keyword)
  - landing_page_keyword_research(keywords, competitor_domain)
  - full_keyword_analysis(keywords)
  - competitor_analysis(domain)
  - trending_topics()

Usage:
  from main import *
  result = keyword_research("your keyword here")

All results are automatically saved to the /results directory.
""")

```
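Every wrapper in this toolkit normalizes the SDK response with the same guard: call `to_dict()` when the SDK returned a response object, otherwise pass the value through unchanged. A minimal sketch of that pattern, using a stand-in class in place of a real `dataforseo_client` response:

```python
class DummyResponse:
    """Stand-in for a dataforseo_client response object (not part of the SDK)."""

    def to_dict(self):
        return {"status_code": 20000, "tasks": []}


def normalize(response):
    # Same guard as in the wrappers: prefer the SDK's to_dict(),
    # fall back to the raw value (e.g. an already-parsed dict).
    return response.to_dict() if hasattr(response, "to_dict") else response


print(normalize(DummyResponse()))        # SDK-style object becomes a plain dict
print(normalize({"status_code": 20000}))  # a plain dict passes through unchanged
```

This keeps every wrapper agnostic to whether the client library hands back a typed response or raw JSON.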



---

## Skill Companion Files

> Additional files collected from the skill directory layout.

### _meta.json

```json
{
  "owner": "adamkristopher",
  "slug": "seo-dataforseo",
  "displayName": "SEO DataForSEO",
  "latest": {
    "version": "1.0.0",
    "publishedAt": 1769463341114,
    "commit": "https://github.com/clawdbot/skills/commit/2b680fdfcaba49ba10999bec01deb8cdee421123"
  },
  "history": []
}

```

### scripts/api/__init__.py

```python
"""API modules for DataForSEO endpoints."""

```

### scripts/api/keywords_data.py

```python
"""Keywords Data API - Search volume, CPC, and keyword data from Google Ads."""
import sys
from pathlib import Path
from typing import List, Dict, Any, Optional

# Add parent directory to path for imports
sys.path.insert(0, str(Path(__file__).parent.parent))

from dataforseo_client.rest import ApiException

from core.client import get_client
from core.storage import save_result
from config.settings import settings


def get_search_volume(
    keywords: List[str],
    location_name: str = None,
    language_name: str = None,
    save: bool = True
) -> Dict[str, Any]:
    """
    Get search volume, CPC, and competition data for keywords.

    Args:
        keywords: List of keywords to analyze (max 700)
        location_name: Target location (default: United States)
        language_name: Target language (default: English)
        save: Whether to save results to JSON file

    Returns:
        Dict containing search volume data for each keyword

    Example:
        >>> result = get_search_volume(["python tutorial", "learn python"])
    """
    client = get_client()
    location = location_name or settings.DEFAULT_LOCATION_NAME
    language = language_name or settings.DEFAULT_LANGUAGE_NAME

    try:
        response = client.keywords_data.google_ads_search_volume_live([{
            "keywords": keywords[:700],
            "location_name": location,
            "language_name": language
        }])

        result = response.to_dict() if hasattr(response, 'to_dict') else response

        if save:
            keyword_preview = keywords[0] if keywords else "bulk"
            save_result(
                result,
                category="keywords_data",
                operation="search_volume",
                keyword=keyword_preview,
                extra_info=f"{len(keywords)}_keywords"
            )

        return result

    except ApiException as e:
        print(f"API Exception: {e}")
        raise


def get_keywords_for_site(
    target_domain: str,
    location_name: str = None,
    language_name: str = None,
    save: bool = True
) -> Dict[str, Any]:
    """
    Get keywords associated with a specific domain.

    Args:
        target_domain: Domain to analyze (e.g., "example.com")
        location_name: Target location
        language_name: Target language
        save: Whether to save results

    Returns:
        Dict containing keywords relevant to the domain

    Example:
        >>> result = get_keywords_for_site("competitor.com")
    """
    client = get_client()
    location = location_name or settings.DEFAULT_LOCATION_NAME
    language = language_name or settings.DEFAULT_LANGUAGE_NAME

    try:
        response = client.keywords_data.google_ads_keywords_for_site_live([{
            "target": target_domain,
            "location_name": location,
            "language_name": language
        }])

        result = response.to_dict() if hasattr(response, 'to_dict') else response

        if save:
            save_result(
                result,
                category="keywords_data",
                operation="keywords_for_site",
                keyword=target_domain
            )

        return result

    except ApiException as e:
        print(f"API Exception: {e}")
        raise


def get_ad_traffic_by_keywords(
    keywords: List[str],
    location_name: str = None,
    language_name: str = None,
    bid: float = 2.0,
    save: bool = True
) -> Dict[str, Any]:
    """
    Estimate advertising traffic potential for keywords.

    Args:
        keywords: List of keywords to analyze
        location_name: Target location
        language_name: Target language
        bid: Maximum CPC bid for estimation
        save: Whether to save results

    Returns:
        Dict containing traffic estimates

    Example:
        >>> result = get_ad_traffic_by_keywords(["buy shoes online", "best running shoes"])
    """
    client = get_client()
    location = location_name or settings.DEFAULT_LOCATION_NAME
    language = language_name or settings.DEFAULT_LANGUAGE_NAME

    try:
        response = client.keywords_data.google_ads_ad_traffic_by_keywords_live([{
            "keywords": keywords,
            "location_name": location,
            "language_name": language,
            "bid": bid
        }])

        result = response.to_dict() if hasattr(response, 'to_dict') else response

        if save:
            save_result(
                result,
                category="keywords_data",
                operation="ad_traffic",
                keyword=keywords[0] if keywords else "bulk"
            )

        return result

    except ApiException as e:
        print(f"API Exception: {e}")
        raise


def get_keywords_for_keywords(
    keywords: List[str],
    location_name: str = None,
    language_name: str = None,
    save: bool = True
) -> Dict[str, Any]:
    """
    Get keyword expansion ideas from Google Ads Keyword Planner.

    Args:
        keywords: Seed keywords to expand (max 20)
        location_name: Target location
        language_name: Target language
        save: Whether to save results

    Returns:
        Dict containing expanded keyword ideas

    Example:
        >>> result = get_keywords_for_keywords(["video editing", "video software"])
    """
    client = get_client()
    location = location_name or settings.DEFAULT_LOCATION_NAME
    language = language_name or settings.DEFAULT_LANGUAGE_NAME

    try:
        response = client.keywords_data.google_ads_keywords_for_keywords_live([{
            "keywords": keywords[:20],
            "location_name": location,
            "language_name": language
        }])

        result = response.to_dict() if hasattr(response, 'to_dict') else response

        if save:
            save_result(
                result,
                category="keywords_data",
                operation="keywords_for_keywords",
                keyword=keywords[0] if keywords else "bulk",
                extra_info=f"{len(keywords)}_seeds"
            )

        return result

    except ApiException as e:
        print(f"API Exception: {e}")
        raise

```
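Note that these wrappers silently truncate oversized input (`keywords[:700]`, `keywords[:20]`) rather than erroring. A caller with more than 700 keywords must batch the list itself. A minimal sketch of such a helper — `chunk_keywords` is hypothetical and not part of the toolkit:

```python
def chunk_keywords(keywords, batch_size=700):
    """Split a keyword list into batches no larger than batch_size.

    Hypothetical caller-side helper: the get_search_volume wrapper slices
    its input to the first 700 keywords, so anything beyond that would be
    dropped unless the caller batches requests like this.
    """
    return [keywords[i:i + batch_size] for i in range(0, len(keywords), batch_size)]


batches = chunk_keywords([f"kw-{i}" for i in range(1500)])
print([len(b) for b in batches])  # [700, 700, 100]
```

Each batch can then be passed to `get_search_volume` in its own call (and its own billed request).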

### scripts/api/labs.py

```python
"""DataForSEO Labs API - Keyword research, suggestions, difficulty, and competitive analysis."""
import sys
from pathlib import Path
from typing import List, Dict, Any, Optional

# Add parent directory to path for imports
sys.path.insert(0, str(Path(__file__).parent.parent))

from dataforseo_client.rest import ApiException

from core.client import get_client
from core.storage import save_result
from config.settings import settings


def get_keyword_overview(
    keywords: List[str],
    location_name: str = None,
    language_name: str = None,
    include_serp_info: bool = False,
    save: bool = True
) -> Dict[str, Any]:
    """
    Get comprehensive keyword data including search volume, CPC, competition, and search intent.

    Args:
        keywords: List of keywords (max 700)
        location_name: Target location
        language_name: Target language
        include_serp_info: Include SERP features data
        save: Whether to save results

    Returns:
        Dict containing comprehensive keyword metrics

    Example:
        >>> result = get_keyword_overview(["best python courses", "python for beginners"])
    """
    client = get_client()
    location = location_name or settings.DEFAULT_LOCATION_NAME
    language = language_name or settings.DEFAULT_LANGUAGE_NAME

    try:
        response = client.labs.google_keyword_overview_live([{
            "keywords": keywords[:700],
            "location_name": location,
            "language_name": language,
            "include_serp_info": include_serp_info
        }])

        result = response.to_dict() if hasattr(response, 'to_dict') else response

        if save:
            save_result(
                result,
                category="labs",
                operation="keyword_overview",
                keyword=keywords[0] if keywords else "bulk",
                extra_info=f"{len(keywords)}_keywords"
            )

        return result

    except ApiException as e:
        print(f"API Exception: {e}")
        raise


def get_keyword_suggestions(
    keyword: str,
    location_name: str = None,
    language_name: str = None,
    include_seed_keyword: bool = True,
    include_serp_info: bool = False,
    limit: int = 100,
    save: bool = True
) -> Dict[str, Any]:
    """
    Get keyword suggestions based on a seed keyword.

    Suggestions match the seed with additional words before, after, or within the phrase.

    Args:
        keyword: Seed keyword (min 3 characters)
        location_name: Target location
        language_name: Target language
        include_seed_keyword: Include metrics for the seed keyword
        include_serp_info: Include SERP data for each keyword
        limit: Maximum results (max 1000)
        save: Whether to save results

    Returns:
        Dict containing keyword suggestions with metrics

    Example:
        >>> result = get_keyword_suggestions("python tutorial")
    """
    client = get_client()
    location = location_name or settings.DEFAULT_LOCATION_NAME
    language = language_name or settings.DEFAULT_LANGUAGE_NAME

    try:
        response = client.labs.google_keyword_suggestions_live([{
            "keyword": keyword,
            "location_name": location,
            "language_name": language,
            "include_seed_keyword": include_seed_keyword,
            "include_serp_info": include_serp_info,
            "limit": min(limit, 1000)
        }])

        result = response.to_dict() if hasattr(response, 'to_dict') else response

        if save:
            save_result(
                result,
                category="labs",
                operation="keyword_suggestions",
                keyword=keyword,
                extra_info=f"limit_{limit}"
            )

        return result

    except ApiException as e:
        print(f"API Exception: {e}")
        raise


def get_keyword_ideas(
    keywords: List[str],
    location_name: str = None,
    language_name: str = None,
    include_serp_info: bool = False,
    closely_variants: bool = False,
    limit: int = 700,
    save: bool = True
) -> Dict[str, Any]:
    """
    Get keyword ideas that fall into the same category as seed keywords.

    Goes beyond semantic similarity to suggest relevant keywords by mapping
    seed terms against category taxonomies.

    Args:
        keywords: Seed keywords (max 200)
        location_name: Target location
        language_name: Target language
        include_serp_info: Include SERP data
        closely_variants: Use phrase-match (True) vs broad-match (False)
        limit: Maximum results (max 1000)
        save: Whether to save results

    Returns:
        Dict containing keyword ideas with metrics

    Example:
        >>> result = get_keyword_ideas(["youtube marketing", "video seo"])
    """
    client = get_client()
    location = location_name or settings.DEFAULT_LOCATION_NAME
    language = language_name or settings.DEFAULT_LANGUAGE_NAME

    try:
        response = client.labs.google_keyword_ideas_live([{
            "keywords": keywords[:200],
            "location_name": location,
            "language_name": language,
            "include_serp_info": include_serp_info,
            "closely_variants": closely_variants,
            "limit": min(limit, 1000)
        }])

        result = response.to_dict() if hasattr(response, 'to_dict') else response

        if save:
            save_result(
                result,
                category="labs",
                operation="keyword_ideas",
                keyword=keywords[0] if keywords else "bulk",
                extra_info=f"{len(keywords)}_seeds"
            )

        return result

    except ApiException as e:
        print(f"API Exception: {e}")
        raise


def get_related_keywords(
    keyword: str,
    location_name: str = None,
    language_name: str = None,
    depth: int = 2,
    include_seed_keyword: bool = True,
    include_serp_info: bool = False,
    limit: int = 100,
    save: bool = True
) -> Dict[str, Any]:
    """
    Get related keywords from Google's "searches related to" feature.

    Uses depth-first search algorithm on SERP "related searches" element.

    Args:
        keyword: Seed keyword
        location_name: Target location
        language_name: Target language
        depth: Search depth 0-4 (0=seed only, 4=max ~4680 results)
        include_seed_keyword: Include seed keyword metrics
        include_serp_info: Include SERP data
        limit: Maximum results (max 1000)
        save: Whether to save results

    Returns:
        Dict containing related keywords with metrics

    Example:
        >>> result = get_related_keywords("video editing software", depth=2)
    """
    client = get_client()
    location = location_name or settings.DEFAULT_LOCATION_NAME
    language = language_name or settings.DEFAULT_LANGUAGE_NAME

    try:
        response = client.labs.google_related_keywords_live([{
            "keyword": keyword,
            "location_name": location,
            "language_name": language,
            "depth": min(depth, 4),
            "include_seed_keyword": include_seed_keyword,
            "include_serp_info": include_serp_info,
            "limit": min(limit, 1000)
        }])

        result = response.to_dict() if hasattr(response, 'to_dict') else response

        if save:
            save_result(
                result,
                category="labs",
                operation="related_keywords",
                keyword=keyword,
                extra_info=f"depth_{depth}"
            )

        return result

    except ApiException as e:
        print(f"API Exception: {e}")
        raise


def get_bulk_keyword_difficulty(
    keywords: List[str],
    location_name: str = None,
    language_name: str = None,
    save: bool = True
) -> Dict[str, Any]:
    """
    Get keyword difficulty scores for multiple keywords.

    Difficulty score (0-100) indicates how hard it is to rank in top-10 organic results.

    Args:
        keywords: List of keywords (max 1000)
        location_name: Target location
        language_name: Target language
        save: Whether to save results

    Returns:
        Dict containing keyword difficulty scores

    Example:
        >>> result = get_bulk_keyword_difficulty(["seo tools", "keyword research"])
    """
    client = get_client()
    location = location_name or settings.DEFAULT_LOCATION_NAME
    language = language_name or settings.DEFAULT_LANGUAGE_NAME

    try:
        response = client.labs.google_bulk_keyword_difficulty_live([{
            "keywords": keywords[:1000],
            "location_name": location,
            "language_name": language
        }])

        result = response.to_dict() if hasattr(response, 'to_dict') else response

        if save:
            save_result(
                result,
                category="labs",
                operation="keyword_difficulty",
                keyword=keywords[0] if keywords else "bulk",
                extra_info=f"{len(keywords)}_keywords"
            )

        return result

    except ApiException as e:
        print(f"API Exception: {e}")
        raise


def get_historical_search_volume(
    keywords: List[str],
    location_name: str = None,
    language_name: str = None,
    include_serp_info: bool = False,
    save: bool = True
) -> Dict[str, Any]:
    """
    Get historical search volume and trend data for keywords.

    Returns monthly search volume data since 2019.

    Args:
        keywords: List of keywords (max 700)
        location_name: Target location
        language_name: Target language
        include_serp_info: Include SERP features
        save: Whether to save results

    Returns:
        Dict containing historical search volume with monthly breakdowns

    Example:
        >>> result = get_historical_search_volume(["ai tools", "chatgpt"])
    """
    client = get_client()
    location = location_name or settings.DEFAULT_LOCATION_NAME
    language = language_name or settings.DEFAULT_LANGUAGE_NAME

    try:
        response = client.labs.google_historical_search_volume_live([{
            "keywords": keywords[:700],
            "location_name": location,
            "language_name": language,
            "include_serp_info": include_serp_info
        }])

        result = response.to_dict() if hasattr(response, 'to_dict') else response

        if save:
            save_result(
                result,
                category="labs",
                operation="historical_search_volume",
                keyword=keywords[0] if keywords else "bulk",
                extra_info=f"{len(keywords)}_keywords"
            )

        return result

    except ApiException as e:
        print(f"API Exception: {e}")
        raise


def get_search_intent(
    keywords: List[str],
    location_name: str = None,
    language_name: str = None,
    save: bool = True
) -> Dict[str, Any]:
    """
    Get search intent classification for keywords.

    Classifies keywords as informational, navigational, transactional, or commercial.

    Args:
        keywords: List of keywords (max 1000)
        location_name: Target location
        language_name: Target language
        save: Whether to save results

    Returns:
        Dict containing search intent classifications

    Example:
        >>> result = get_search_intent(["buy python course", "what is python"])
    """
    client = get_client()
    location = location_name or settings.DEFAULT_LOCATION_NAME
    language = language_name or settings.DEFAULT_LANGUAGE_NAME

    try:
        response = client.labs.google_search_intent_live([{
            "keywords": keywords[:1000],
            "location_name": location,
            "language_name": language
        }])

        result = response.to_dict() if hasattr(response, 'to_dict') else response

        if save:
            save_result(
                result,
                category="labs",
                operation="search_intent",
                keyword=keywords[0] if keywords else "bulk",
                extra_info=f"{len(keywords)}_keywords"
            )

        return result

    except ApiException as e:
        print(f"API Exception: {e}")
        raise


def get_domain_keywords(
    target_domain: str,
    location_name: str = None,
    language_name: str = None,
    limit: int = 100,
    save: bool = True
) -> Dict[str, Any]:
    """
    Get keywords that a domain ranks for in organic search.

    Args:
        target_domain: Domain to analyze (e.g., "example.com")
        location_name: Target location
        language_name: Target language
        limit: Maximum results
        save: Whether to save results

    Returns:
        Dict containing keywords the domain ranks for

    Example:
        >>> result = get_domain_keywords("competitor.com")
    """
    client = get_client()
    location = location_name or settings.DEFAULT_LOCATION_NAME
    language = language_name or settings.DEFAULT_LANGUAGE_NAME

    try:
        response = client.labs.google_ranked_keywords_live([{
            "target": target_domain,
            "location_name": location,
            "language_name": language,
            "limit": limit
        }])

        result = response.to_dict() if hasattr(response, 'to_dict') else response

        if save:
            save_result(
                result,
                category="labs",
                operation="domain_keywords",
                keyword=target_domain
            )

        return result

    except ApiException as e:
        print(f"API Exception: {e}")
        raise


def get_competitors(
    keywords: List[str],
    location_name: str = None,
    language_name: str = None,
    limit: int = 20,
    save: bool = True
) -> Dict[str, Any]:
    """
    Find domains that compete for the same keywords.

    Args:
        keywords: Keywords to find competitors for
        location_name: Target location
        language_name: Target language
        limit: Maximum competitors to return
        save: Whether to save results

    Returns:
        Dict containing competitor domains and their metrics

    Example:
        >>> result = get_competitors(["video editing software", "best video editor"])
    """
    client = get_client()
    location = location_name or settings.DEFAULT_LOCATION_NAME
    language = language_name or settings.DEFAULT_LANGUAGE_NAME

    try:
        response = client.labs.google_competitors_domain_live([{
            "keywords": keywords,
            "location_name": location,
            "language_name": language,
            "limit": limit
        }])

        result = response.to_dict() if hasattr(response, 'to_dict') else response

        if save:
            save_result(
                result,
                category="labs",
                operation="competitors",
                keyword=keywords[0] if keywords else "bulk",
                extra_info=f"{len(keywords)}_keywords"
            )

        return result

    except ApiException as e:
        print(f"API Exception: {e}")
        raise

```
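The `depth` parameter of `get_related_keywords` grows the result set geometrically. Assuming each SERP "related searches" block contributes eight keywords per level — an assumption, but one consistent with the "~4680 results" figure in the docstring — the theoretical maximum at depth `d` is the geometric sum below:

```python
def max_related_keywords(depth):
    """Theoretical maximum results for a given depth, assuming each
    "related searches" block yields 8 keywords per expanded term.
    This is a back-of-envelope estimate, not an API guarantee."""
    return sum(8 ** level for level in range(depth + 1))


for d in range(5):
    print(d, max_related_keywords(d))
# depth 0 -> 1 (seed only); depth 4 -> 4681, matching the docstring's ~4680
```

This is worth keeping in mind for cost control: each step up in `depth` multiplies the candidate pool roughly eightfold, while `limit` only caps what is returned, not what is crawled.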

### scripts/api/serp.py

```python
"""SERP API - Google and YouTube search results data."""
import sys
from pathlib import Path
from typing import Dict, Any, Optional

# Add parent directory to path for imports
sys.path.insert(0, str(Path(__file__).parent.parent))

from dataforseo_client.rest import ApiException

from core.client import get_client
from core.storage import save_result
from config.settings import settings


def get_google_serp(
    keyword: str,
    location_name: str = None,
    language_name: str = None,
    depth: int = 100,
    device: str = "desktop",
    save: bool = True
) -> Dict[str, Any]:
    """
    Get Google organic search results for a keyword.

    Args:
        keyword: Search query
        location_name: Target location
        language_name: Target language
        depth: Number of results (max 700)
        device: Device type ("desktop" or "mobile")
        save: Whether to save results

    Returns:
        Dict containing SERP data with rankings, URLs, titles, and SERP features

    Example:
        >>> result = get_google_serp("best video editing software")
    """
    client = get_client()
    location = location_name or settings.DEFAULT_LOCATION_NAME
    language = language_name or settings.DEFAULT_LANGUAGE_NAME

    try:
        response = client.serp.google_organic_live_advanced([{
            "keyword": keyword,
            "location_name": location,
            "language_name": language,
            "depth": min(depth, 700),
            "device": device
        }])

        result = response.to_dict() if hasattr(response, 'to_dict') else response

        if save:
            save_result(
                result,
                category="serp",
                operation="google_organic",
                keyword=keyword,
                extra_info=device
            )

        return result

    except ApiException as e:
        print(f"API Exception: {e}")
        raise


def get_youtube_serp(
    keyword: str,
    location_name: str = None,
    language_name: str = None,
    depth: int = 20,
    device: str = "desktop",
    save: bool = True
) -> Dict[str, Any]:
    """
    Get YouTube organic search results for a keyword.

    Args:
        keyword: Search query (max 700 characters)
        location_name: Target location
        language_name: Target language
        depth: Number of results (max 700; billed in increments of 20)
        device: Device type ("desktop" or "mobile")
        save: Whether to save results

    Returns:
        Dict containing YouTube video rankings with titles, channels, views, etc.

    Example:
        >>> result = get_youtube_serp("python tutorial for beginners")
    """
    client = get_client()
    location = location_name or settings.DEFAULT_LOCATION_NAME
    language = language_name or settings.DEFAULT_LANGUAGE_NAME

    try:
        response = client.serp.youtube_organic_live_advanced([{
            "keyword": keyword,
            "location_name": location,
            "language_name": language,
            "depth": min(depth, 700),
            "device": device
        }])

        result = response.to_dict() if hasattr(response, 'to_dict') else response

        if save:
            save_result(
                result,
                category="serp",
                operation="youtube_organic",
                keyword=keyword,
                extra_info=device
            )

        return result

    except ApiException as e:
        print(f"API Exception: {e}")
        raise


def get_google_maps_serp(
    keyword: str,
    location_name: str = None,
    language_name: str = None,
    depth: int = 20,
    save: bool = True
) -> Dict[str, Any]:
    """
    Get Google Maps/Local search results for a keyword.

    Args:
        keyword: Search query (e.g., "restaurants near me")
        location_name: Target location
        language_name: Target language
        depth: Number of results
        save: Whether to save results

    Returns:
        Dict containing local business listings

    Example:
        >>> result = get_google_maps_serp("coffee shops downtown")
    """
    client = get_client()
    location = location_name or settings.DEFAULT_LOCATION_NAME
    language = language_name or settings.DEFAULT_LANGUAGE_NAME

    try:
        response = client.serp.google_maps_live_advanced([{
            "keyword": keyword,
            "location_name": location,
            "language_name": language,
            "depth": depth
        }])

        result = response.to_dict() if hasattr(response, 'to_dict') else response

        if save:
            save_result(
                result,
                category="serp",
                operation="google_maps",
                keyword=keyword
            )

        return result

    except ApiException as e:
        print(f"API Exception: {e}")
        raise


def get_google_news_serp(
    keyword: str,
    location_name: str = None,
    language_name: str = None,
    depth: int = 100,
    save: bool = True
) -> Dict[str, Any]:
    """
    Get Google News search results for a keyword.

    Args:
        keyword: Search query
        location_name: Target location
        language_name: Target language
        depth: Number of results
        save: Whether to save results

    Returns:
        Dict containing news articles and their rankings

    Example:
        >>> result = get_google_news_serp("artificial intelligence")
    """
    client = get_client()
    location = location_name or settings.DEFAULT_LOCATION_NAME
    language = language_name or settings.DEFAULT_LANGUAGE_NAME

    try:
        response = client.serp.google_news_live_advanced([{
            "keyword": keyword,
            "location_name": location,
            "language_name": language,
            "depth": depth
        }])

        result = response.to_dict() if hasattr(response, 'to_dict') else response

        if save:
            save_result(
                result,
                category="serp",
                operation="google_news",
                keyword=keyword
            )

        return result

    except ApiException as e:
        print(f"API Exception: {e}")
        raise


def get_google_images_serp(
    keyword: str,
    location_name: str = None,
    language_name: str = None,
    depth: int = 100,
    save: bool = True
) -> Dict[str, Any]:
    """
    Get Google Images search results for a keyword.

    Args:
        keyword: Search query
        location_name: Target location
        language_name: Target language
        depth: Number of results
        save: Whether to save results

    Returns:
        Dict containing image results with URLs, titles, sources

    Example:
        >>> result = get_google_images_serp("python programming logo")
    """
    client = get_client()
    location = location_name or settings.DEFAULT_LOCATION_NAME
    language = language_name or settings.DEFAULT_LANGUAGE_NAME

    try:
        response = client.serp.google_images_live_advanced([{
            "keyword": keyword,
            "location_name": location,
            "language_name": language,
            "depth": depth
        }])

        result = response.to_dict() if hasattr(response, 'to_dict') else response

        if save:
            save_result(
                result,
                category="serp",
                operation="google_images",
                keyword=keyword
            )

        return result

    except ApiException as e:
        print(f"API Exception: {e}")
        raise


def get_featured_snippet(
    keyword: str,
    location_name: str = None,
    language_name: str = None,
    save: bool = True
) -> Dict[str, Any]:
    """
    Get Google SERP with focus on featured snippets and SERP features.

    Args:
        keyword: Search query (ideally a question)
        location_name: Target location
        language_name: Target language
        save: Whether to save results

    Returns:
        Dict containing SERP data with featured snippet details

    Example:
        >>> result = get_featured_snippet("how to edit videos")
    """
    client = get_client()
    location = location_name or settings.DEFAULT_LOCATION_NAME
    language = language_name or settings.DEFAULT_LANGUAGE_NAME

    try:
        response = client.serp.google_organic_live_advanced([{
            "keyword": keyword,
            "location_name": location,
            "language_name": language,
            "depth": 10,
            "device": "desktop"
        }])

        result = response.to_dict() if hasattr(response, 'to_dict') else response

        if save:
            save_result(
                result,
                category="serp",
                operation="featured_snippet",
                keyword=keyword
            )

        return result

    except ApiException as e:
        print(f"API Exception: {e}")
        raise

```
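The SERP helpers above all return the raw DataForSEO response as a dict. To pull a specific item out of that dict — for example the featured snippet that `get_featured_snippet` targets — you can walk the `tasks → result → items` structure. The sketch below is a hypothetical helper (not part of the toolkit), assuming the standard live/advanced response shape where each item carries a `type` field:

```python
from typing import Any, Dict, Optional

def extract_featured_snippet(result: Dict[str, Any]) -> Optional[Dict[str, Any]]:
    """Return the first item of type 'featured_snippet', or None if absent."""
    for task in result.get("tasks") or []:
        for page in task.get("result") or []:
            for item in page.get("items") or []:
                if item.get("type") == "featured_snippet":
                    return item
    return None

# Minimal mocked response in the tasks -> result -> items shape
sample = {
    "tasks": [{
        "result": [{
            "items": [
                {"type": "organic", "title": "Some page"},
                {"type": "featured_snippet", "title": "How to edit videos",
                 "description": "Step 1: ..."},
            ]
        }]
    }]
}

snippet = extract_featured_snippet(sample)
print(snippet["title"] if snippet else "no snippet")  # How to edit videos
```

The same traversal works for other SERP features (`people_also_ask`, `local_pack`, and so on) by swapping the `type` filter.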

### scripts/api/trends.py

```python
"""Google Trends API - Trend data across Google Search, YouTube, News, Images, and Shopping."""
import sys
from pathlib import Path
from typing import List, Dict, Any, Optional

# Add parent directory to path for imports
sys.path.insert(0, str(Path(__file__).parent.parent))

from dataforseo_client.rest import ApiException

from core.client import get_client
from core.storage import save_result
from config.settings import settings


def get_trends_explore(
    keywords: List[str],
    location_name: str = None,
    search_type: str = "web",
    time_range: str = "past_12_months",
    date_from: str = None,
    date_to: str = None,
    category_code: int = None,
    save: bool = True
) -> Dict[str, Any]:
    """
    Get Google Trends data for keywords.

    Args:
        keywords: List of keywords to compare (max 5)
        location_name: Target location (defaults to worldwide if not specified)
        search_type: Type of search - "web", "news", "youtube", "images", "froogle" (shopping)
        time_range: Preset time range - "past_hour", "past_4_hours", "past_day",
                    "past_7_days", "past_month", "past_3_months", "past_12_months",
                    "past_5_years"
        date_from: Custom start date (yyyy-mm-dd), overrides time_range
        date_to: Custom end date (yyyy-mm-dd)
        category_code: Google Trends category filter
        save: Whether to save results

    Returns:
        Dict containing trend graph data, regional interest, related topics and queries

    Example:
        >>> result = get_trends_explore(["python", "javascript"], search_type="youtube")
        >>> result = get_trends_explore(["ai video editing"], time_range="past_12_months")
    """
    client = get_client()
    location = location_name or settings.DEFAULT_LOCATION_NAME

    request_params = {
        "keywords": keywords[:5],  # API limit
        "location_name": location,
        "type": search_type
    }

    # Add time parameters
    if date_from and date_to:
        request_params["date_from"] = date_from
        request_params["date_to"] = date_to
    else:
        request_params["time_range"] = time_range

    if category_code is not None:  # explicit check so category 0 is not dropped
        request_params["category_code"] = category_code

    try:
        response = client.keywords_data.google_trends_explore_live([request_params])

        result = response.to_dict() if hasattr(response, 'to_dict') else response

        if save:
            save_result(
                result,
                category="trends",
                operation="explore",
                keyword="_vs_".join(keywords[:3]),
                extra_info=f"{search_type}_{time_range}"
            )

        return result

    except ApiException as e:
        print(f"API Exception: {e}")
        raise


def get_youtube_trends(
    keywords: List[str],
    location_name: Optional[str] = None,
    time_range: str = "past_12_months",
    save: bool = True
) -> Dict[str, Any]:
    """
    Get YouTube-specific trend data for keywords.

    Convenience wrapper for get_trends_explore with YouTube search type.

    Args:
        keywords: List of keywords to compare (max 5)
        location_name: Target location
        time_range: Time range for trend data
        save: Whether to save results

    Returns:
        Dict containing YouTube trend data

    Example:
        >>> result = get_youtube_trends(["shorts tutorial", "youtube shorts"])
    """
    return get_trends_explore(
        keywords=keywords,
        location_name=location_name,
        search_type="youtube",
        time_range=time_range,
        save=save
    )


def get_news_trends(
    keywords: List[str],
    location_name: Optional[str] = None,
    time_range: str = "past_12_months",
    save: bool = True
) -> Dict[str, Any]:
    """
    Get Google News trend data for keywords.

    Args:
        keywords: List of keywords to compare (max 5)
        location_name: Target location
        time_range: Time range for trend data
        save: Whether to save results

    Returns:
        Dict containing news trend data

    Example:
        >>> result = get_news_trends(["artificial intelligence", "machine learning"])
    """
    return get_trends_explore(
        keywords=keywords,
        location_name=location_name,
        search_type="news",
        time_range=time_range,
        save=save
    )


def get_shopping_trends(
    keywords: List[str],
    location_name: Optional[str] = None,
    time_range: str = "past_12_months",
    save: bool = True
) -> Dict[str, Any]:
    """
    Get Google Shopping trend data for keywords.

    Args:
        keywords: List of keywords to compare (max 5)
        location_name: Target location
        time_range: Time range for trend data
        save: Whether to save results

    Returns:
        Dict containing shopping/e-commerce trend data

    Example:
        >>> result = get_shopping_trends(["wireless earbuds", "bluetooth headphones"])
    """
    return get_trends_explore(
        keywords=keywords,
        location_name=location_name,
        search_type="froogle",  # Google Shopping
        time_range=time_range,
        save=save
    )


def compare_keyword_trends(
    keywords: List[str],
    location_name: Optional[str] = None,
    search_types: Optional[List[str]] = None,
    time_range: str = "past_12_months",
    save: bool = True
) -> Dict[str, Dict[str, Any]]:
    """
    Compare keyword trends across multiple search types.

    Args:
        keywords: Keywords to compare (max 5)
        location_name: Target location
        search_types: List of search types to compare (defaults to web, youtube)
        time_range: Time range
        save: Whether to save individual results

    Returns:
        Dict with search_type keys and trend data values

    Example:
        >>> result = compare_keyword_trends(
        ...     ["video editing tutorial"],
        ...     search_types=["web", "youtube", "images"]
        ... )
    """
    if search_types is None:
        search_types = ["web", "youtube"]

    results = {}
    for search_type in search_types:
        results[search_type] = get_trends_explore(
            keywords=keywords,
            location_name=location_name,
            search_type=search_type,
            time_range=time_range,
            save=save
        )

    return results


def get_trending_now(
    location_name: Optional[str] = None,
    save: bool = True
) -> Dict[str, Any]:
    """
    Get currently trending searches.

    Args:
        location_name: Target location
        save: Whether to save results

    Returns:
        Dict containing trending searches

    Example:
        >>> result = get_trending_now()
    """
    client = get_client()
    location = location_name or settings.DEFAULT_LOCATION_NAME

    try:
        response = client.keywords_data.google_trends_trending_now_live([{
            "location_name": location
        }])

        result = response.to_dict() if hasattr(response, 'to_dict') else response

        if save:
            save_result(
                result,
                category="trends",
                operation="trending_now",
                keyword=location
            )

        return result

    except ApiException as e:
        print(f"API Exception: {e}")
        raise

```
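A common follow-up to `get_trends_explore` or `compare_keyword_trends` is summarizing the interest-over-time series per keyword. The helper below is a sketch, assuming the `google_trends_graph` item shape where each entry in `item["data"]` holds a `values` list aligned with `item["keywords"]`; verify the shape against your own saved results before relying on it:

```python
from statistics import mean
from typing import Any, Dict, List

def average_interest(result: Dict[str, Any]) -> List[float]:
    """Mean interest-over-time per keyword from one trends result dict."""
    for task in result.get("tasks") or []:
        for page in task.get("result") or []:
            for item in page.get("items") or []:
                if item.get("type") == "google_trends_graph":
                    series = [e["values"] for e in item["data"] if e.get("values")]
                    # Transpose rows (time points) into columns (keywords)
                    return [round(mean(col), 1) for col in zip(*series)]
    return []

# Mocked result: two keywords, two time points
sample = {"tasks": [{"result": [{"items": [{
    "type": "google_trends_graph",
    "keywords": ["python", "javascript"],
    "data": [{"values": [60, 40]}, {"values": [80, 20]}],
}]}]}]}

print(average_interest(sample))  # [70.0, 30.0]
```

Pairing the returned averages with `item["keywords"]` gives a quick ranking of which term sustains more interest over the chosen time range.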

### scripts/config/__init__.py

```python
"""Configuration module for DataForSEO toolkit."""
from .settings import settings

__all__ = ["settings"]

```

### scripts/config/settings.py

```python
"""Configuration management for DataForSEO toolkit."""
import os
from pathlib import Path
from dotenv import load_dotenv

# Load environment variables from .env file in current working directory
load_dotenv()
# Note: load_dotenv() searches for .env starting from the current working
# directory and walking upward, so running from a subdirectory still works.


class Settings:
    """Application settings loaded from environment."""

    # Authentication
    DATAFORSEO_LOGIN: str = os.getenv("DATAFORSEO_LOGIN", "")
    DATAFORSEO_PASSWORD: str = os.getenv("DATAFORSEO_PASSWORD", "")

    # Default location/language settings
    DEFAULT_LOCATION_NAME: str = "United States"
    DEFAULT_LOCATION_CODE: int = 2840
    DEFAULT_LANGUAGE_NAME: str = "English"
    DEFAULT_LANGUAGE_CODE: str = "en"

    # Results storage (saves to current working directory)
    RESULTS_DIR: Path = Path.cwd() / "results"

    # API limits (for reference)
    MAX_KEYWORDS_SEARCH_VOLUME: int = 700
    MAX_KEYWORDS_OVERVIEW: int = 700
    MAX_KEYWORDS_DIFFICULTY: int = 1000
    MAX_KEYWORDS_IDEAS: int = 200
    MAX_TRENDS_KEYWORDS: int = 5

    @classmethod
    def validate(cls) -> bool:
        """Validate required settings are present."""
        if not cls.DATAFORSEO_LOGIN or not cls.DATAFORSEO_PASSWORD:
            raise ValueError(
                "DATAFORSEO_LOGIN and DATAFORSEO_PASSWORD must be set in .env file. "
                "Get your credentials from https://app.dataforseo.com/api-access"
            )
        return True


settings = Settings()

```
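`settings.py` above reads its credentials from a `.env` file at (or above) the directory you run the scripts from. A minimal file looks like this, with placeholder values standing in for the login and password from your DataForSEO dashboard:

```bash
DATAFORSEO_LOGIN=your_login@example.com
DATAFORSEO_PASSWORD=your_api_password
```

`Settings.validate()` raises a `ValueError` pointing at the API access page if either variable is missing or empty.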

### scripts/core/__init__.py

```python
"""Core module with client and storage utilities."""
from .client import get_client
from .storage import save_result, load_result, list_results

__all__ = ["get_client", "save_result", "load_result", "list_results"]

```

### scripts/core/client.py

```python
"""DataForSEO API client initialization."""
import sys
from pathlib import Path

# Add parent directory to path for imports
sys.path.insert(0, str(Path(__file__).parent.parent))

from dataforseo_client import configuration as dfs_config
from dataforseo_client import api_client as dfs_api_provider
from dataforseo_client.api.serp_api import SerpApi
from dataforseo_client.api.keywords_data_api import KeywordsDataApi
from dataforseo_client.api.dataforseo_labs_api import DataforseoLabsApi

from config.settings import settings


class DataForSEOClient:
    """Singleton client manager for DataForSEO APIs."""

    _instance = None
    _api_client = None
    _configuration = None

    def __new__(cls):
        if cls._instance is None:
            cls._instance = super().__new__(cls)
            cls._instance._initialize()
        return cls._instance

    def _initialize(self):
        """Initialize the API client with credentials."""
        settings.validate()
        self._configuration = dfs_config.Configuration(
            username=settings.DATAFORSEO_LOGIN,
            password=settings.DATAFORSEO_PASSWORD
        )
        self._api_client = dfs_api_provider.ApiClient(self._configuration)

    @property
    def serp(self) -> SerpApi:
        """Get SERP API instance."""
        return SerpApi(self._api_client)

    @property
    def keywords_data(self) -> KeywordsDataApi:
        """Get Keywords Data API instance."""
        return KeywordsDataApi(self._api_client)

    @property
    def labs(self) -> DataforseoLabsApi:
        """Get DataForSEO Labs API instance."""
        return DataforseoLabsApi(self._api_client)

    @property
    def api_client(self):
        """Get raw API client for custom requests."""
        return self._api_client

    def close(self):
        """Close the API client connection."""
        if self._api_client:
            self._api_client.close()


def get_client() -> DataForSEOClient:
    """Get or create the DataForSEO client instance."""
    return DataForSEOClient()

```
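`client.py` implements `DataForSEOClient` as a `__new__`-based singleton, so every `get_client()` call returns the same object and reuses one authenticated `ApiClient`. A stripped-down sketch of that pattern (with the API setup replaced by a counter) shows the behavior:

```python
class Singleton:
    """Minimal version of the __new__-based singleton used by DataForSEOClient."""
    _instance = None

    def __new__(cls):
        if cls._instance is None:
            cls._instance = super().__new__(cls)
            cls._instance._initialize()
        return cls._instance

    def _initialize(self):
        # Runs exactly once, on the first construction
        self.calls = 0

a = Singleton()
b = Singleton()
a.calls += 1
print(a is b, b.calls)  # True 1
```

Because `_initialize` only runs on first construction, credentials are validated once per process; the trade-off is that changing credentials at runtime requires restarting (or manually resetting `_instance`).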

### scripts/core/storage.py

```python
"""Result storage utilities for persisting API responses."""
import json
import sys
from datetime import datetime
from pathlib import Path
from typing import Any, Dict, List, Optional

# Add parent directory to path for imports
sys.path.insert(0, str(Path(__file__).parent.parent))

from config.settings import settings


def get_timestamp() -> str:
    """Generate timestamp for filenames."""
    return datetime.now().strftime("%Y%m%d_%H%M%S")


def sanitize_filename(name: str) -> str:
    """Sanitize string for use as filename."""
    for char in ['/', '\\', ':', '*', '?', '"', '<', '>', '|', ' ']:
        name = name.replace(char, '_')
    return name[:100]  # Limit length


def save_result(
    data: Any,
    category: str,
    operation: str,
    keyword: Optional[str] = None,
    extra_info: Optional[str] = None
) -> Path:
    """
    Save API result to JSON file.

    Args:
        data: The API response data to save
        category: Result category (keywords_data, labs, serp, trends)
        operation: Specific operation name (e.g., search_volume, keyword_suggestions)
        keyword: Primary keyword(s) used in the request
        extra_info: Additional info for filename

    Returns:
        Path to the saved file
    """
    # Ensure results directory exists
    category_dir = settings.RESULTS_DIR / category
    category_dir.mkdir(parents=True, exist_ok=True)

    # Build filename
    timestamp = get_timestamp()
    parts = [timestamp, operation]

    if keyword:
        parts.append(sanitize_filename(keyword))
    if extra_info:
        parts.append(sanitize_filename(extra_info))

    filename = "__".join(parts) + ".json"
    filepath = category_dir / filename

    # Prepare data for JSON serialization
    if hasattr(data, 'to_dict'):
        data = data.to_dict()

    # Save with metadata wrapper
    result_wrapper = {
        "metadata": {
            "saved_at": datetime.now().isoformat(),
            "category": category,
            "operation": operation,
            "keyword": keyword,
            "extra_info": extra_info
        },
        "data": data
    }

    with open(filepath, 'w', encoding='utf-8') as f:
        json.dump(result_wrapper, f, indent=2, ensure_ascii=False, default=str)

    print(f"Results saved to: {filepath}")
    return filepath


def load_result(filepath: Path) -> Dict[str, Any]:
    """Load a previously saved result."""
    with open(filepath, 'r', encoding='utf-8') as f:
        return json.load(f)


def list_results(
    category: Optional[str] = None,
    operation: Optional[str] = None,
    limit: int = 50
) -> List[Path]:
    """
    List saved result files, optionally filtered by category/operation.

    Args:
        category: Filter by category (keywords_data, labs, serp, trends)
        operation: Filter by operation name
        limit: Maximum files to return

    Returns:
        List of file paths, sorted by most recent first
    """
    base_dir = settings.RESULTS_DIR

    if category:
        base_dir = base_dir / category

    if not base_dir.exists():
        return []

    pattern = f"*{operation}*" if operation else "*"
    # Sort by filename (which starts with the timestamp) rather than full path,
    # so results from different category subdirectories still sort newest-first
    files = sorted(base_dir.glob(f"**/{pattern}.json"), key=lambda p: p.name, reverse=True)
    return files[:limit]


def get_latest_result(category: str, operation: Optional[str] = None) -> Optional[Dict]:
    """
    Get the most recent result for a category/operation.

    Args:
        category: Result category
        operation: Specific operation (optional)

    Returns:
        The loaded result data or None
    """
    files = list_results(category=category, operation=operation, limit=1)
    if files:
        return load_result(files[0])
    return None

```
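The filenames `save_result` produces follow the pattern `<timestamp>__<operation>__<keyword>__<extra>.json`, with unsafe characters replaced by underscores. This standalone sketch mirrors that logic (the timestamp is fixed here for illustration; `save_result` uses `datetime.now()`):

```python
def sanitize_filename(name: str) -> str:
    """Same substitution list as storage.py: replace filesystem-unsafe chars."""
    for char in ['/', '\\', ':', '*', '?', '"', '<', '>', '|', ' ']:
        name = name.replace(char, '_')
    return name[:100]

timestamp = "20250101_120000"  # save_result derives this from the current time
parts = [timestamp, "search_volume", sanitize_filename("coffee shops: downtown")]
filename = "__".join(parts) + ".json"
print(filename)  # 20250101_120000__search_volume__coffee_shops__downtown.json
```

Because the timestamp leads the filename, a plain descending sort on names is what lets `list_results` return the most recent files first.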

### scripts/setup.sh

```bash
#!/bin/bash
# Abort on the first failed command or unset variable
set -euo pipefail
pip install -r "$(dirname "$0")/requirements.txt"

```
