# alicloud-platform-openapi-product-api-discovery
Discover and reconcile Alibaba Cloud product catalogs from Ticket System, Support & Service, and BSS OpenAPI; fetch OpenAPI product/version/API metadata; and summarize API coverage to plan new skills. Use when you need a complete product list, product-to-API mapping, or coverage/gap reports for skill generation.
## Packaged view
This page reorganizes the original catalog entry around fit, installability, and workflow context first. The original raw source lives below.
## Install command
npx @skill-hub/cli install openclaw-skills-alicloud-platform-openapi-product-api-discovery
## Repository
Skill path: skills/cinience/alicloud-platform-openapi-product-api-discovery
## Best for
Primary workflow: Research & Ops.
Technical facets: Full Stack, Backend.
Target audience: everyone.
License: Unknown.
## Original source
Catalog source: SkillHub Club.
Repository owner: openclaw.
This is a mirrored public skill entry. Review the repository before installing it into production workflows.
## What it helps with
- Install alicloud-platform-openapi-product-api-discovery into Claude Code, Codex CLI, Gemini CLI, or OpenCode workflows
- Review https://github.com/openclaw/skills before adding alicloud-platform-openapi-product-api-discovery to shared team environments
- Use alicloud-platform-openapi-product-api-discovery for development workflows
## Works across
Favorites: 0.
Sub-skills: 0.
Aggregator: No.
## Original source / Raw SKILL.md
---
name: alicloud-platform-openapi-product-api-discovery
description: Discover and reconcile Alibaba Cloud product catalogs from Ticket System, Support & Service, and BSS OpenAPI; fetch OpenAPI product/version/API metadata; and summarize API coverage to plan new skills. Use when you need a complete product list, product-to-API mapping, or coverage/gap reports for skill generation.
version: 1.0.0
---
# Alibaba Cloud Product + API Discovery
Follow this workflow to collect products, resolve API metadata, and build summaries for skill planning.
## Workflow
1) Fetch product lists from the three sources
- Ticket System (ListProducts)
- Support & Service (ListProductByGroup)
- BSS OpenAPI (QueryProductList)
Run the bundled scripts (from this skill folder):
```bash
python scripts/products_from_ticket_system.py
python scripts/products_from_support_service.py
python scripts/products_from_bssopenapi.py
```
Provide required env vars in each script (see references).
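Before launching the fetchers, it can help to fail fast on missing credentials. A minimal preflight sketch (the helper name is my own; the env var names come from this skill's scripts):

```python
import os

# Required credential env vars, as documented by the fetch scripts.
REQUIRED_VARS = ("ALICLOUD_ACCESS_KEY_ID", "ALICLOUD_ACCESS_KEY_SECRET")


def missing_env(required=REQUIRED_VARS):
    """Return the names of required env vars that are unset or empty."""
    return [name for name in required if not os.getenv(name)]
```

Run this once up front instead of letting each script exit separately on its first missing variable.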
2) Merge product lists
```bash
python scripts/merge_product_sources.py
```
This writes `output/product-scan/merged_products.json` and `.md`.
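The dedup rule the merge applies, keying on product code when present and otherwise on product name, with later sources only filling empty fields, can be sketched on inline sample records (field names mirror the script's normalized output):

```python
def merge_products(records):
    """Dedup normalized product records: key on product_code, falling back to
    product_name; first occurrence wins, later sources fill missing fields only."""
    deduped = {}
    for item in records:
        key = (item.get("product_code") or item.get("product_name") or "").strip().lower()
        if key not in deduped:
            entry = dict(item)
            entry["sources"] = [entry.pop("source")]
            deduped[key] = entry
            continue
        existing = deduped[key]
        if item["source"] not in existing["sources"]:
            existing["sources"].append(item["source"])
        for field, value in item.items():
            # Only fill fields the earlier source left empty.
            if field != "source" and not existing.get(field) and value:
                existing[field] = value
    return list(deduped.values())


# Inline sample: same product seen by two sources under the same key.
sample = [
    {"source": "ticket-system", "product_name": "ECS", "product_code": None},
    {"source": "bssopenapi", "product_name": "ECS", "product_code": "ecs"},
]
```

With the sample above, the ticket-system record (keyed by name) and the BSS record (keyed by code) collapse into one product carrying both sources, with `product_code` filled in by the later source.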
3) Fetch OpenAPI metadata product list
```bash
python scripts/products_from_openapi_meta.py
```
This writes `output/product-scan/openapi-meta/products.json` and `products_normalized.json`.
4) Fetch OpenAPI API docs per product/version
```bash
python scripts/apis_from_openapi_meta.py
```
A full run fetches API docs for every product/version pair and can be large. Use filters for dry runs:
- `OPENAPI_META_MAX_PRODUCTS=10`
- `OPENAPI_META_PRODUCTS=Ecs,Ons`
- `OPENAPI_META_VERSIONS=2014-05-26`
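All three filters are plain comma-separated env vars. A sketch of the parsing rule (mirroring the script's `parse_list`, where an unset variable means no filtering):

```python
import os


def parse_filter(name):
    """Split a comma-separated env var into a set of trimmed, non-empty
    tokens; an unset or empty variable means no filtering (empty set)."""
    value = os.getenv(name)
    if not value:
        return set()
    return {item.strip() for item in value.split(",") if item.strip()}
```

For example, `OPENAPI_META_PRODUCTS="Ecs, Ons"` yields the set `{"Ecs", "Ons"}`, and product codes outside that set are skipped.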
5) Join products with API counts
```bash
python scripts/join_products_with_api_meta.py
```
6) Summarize products by category/group
```bash
python scripts/summarize_openapi_meta_products.py
```
7) (Optional) Compare products vs existing skills
```bash
python scripts/analyze_products_vs_skills.py
```
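The optional comparison is substring matching: a product counts as covered when its lowercased name or code (at least two characters) appears somewhere in a skill's SKILL.md text. A minimal sketch of that rule:

```python
def matched_skills(product, skills):
    """Return the paths of skills whose SKILL.md text mentions the
    product's name or code (case-insensitive substring match)."""
    tokens = [
        t
        for t in (
            (product.get("product_name") or "").strip().lower(),
            (product.get("product_code") or "").strip().lower(),
        )
        if len(t) >= 2  # skip empty and single-character tokens
    ]
    return [s["path"] for s in skills if any(tok in s["text"] for tok in tokens)]


# Inline sample: one skill whose text mentions "ecs".
skills = [{"path": "compute/ecs/SKILL.md", "text": "manage ecs instances"}]
```

Substring matching is deliberately loose; short or generic product names can produce false positives, so treat the gap report as a starting point rather than ground truth.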
## Output discipline
All generated files must go under `output/`. Do not place temporary files elsewhere.
## Validation
```bash
mkdir -p output/alicloud-platform-openapi-product-api-discovery
for f in skills/platform/openapi/alicloud-platform-openapi-product-api-discovery/scripts/*.py; do
  python3 -m py_compile "$f"
done
echo "py_compile_ok" > output/alicloud-platform-openapi-product-api-discovery/validate.txt
```
Pass criteria: command exits 0 and `output/alicloud-platform-openapi-product-api-discovery/validate.txt` is generated.
## Output And Evidence
- Save artifacts, command outputs, and API response summaries under `output/alicloud-platform-openapi-product-api-discovery/`.
- Include key parameters (region/resource id/time range) in evidence files for reproducibility.
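A small evidence record could pair the run parameters with the artifacts they produced; this layout and the helper name are my own, not part of the skill:

```python
import json
from pathlib import Path


def write_evidence(out_dir, params, artifacts):
    """Write a small evidence.json pairing run parameters (region, filters,
    time range) with the artifact paths they produced."""
    out_dir = Path(out_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    record = {"parameters": params, "artifacts": [str(a) for a in artifacts]}
    out_file = out_dir / "evidence.json"
    out_file.write_text(json.dumps(record, ensure_ascii=False, indent=2), encoding="utf-8")
    return out_file
```

Recording parameters next to artifacts makes a scan reproducible: the same filters and region can be replayed later to regenerate the same outputs.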
## Prerequisites
- Configure least-privilege Alibaba Cloud credentials before execution.
- Prefer environment variables: `ALICLOUD_ACCESS_KEY_ID`, `ALICLOUD_ACCESS_KEY_SECRET`, optional `ALICLOUD_REGION_ID`.
- If region is unclear, ask the user before running mutating operations.
## References
- Product source APIs: see `references/product-sources.md`
- OpenAPI meta endpoints: see `references/openapi-meta.md`
---
## Referenced Files
> The following files are referenced in this skill and included for context.
### scripts/products_from_ticket_system.py
```python
#!/usr/bin/env python3
"""Fetch products via Ticket System ListProducts API.

Requires:
- ALICLOUD_ACCESS_KEY_ID
- ALICLOUD_ACCESS_KEY_SECRET
- TICKET_ENDPOINT (e.g. <product_code>.<region>.aliyuncs.com)

Optional:
- ALICLOUD_SECURITY_TOKEN / ALIBABA_CLOUD_SECURITY_TOKEN (STS session token)
- TICKET_VERSION (default: 2021-06-10)
- TICKET_LANGUAGE (zh|en|jp)
- TICKET_NAME (fuzzy name filter)
- OUTPUT_DIR (default: output)
"""
from __future__ import annotations

import json
import os
import sys
from pathlib import Path


def get_env(name: str, default: str | None = None) -> str:
    value = os.getenv(name, default)
    if not value:
        print(f"Missing env var: {name}", file=sys.stderr)
        sys.exit(1)
    return value


def main() -> None:
    try:
        from aliyunsdkcore.auth.credentials import StsTokenCredential
        from aliyunsdkcore.client import AcsClient
        from aliyunsdkcore.request import CommonRequest
    except Exception:
        print("Missing SDK. Install: pip install aliyun-python-sdk-core", file=sys.stderr)
        sys.exit(1)

    access_key_id = get_env("ALICLOUD_ACCESS_KEY_ID")
    access_key_secret = get_env("ALICLOUD_ACCESS_KEY_SECRET")
    security_token = os.getenv("ALICLOUD_SECURITY_TOKEN") or os.getenv("ALIBABA_CLOUD_SECURITY_TOKEN")
    endpoint = get_env("TICKET_ENDPOINT")
    version = os.getenv("TICKET_VERSION", "2021-06-10")

    # An STS token must go through an explicit credential object; passing it as
    # the fourth positional argument would land in AcsClient's auto_retry slot.
    if security_token:
        credential = StsTokenCredential(access_key_id, access_key_secret, security_token)
        client = AcsClient(region_id="cn-hangzhou", credential=credential)
    else:
        client = AcsClient(access_key_id, access_key_secret, "cn-hangzhou")

    request = CommonRequest()
    request.set_domain(endpoint)
    request.set_version(version)
    request.set_action_name("ListProducts")
    request.set_method("GET")
    name = os.getenv("TICKET_NAME")
    language = os.getenv("TICKET_LANGUAGE")
    if name:
        request.add_query_param("Name", name)
    if language:
        request.add_query_param("Language", language)

    response = client.do_action_with_exception(request)
    data = json.loads(response.decode("utf-8"))

    output_dir = Path(os.getenv("OUTPUT_DIR", "output")) / "product-scan" / "ticket-system"
    output_dir.mkdir(parents=True, exist_ok=True)
    out_file = output_dir / "products.json"
    out_file.write_text(json.dumps(data, ensure_ascii=False, indent=2), encoding="utf-8")
    print(f"Saved: {out_file}")


if __name__ == "__main__":
    main()
```
### scripts/products_from_support_service.py
```python
#!/usr/bin/env python3
"""Fetch products via Support ListProductByGroup API.

Requires:
- ALICLOUD_ACCESS_KEY_ID
- ALICLOUD_ACCESS_KEY_SECRET
- SUPPORT_ENDPOINT (e.g. <product_code>.<region>.aliyuncs.com)
- SUPPORT_VERSION (API version for Support service)
- SUPPORT_GROUP_ID (OpenGroupId)

Optional:
- ALICLOUD_SECURITY_TOKEN / ALIBABA_CLOUD_SECURITY_TOKEN (STS session token)
- OUTPUT_DIR (default: output)
"""
from __future__ import annotations

import json
import os
import sys
from pathlib import Path


def get_env(name: str, default: str | None = None) -> str:
    value = os.getenv(name, default)
    if not value:
        print(f"Missing env var: {name}", file=sys.stderr)
        sys.exit(1)
    return value


def parse_data_field(raw: str | list | dict | None) -> list[dict]:
    if isinstance(raw, list):
        return raw
    if isinstance(raw, dict):
        return [raw]
    if isinstance(raw, str):
        raw = raw.strip()
        if not raw:
            return []
        try:
            parsed = json.loads(raw)
            if isinstance(parsed, list):
                return parsed
            if isinstance(parsed, dict):
                return [parsed]
        except json.JSONDecodeError:
            return []
    return []


def main() -> None:
    try:
        from aliyunsdkcore.auth.credentials import StsTokenCredential
        from aliyunsdkcore.client import AcsClient
        from aliyunsdkcore.request import CommonRequest
    except Exception:
        print("Missing SDK. Install: pip install aliyun-python-sdk-core", file=sys.stderr)
        sys.exit(1)

    access_key_id = get_env("ALICLOUD_ACCESS_KEY_ID")
    access_key_secret = get_env("ALICLOUD_ACCESS_KEY_SECRET")
    security_token = os.getenv("ALICLOUD_SECURITY_TOKEN") or os.getenv("ALIBABA_CLOUD_SECURITY_TOKEN")
    endpoint = get_env("SUPPORT_ENDPOINT")
    version = get_env("SUPPORT_VERSION")
    group_id = get_env("SUPPORT_GROUP_ID")

    # An STS token must go through an explicit credential object; passing it as
    # the fourth positional argument would land in AcsClient's auto_retry slot.
    if security_token:
        credential = StsTokenCredential(access_key_id, access_key_secret, security_token)
        client = AcsClient(region_id="cn-hangzhou", credential=credential)
    else:
        client = AcsClient(access_key_id, access_key_secret, "cn-hangzhou")

    request = CommonRequest()
    request.set_domain(endpoint)
    request.set_version(version)
    request.set_action_name("ListProductByGroup")
    request.set_method("GET")
    request.add_query_param("OpenGroupId", group_id)

    response = client.do_action_with_exception(request)
    data = json.loads(response.decode("utf-8"))
    data["DataParsed"] = parse_data_field(data.get("Data"))

    output_dir = Path(os.getenv("OUTPUT_DIR", "output")) / "product-scan" / "support-service"
    output_dir.mkdir(parents=True, exist_ok=True)
    out_file = output_dir / "products.json"
    out_file.write_text(json.dumps(data, ensure_ascii=False, indent=2), encoding="utf-8")
    print(f"Saved: {out_file}")


if __name__ == "__main__":
    main()
```
### scripts/products_from_bssopenapi.py
```python
#!/usr/bin/env python3
"""Fetch products via BssOpenApi QueryProductList API.

Requires:
- ALICLOUD_ACCESS_KEY_ID
- ALICLOUD_ACCESS_KEY_SECRET

Optional:
- ALICLOUD_SECURITY_TOKEN / ALIBABA_CLOUD_SECURITY_TOKEN (STS session token)
- BSS_ENDPOINT (default: business.aliyuncs.com)
- BSS_VERSION (default: 2017-12-14)
- BSS_PAGE_SIZE (default: 50)
- OUTPUT_DIR (default: output)
"""
from __future__ import annotations

import json
import os
import sys
from pathlib import Path


def get_int(name: str, default: int) -> int:
    value = os.getenv(name)
    if not value:
        return default
    try:
        return int(value)
    except ValueError:
        print(f"Invalid int env var {name}: {value}", file=sys.stderr)
        sys.exit(1)


def main() -> None:
    try:
        from aliyunsdkcore.auth.credentials import StsTokenCredential
        from aliyunsdkcore.client import AcsClient
        from aliyunsdkcore.request import CommonRequest
    except Exception:
        print("Missing SDK. Install: pip install aliyun-python-sdk-core", file=sys.stderr)
        sys.exit(1)

    access_key_id = os.getenv("ALICLOUD_ACCESS_KEY_ID")
    access_key_secret = os.getenv("ALICLOUD_ACCESS_KEY_SECRET")
    security_token = os.getenv("ALICLOUD_SECURITY_TOKEN") or os.getenv("ALIBABA_CLOUD_SECURITY_TOKEN")
    if not access_key_id or not access_key_secret:
        print("Missing ALICLOUD_ACCESS_KEY_ID or ALICLOUD_ACCESS_KEY_SECRET", file=sys.stderr)
        sys.exit(1)

    endpoint = os.getenv("BSS_ENDPOINT", "business.aliyuncs.com")
    version = os.getenv("BSS_VERSION", "2017-12-14")
    page_size = get_int("BSS_PAGE_SIZE", 50)

    # An STS token must go through an explicit credential object; passing it as
    # the fourth positional argument would land in AcsClient's auto_retry slot.
    if security_token:
        credential = StsTokenCredential(access_key_id, access_key_secret, security_token)
        client = AcsClient(region_id="cn-hangzhou", credential=credential)
    else:
        client = AcsClient(access_key_id, access_key_secret, "cn-hangzhou")

    products: list[dict] = []
    page_num = 1
    total_count = None
    while True:
        request = CommonRequest()
        request.set_domain(endpoint)
        request.set_version(version)
        request.set_action_name("QueryProductList")
        request.set_method("GET")
        request.add_query_param("PageNum", page_num)
        request.add_query_param("PageSize", page_size)
        request.add_query_param("QueryTotalCount", True)

        response = client.do_action_with_exception(request)
        payload = json.loads(response.decode("utf-8"))
        data = payload.get("Data") or {}
        if total_count is None:
            total_count = data.get("TotalCount")

        product_list = data.get("ProductList") or {}
        product_items = product_list.get("Product") or []
        if isinstance(product_items, dict):
            product_items = [product_items]
        if not product_items:
            break
        products.extend(product_items)
        if total_count is not None and len(products) >= int(total_count):
            break
        page_num += 1

    output_dir = Path(os.getenv("OUTPUT_DIR", "output")) / "product-scan" / "bssopenapi"
    output_dir.mkdir(parents=True, exist_ok=True)
    out_file = output_dir / "products.json"
    out_file.write_text(
        json.dumps({"TotalCount": total_count, "Products": products}, ensure_ascii=False, indent=2),
        encoding="utf-8",
    )
    print(f"Saved: {out_file}")


if __name__ == "__main__":
    main()
```
### scripts/merge_product_sources.py
```python
#!/usr/bin/env python3
"""Merge product outputs from ticket system, support service, and BSS OpenAPI."""
from __future__ import annotations

import json
import os
from pathlib import Path

BASE_DIR = Path(os.getenv("OUTPUT_DIR", "output")) / "product-scan"


def load_json(path: Path) -> dict:
    if not path.exists():
        return {}
    return json.loads(path.read_text(encoding="utf-8"))


def normalize_ticket(data: dict) -> list[dict]:
    results: list[dict] = []
    for directory in data.get("Data") or []:
        directory_name = directory.get("DirectoryName")
        directory_id = directory.get("DirectoryId")
        for product in directory.get("ProductList") or []:
            results.append(
                {
                    "source": "ticket-system",
                    "product_name": product.get("ProductName"),
                    "product_id": product.get("ProductId"),
                    "directory_name": directory_name,
                    "directory_id": directory_id,
                }
            )
    return results


def normalize_support(data: dict) -> list[dict]:
    results: list[dict] = []
    for product in data.get("DataParsed") or []:
        results.append(
            {
                "source": "support-service",
                "product_name": product.get("ProductName") or product.get("productName"),
                "product_code": product.get("ProductCode") or product.get("productCode"),
            }
        )
    return results


def normalize_bss(data: dict) -> list[dict]:
    results: list[dict] = []
    for product in data.get("Products") or []:
        results.append(
            {
                "source": "bssopenapi",
                "product_name": product.get("ProductName"),
                "product_code": product.get("ProductCode"),
                "product_type": product.get("ProductType"),
                "subscription_type": product.get("SubscriptionType"),
            }
        )
    return results


def normalize_key(item: dict) -> str:
    product_code = item.get("product_code")
    product_name = item.get("product_name")
    if product_code:
        return str(product_code).strip().lower()
    if product_name:
        return str(product_name).strip().lower()
    return ""


def main() -> None:
    ticket_data = load_json(BASE_DIR / "ticket-system" / "products.json")
    support_data = load_json(BASE_DIR / "support-service" / "products.json")
    bss_data = load_json(BASE_DIR / "bssopenapi" / "products.json")
    merged = normalize_ticket(ticket_data) + normalize_support(support_data) + normalize_bss(bss_data)

    deduped: dict[str, dict] = {}
    for item in merged:
        key = normalize_key(item)
        if not key:
            key = f"{item.get('source', 'unknown')}:{len(deduped)}"
        if key not in deduped:
            deduped[key] = item
            deduped[key]["sources"] = [item.get("source")]
            continue
        existing = deduped[key]
        sources = existing.get("sources", [])
        if item.get("source") not in sources:
            sources.append(item.get("source"))
        for field, value in item.items():
            if field in ("source", "sources"):
                continue
            if not existing.get(field) and value:
                existing[field] = value

    out = {
        "counts": {
            "ticket-system": len(normalize_ticket(ticket_data)),
            "support-service": len(normalize_support(support_data)),
            "bssopenapi": len(normalize_bss(bss_data)),
            "merged_unique": len(deduped),
        },
        "products": sorted(deduped.values(), key=lambda x: (x.get("product_name") or "")),
    }

    BASE_DIR.mkdir(parents=True, exist_ok=True)
    out_file = BASE_DIR / "merged_products.json"
    out_file.write_text(json.dumps(out, ensure_ascii=False, indent=2), encoding="utf-8")

    md_lines = [
        "# Product Summary",
        "",
        f"- Ticket System: {out['counts']['ticket-system']}",
        f"- Support & Service: {out['counts']['support-service']}",
        f"- BSS OpenAPI: {out['counts']['bssopenapi']}",
        f"- Unique after dedup: {out['counts']['merged_unique']}",
        "",
        "| Product Name | Product Code | Sources |",
        "| --- | --- | --- |",
    ]
    for item in out["products"]:
        name = item.get("product_name") or ""
        code = item.get("product_code") or ""
        sources = ", ".join(item.get("sources", []))
        md_lines.append(f"| {name} | {code} | {sources} |")
    md_file = BASE_DIR / "merged_products.md"
    md_file.write_text("\n".join(md_lines), encoding="utf-8")

    print(f"Saved: {out_file}")
    print(f"Saved: {md_file}")


if __name__ == "__main__":
    main()
```
### scripts/products_from_openapi_meta.py
```python
#!/usr/bin/env python3
"""Fetch Alibaba Cloud product list from OpenAPI metadata endpoints.

Downloads:
    https://api.aliyun.com/meta/v1/products.json?language=EN_US

Optional env vars:
- OPENAPI_META_LANGUAGE (default: EN_US)
- OUTPUT_DIR (default: output)
"""
from __future__ import annotations

import json
import os
import sys
import urllib.error
import urllib.request
from pathlib import Path


def fetch_json(url: str) -> dict:
    try:
        with urllib.request.urlopen(url, timeout=60) as resp:
            payload = resp.read().decode("utf-8")
    except urllib.error.URLError as exc:
        print(f"Failed to fetch {url}: {exc}", file=sys.stderr)
        sys.exit(1)
    return json.loads(payload)


def main() -> None:
    language = os.getenv("OPENAPI_META_LANGUAGE", "EN_US")
    url = f"https://api.aliyun.com/meta/v1/products.json?language={language}"
    data = fetch_json(url)

    output_dir = Path(os.getenv("OUTPUT_DIR", "output")) / "product-scan" / "openapi-meta"
    output_dir.mkdir(parents=True, exist_ok=True)
    raw_file = output_dir / "products.json"
    raw_file.write_text(json.dumps(data, ensure_ascii=False, indent=2), encoding="utf-8")

    if isinstance(data, list):
        products = data
    else:
        products = data.get("products") or data.get("Products") or []

    normalized = []
    for product in products:
        product_code = product.get("code") or product.get("product") or product.get("Product")
        versions = product.get("versions") or product.get("Versions") or []
        normalized.append({"product_code": product_code, "versions": versions})

    normalized_file = output_dir / "products_normalized.json"
    normalized_file.write_text(
        json.dumps({"language": language, "products": normalized}, ensure_ascii=False, indent=2),
        encoding="utf-8",
    )
    print(f"Saved: {raw_file}")
    print(f"Saved: {normalized_file}")


if __name__ == "__main__":
    main()
```
### scripts/apis_from_openapi_meta.py
```python
#!/usr/bin/env python3
"""Fetch API list/details from OpenAPI metadata for product/version pairs.

Requires a products list from products_from_openapi_meta.py.

Optional env vars:
- OPENAPI_META_PRODUCTS_FILE (default: output/product-scan/openapi-meta/products_normalized.json)
- OPENAPI_META_OUTPUT_DIR (default: output/product-scan/openapi-meta/apis)
- OPENAPI_META_PRODUCTS (comma-separated product codes to include)
- OPENAPI_META_VERSIONS (comma-separated versions to include)
- OPENAPI_META_MAX_PRODUCTS (limit number of products to fetch)
- OPENAPI_META_SLEEP_SECONDS (default: 0.2)
"""
from __future__ import annotations

import json
import os
import sys
import time
import urllib.error
import urllib.request
from pathlib import Path


def fetch_json(url: str) -> dict:
    try:
        with urllib.request.urlopen(url, timeout=60) as resp:
            payload = resp.read().decode("utf-8")
    except urllib.error.HTTPError as exc:
        print(f"HTTP error {exc.code} for {url}", file=sys.stderr)
        return {}
    except urllib.error.URLError as exc:
        print(f"Failed to fetch {url}: {exc}", file=sys.stderr)
        return {}
    try:
        return json.loads(payload)
    except json.JSONDecodeError:
        print(f"Invalid JSON for {url}", file=sys.stderr)
        return {}


def parse_list(value: str | None) -> set[str]:
    if not value:
        return set()
    return {item.strip() for item in value.split(",") if item.strip()}


def main() -> None:
    products_file = Path(
        os.getenv(
            "OPENAPI_META_PRODUCTS_FILE",
            "output/product-scan/openapi-meta/products_normalized.json",
        )
    )
    if not products_file.exists():
        print(f"Missing products file: {products_file}", file=sys.stderr)
        sys.exit(1)

    output_dir = Path(
        os.getenv("OPENAPI_META_OUTPUT_DIR", "output/product-scan/openapi-meta/apis")
    )
    output_dir.mkdir(parents=True, exist_ok=True)

    include_products = parse_list(os.getenv("OPENAPI_META_PRODUCTS"))
    include_versions = parse_list(os.getenv("OPENAPI_META_VERSIONS"))
    max_products = os.getenv("OPENAPI_META_MAX_PRODUCTS")
    sleep_seconds = float(os.getenv("OPENAPI_META_SLEEP_SECONDS", "0.2"))

    data = json.loads(products_file.read_text(encoding="utf-8"))
    products = data.get("products") or []

    summary = []
    processed = 0
    for product in products:
        product_code = product.get("product_code")
        if not product_code:
            continue
        if include_products and product_code not in include_products:
            continue
        versions = product.get("versions") or []
        if include_versions:
            versions = [v for v in versions if v in include_versions]
        for version in versions:
            url = (
                "https://api.aliyun.com/meta/v1/products/"
                f"{product_code}/versions/{version}/api-docs.json"
            )
            payload = fetch_json(url)
            if not payload:
                continue
            api_count = len(payload.get("apis") or [])
            product_dir = output_dir / product_code / version
            product_dir.mkdir(parents=True, exist_ok=True)
            out_file = product_dir / "api-docs.json"
            out_file.write_text(json.dumps(payload, ensure_ascii=False, indent=2), encoding="utf-8")
            summary.append(
                {
                    "product_code": product_code,
                    "version": version,
                    "api_count": api_count,
                    "api_docs_path": str(out_file),
                }
            )
            time.sleep(sleep_seconds)
        processed += 1
        if max_products and processed >= int(max_products):
            break

    summary_file = output_dir / "summary.json"
    summary_file.write_text(json.dumps({"items": summary}, ensure_ascii=False, indent=2), encoding="utf-8")
    print(f"Saved: {summary_file}")


if __name__ == "__main__":
    main()
```
### scripts/join_products_with_api_meta.py
```python
#!/usr/bin/env python3
"""Join merged product list with OpenAPI meta API counts."""
from __future__ import annotations

import json
import os
import sys
from pathlib import Path


def main() -> None:
    base_dir = Path(os.getenv("OUTPUT_DIR", "output")) / "product-scan"
    merged_file = base_dir / "merged_products.json"
    api_summary_file = base_dir / "openapi-meta" / "apis" / "summary.json"
    if not merged_file.exists():
        print("Missing merged_products.json. Run merge_product_sources.py first.", file=sys.stderr)
        sys.exit(1)
    if not api_summary_file.exists():
        print("Missing api summary. Run apis_from_openapi_meta.py first.", file=sys.stderr)
        sys.exit(1)

    merged = json.loads(merged_file.read_text(encoding="utf-8"))
    summary = json.loads(api_summary_file.read_text(encoding="utf-8"))

    api_counts: dict[str, int] = {}
    for item in summary.get("items") or []:
        key = item.get("product_code")
        if not key:
            continue
        api_counts.setdefault(key, 0)
        api_counts[key] += int(item.get("api_count") or 0)

    items = []
    for product in merged.get("products") or []:
        code = product.get("product_code") or ""
        entry = dict(product)
        entry["api_count"] = api_counts.get(code, 0)
        items.append(entry)

    out_file = base_dir / "products_with_api_counts.json"
    out_file.write_text(
        json.dumps({"items": items, "api_counts": api_counts}, ensure_ascii=False, indent=2),
        encoding="utf-8",
    )

    md_lines = [
        "# Products + API Counts",
        "",
        "| Product Name | Product Code | API Count | Sources |",
        "| --- | --- | --- | --- |",
    ]
    for item in items:
        name = item.get("product_name") or ""
        code = item.get("product_code") or ""
        count = item.get("api_count") or 0
        sources = ", ".join(item.get("sources") or [])
        md_lines.append(f"| {name} | {code} | {count} | {sources} |")
    md_file = base_dir / "products_with_api_counts.md"
    md_file.write_text("\n".join(md_lines), encoding="utf-8")

    print(f"Saved: {out_file}")
    print(f"Saved: {md_file}")


if __name__ == "__main__":
    main()
```
### scripts/summarize_openapi_meta_products.py
```python
#!/usr/bin/env python3
"""Summarize OpenAPI meta products by category/group."""
from __future__ import annotations

import json
import os
from collections import defaultdict
from pathlib import Path


def main() -> None:
    base_dir = Path(os.getenv("OUTPUT_DIR", "output")) / "product-scan" / "openapi-meta"
    products_file = base_dir / "products.json"
    if not products_file.exists():
        raise SystemExit("Missing products.json. Run products_from_openapi_meta.py first.")

    data = json.loads(products_file.read_text(encoding="utf-8"))
    products = data if isinstance(data, list) else data.get("products") or data.get("Products") or []

    by_category2 = defaultdict(list)
    by_category1 = defaultdict(list)
    by_group = defaultdict(list)
    for product in products:
        entry = {"code": product.get("code") or "", "name": product.get("name") or ""}
        by_category2[product.get("category2Name") or "Uncategorized"].append(entry)
        by_category1[product.get("categoryName") or "Uncategorized"].append(entry)
        by_group[product.get("group") or "Uncategorized"].append(entry)

    def write_section(title: str, mapping: dict[str, list[dict]], lines: list[str]) -> None:
        lines.append(f"## {title}")
        lines.append("")
        for key in sorted(mapping.keys()):
            lines.append(f"### {key} ({len(mapping[key])})")
            lines.append("")
            lines.append("| Product Code | Product Name |")
            lines.append("| --- | --- |")
            for item in sorted(mapping[key], key=lambda x: x.get("code") or ""):
                lines.append(f"| {item.get('code', '')} | {item.get('name', '')} |")
            lines.append("")

    md_lines = ["# OpenAPI Product Category Summary", "", f"Total products: {len(products)}", ""]
    write_section("Level-2 category (category2Name)", by_category2, md_lines)
    write_section("Level-1 category (categoryName)", by_category1, md_lines)
    write_section("Group / business line (group)", by_group, md_lines)

    out_file = base_dir / "products_summary.md"
    out_file.write_text("\n".join(md_lines), encoding="utf-8")

    json_file = base_dir / "products_summary.json"
    json_file.write_text(
        json.dumps(
            {
                "total": len(products),
                "by_category2": by_category2,
                "by_category1": by_category1,
                "by_group": by_group,
            },
            ensure_ascii=False,
            indent=2,
        ),
        encoding="utf-8",
    )
    print(f"Saved: {out_file}")
    print(f"Saved: {json_file}")


if __name__ == "__main__":
    main()
```
### scripts/analyze_products_vs_skills.py
```python
#!/usr/bin/env python3
"""Compare merged product list with existing skills."""
from __future__ import annotations

import json
import os
from pathlib import Path

BASE_DIR = Path(os.getenv("OUTPUT_DIR", "output")) / "product-scan"
SKILLS_DIR = Path("skills")


def read_skill_text(path: Path) -> str:
    try:
        content = path.read_text(encoding="utf-8")
    except Exception:
        return ""
    return content.lower()


def load_skills() -> list[dict]:
    skills = []
    for skill_file in SKILLS_DIR.rglob("SKILL.md"):
        skills.append(
            {
                "path": str(skill_file.relative_to(SKILLS_DIR)),
                "text": read_skill_text(skill_file),
            }
        )
    return skills


def product_matches_skills(product: dict, skills: list[dict]) -> list[str]:
    name = (product.get("product_name") or "").strip().lower()
    code = (product.get("product_code") or "").strip().lower()
    tokens = []
    if name and len(name) >= 2:
        tokens.append(name)
    if code and len(code) >= 2:
        tokens.append(code)
    matched = []
    for skill in skills:
        if any(token in skill["text"] for token in tokens):
            matched.append(skill["path"])
    return matched


def main() -> None:
    merged_file = BASE_DIR / "merged_products.json"
    if not merged_file.exists():
        raise SystemExit("Missing merged_products.json. Run merge_product_sources.py first.")

    merged = json.loads(merged_file.read_text(encoding="utf-8"))
    products = merged.get("products") or []
    skills = load_skills()

    results = []
    uncovered = []
    for product in products:
        matches = product_matches_skills(product, skills)
        entry = {
            "product_name": product.get("product_name"),
            "product_code": product.get("product_code"),
            "sources": product.get("sources"),
            "matched_skills": matches,
        }
        results.append(entry)
        if not matches:
            uncovered.append(entry)

    out = {
        "total_products": len(products),
        "covered": len(products) - len(uncovered),
        "uncovered": len(uncovered),
        "items": results,
        "uncovered_items": uncovered,
    }
    out_file = BASE_DIR / "skill_gap.json"
    out_file.write_text(json.dumps(out, ensure_ascii=False, indent=2), encoding="utf-8")

    md_lines = [
        "# Product Coverage Analysis",
        "",
        f"- Total products: {out['total_products']}",
        f"- Covered: {out['covered']}",
        f"- Uncovered: {out['uncovered']}",
        "",
        "## Uncovered Products",
        "",
        "| Product Name | Product Code | Sources |",
        "| --- | --- | --- |",
    ]
    for item in uncovered:
        name = item.get("product_name") or ""
        code = item.get("product_code") or ""
        sources = ", ".join(item.get("sources") or [])
        md_lines.append(f"| {name} | {code} | {sources} |")
    md_file = BASE_DIR / "skill_gap.md"
    md_file.write_text("\n".join(md_lines), encoding="utf-8")

    print(f"Saved: {out_file}")
    print(f"Saved: {md_file}")


if __name__ == "__main__":
    main()
```
### references/product-sources.md
```markdown
# Product Source APIs
Keep this file short. Update versions and endpoints from the official docs when they change.
## Ticket System - ListProducts
- Action: `ListProducts`
- Required: endpoint, version, access key, secret
- Optional: `Name`, `Language`
Script: `skills/platform/openapi/alicloud-platform-openapi-product-api-discovery/scripts/products_from_ticket_system.py`
## Support & Service - ListProductByGroup
- Action: `ListProductByGroup`
- Required: `OpenGroupId`, endpoint, version, access key, secret
Script: `skills/platform/openapi/alicloud-platform-openapi-product-api-discovery/scripts/products_from_support_service.py`
## BSS OpenAPI - QueryProductList
- Action: `QueryProductList`
- Endpoint: `business.aliyuncs.com`
- Version: `2017-12-14`
- Params: `PageNum`, `PageSize`, `QueryTotalCount`
Script: `skills/platform/openapi/alicloud-platform-openapi-product-api-discovery/scripts/products_from_bssopenapi.py`
```
### references/openapi-meta.md
````markdown
# OpenAPI Metadata Endpoints
Use these metadata endpoints to fetch product, version, and API information.
## Product list
```text
https://api.aliyun.com/meta/v1/products.json?language=EN_US
```
- `language` can be `EN_US` or `ZH_CN`.
- Response contains product codes and versions.
Script: `skills/platform/openapi/alicloud-platform-openapi-product-api-discovery/scripts/products_from_openapi_meta.py`
## API list per product version
```text
https://api.aliyun.com/meta/v1/products/{ProductCode}/versions/{Version}/api-docs.json
```
- Response contains API names and metadata for the product/version.
Script: `skills/platform/openapi/alicloud-platform-openapi-product-api-discovery/scripts/apis_from_openapi_meta.py`
## API definition (single API)
```text
https://api.aliyun.com/meta/v1/products/{ProductCode}/versions/{Version}/apis/{ApiName}/api.json
```
Use this for deep inspection of request/response schemas.
````
---
## Skill Companion Files
> Additional files collected from the skill directory layout.
### _meta.json
```json
{
  "owner": "cinience",
  "slug": "alicloud-platform-openapi-product-api-discovery",
  "displayName": "Alicloud Platform Openapi Product Api Discovery",
  "latest": {
    "version": "1.0.2",
    "publishedAt": 1773222025206,
    "commit": "https://github.com/openclaw/skills/commit/10bb06dbf33d2e52fca050b81b65d7b7fd927f52"
  },
  "history": [
    {
      "version": "1.0.1",
      "publishedAt": 1770768572500,
      "commit": "https://github.com/openclaw/skills/commit/7b19d32002a58fcdd24264add07469636e4178cd"
    }
  ]
}
```