# AI Agents (Markdown API)
Ping documentation is available in raw Markdown format for AI coding assistants like Claude Code, Cursor, Windsurf, and other LLM-powered tools. This enables AI agents to fetch up-to-date documentation directly.
## Fetching Documentation
Request markdown from the API endpoint:
```bash
curl https://getping.pro/api/docs
```

This returns the combined markdown documentation.
## Claude Code
Add Ping documentation to your Claude Code context:
```bash
# Read the docs directly into context
curl -s https://getping.pro/api/docs | claude --read
```

Or add to your project's custom instructions in `.claude/CLAUDE.md`:
```markdown
## Ping Integration

When working with Ping webhooks, fetch the latest documentation:

- URL: https://getping.pro/api/docs
```

## Cursor
Add Ping as a documentation source in Cursor settings:

- Open Cursor Settings > Features > Docs
- Click Add new doc
- Enter the URL: https://getping.pro/api/docs
## Windsurf & Other AI Editors
Most AI-powered editors support adding documentation URLs. Configure them to fetch https://getping.pro/api/docs for raw markdown access.
## MCP Servers
If your AI tool supports MCP (Model Context Protocol) servers, you can configure a fetch tool to retrieve the documentation:
```json
{
  "url": "https://getping.pro/api/docs"
}
```

## Why Markdown?
- Smaller payload - Raw markdown is more compact than HTML
- Better context - AI models understand markdown structure natively
- Up-to-date - Always fetches the latest documentation
- Token efficient - No HTML boilerplate consuming context tokens
## Using with AI Agents
When building AI agents that need to send notifications:
```python
import requests

# Fetch current Ping docs for context
docs = requests.get('https://getping.pro/api/docs').text

# Pass docs to your LLM for context;
# the LLM can now help generate correct Ping webhooks
```

This is especially useful for:
- Building agents that notify users of progress
- Creating workflows that need human approval
- Developing tools that integrate with Ping
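The snippet above stops at fetching the docs. As a minimal sketch of the next step, the helper below packages them as LLM context; `build_system_prompt` and its character limit are illustrative choices, not part of Ping's API.

```python
def build_system_prompt(docs: str, max_chars: int = 20000) -> str:
    """Wrap fetched Ping docs in a system prompt for an LLM.

    max_chars is an arbitrary example limit so very long docs
    do not crowd out the rest of the context window.
    """
    snippet = docs[:max_chars]
    return (
        "You are helping integrate Ping webhooks.\n"
        "Current Ping documentation follows:\n\n" + snippet
    )

# In a real agent, docs would come from the endpoint:
#   docs = requests.get('https://getping.pro/api/docs').text
prompt = build_system_prompt("# Ping\nSend a webhook to notify a user.")
```

The resulting prompt can then be passed to whichever LLM client the agent uses.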