Ping

AI Agents (Markdown API)

Ping documentation is available in raw Markdown format for AI coding assistants like Claude Code, Cursor, Windsurf, and other LLM-powered tools. This enables AI agents to fetch up-to-date documentation directly.


Fetching Documentation

Request markdown from the API endpoint:

Terminal
curl https://getping.pro/api/docs

This returns the combined markdown documentation.

Claude Code

Add Ping documentation to your Claude Code context:

Terminal
# Pipe the docs into a Claude Code prompt as context (print mode)
curl -s https://getping.pro/api/docs | claude -p "Use this Ping documentation as reference"

Or add to your project's custom instructions in .claude/CLAUDE.md:

## Ping Integration

When working with Ping webhooks, fetch the latest documentation:
- URL: https://getping.pro/api/docs

Cursor

Add as a documentation source in Cursor settings:

  1. Open Cursor Settings > Features > Docs
  2. Click Add new doc
  3. Enter URL: https://getping.pro/api/docs

Windsurf & Other AI Editors

Most AI-powered editors support adding documentation URLs. Configure them to fetch https://getping.pro/api/docs for raw markdown access.

MCP Servers

If your AI tool supports MCP (Model Context Protocol) servers, you can configure a fetch tool and pass it the documentation URL as its argument:

JSON
{
  "url": "https://getping.pro/api/docs"
}
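As a concrete sketch, a client could use the reference fetch server from the Model Context Protocol servers repository. The exact configuration format varies by tool; the `mcpServers` key and `uvx` launcher below follow common MCP client conventions, so check your tool's documentation for its specific schema:

JSON
{
  "mcpServers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  }
}

The agent can then call the fetch tool with the documentation URL above to pull the markdown into context.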

Why Markdown?

  • Smaller payload - Raw markdown is more compact than HTML
  • Better context - AI models understand markdown structure natively
  • Up-to-date - Always fetches the latest documentation
  • Token efficient - No HTML boilerplate consuming context tokens

Using with AI Agents

When building AI agents that need to send notifications:

import requests

# Fetch current Ping docs for context
response = requests.get('https://getping.pro/api/docs', timeout=10)
response.raise_for_status()
docs = response.text

# Pass docs to your LLM as context
# The LLM can now help generate correct Ping webhooks

This is especially useful for:

  • Building agents that notify users of progress
  • Creating workflows that need human approval
  • Developing tools that integrate with Ping
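A minimal sketch of the first case, an agent notifying a user of progress, might look like this. The webhook URL and payload field names here are hypothetical placeholders, not Ping's actual API; have your agent consult the fetched docs for the real schema:

```python
import requests

# Hypothetical webhook URL -- replace with the one from your Ping dashboard
PING_WEBHOOK_URL = "https://getping.pro/api/webhooks/example"

def build_notification(title: str, message: str) -> dict:
    """Build a notification payload (field names are illustrative, not Ping's schema)."""
    return {"title": title, "message": message}

def notify(title: str, message: str) -> None:
    """POST the payload as JSON; errors raise so the agent can retry."""
    payload = build_notification(title, message)
    response = requests.post(PING_WEBHOOK_URL, json=payload, timeout=10)
    response.raise_for_status()

# Example: notify the user when a long-running task finishes
# notify("Build complete", "All tests passed")
```

Keeping payload construction separate from the HTTP call makes the agent's notification logic easy to unit-test without hitting the network.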