MCP Server

Pipe2.ai exposes a Model Context Protocol (MCP) server that lets AI assistants — like Claude Desktop, Cursor, ChatGPT, and others — interact with your pipelines directly through natural language.

What Can You Do?

With the Pipe2.ai MCP server, AI assistants can:

Run Pipelines

Generate AI videos and images by describing what you want in natural language. The assistant handles file uploads, pipeline selection, and parameter configuration.
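Under the hood, a request like this becomes an MCP `tools/call` message to the server. A minimal sketch of such a message, assuming a hypothetical auto-generated tool name and arguments (MCP messages are JSON-RPC 2.0; the specific tool and parameters here are illustrative, not taken from Pipe2.ai):

```python
import json

# Hypothetical "tools/call" request an assistant might issue for a
# text-to-image pipeline. The tool name and arguments are illustrative;
# real tool names come from the pipelines active in your account.
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "run_text_to_image",                    # one tool per active pipeline
        "arguments": {"prompt": "a red fox in watercolor"},
    },
}

print(json.dumps(call_request, indent=2))
```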

Check Status

Monitor pipeline runs with real-time status updates. Get notified when your generation completes or if something goes wrong.

Manage Assets

Browse, filter, and retrieve your generated assets — images, videos, and audio files — all through conversation.

Track Credits

Check your credit balance and transaction history without leaving your AI assistant.

Architecture

The MCP server runs as a dedicated service and communicates with Pipe2.ai’s backend:

AI Assistant (Claude, Cursor, etc.)
↓ MCP Protocol (HTTP or SSE)
Pipe2.ai MCP Server
↓
Pipe2.ai Backend API

Transport modes:

  • Streamable HTTP (/mcp) — primary transport, recommended for most clients
  • SSE (/sse/) — legacy transport for older MCP clients
  • Stdio (--stdio flag) — for CLI integrations like claude mcp add
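Whichever transport you pick, the messages on the wire are JSON-RPC 2.0. A sketch of the `initialize` request a client sends first when connecting over Streamable HTTP (the `/mcp` path is from the list above; the protocol version shown is one published MCP spec revision and may differ for your client):

```python
import json

# JSON-RPC 2.0 "initialize" request -- the first message an MCP client
# POSTs to the /mcp endpoint over the Streamable HTTP transport.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",  # an MCP spec revision; clients negotiate this
        "capabilities": {},               # this client advertises no optional features
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

body = json.dumps(initialize_request)
print(body)
```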

Key Features

  • Dynamic pipeline tools — one tool per active pipeline, auto-generated from the database schema
  • Real-time notifications — WebSocket subscriptions push pipeline status updates to your assistant
  • OAuth 2.1 authentication — secure OAuth flows, with personal access tokens supported as a simpler alternative
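When using a personal access token, clients typically attach it as a standard Bearer header on each request. A minimal sketch, assuming a placeholder server URL and token (both are hypothetical; substitute your deployment's endpoint and a real token):

```python
import json
import urllib.request

SERVER_URL = "https://example.com/mcp"  # placeholder endpoint, not a real deployment
TOKEN = "pat_xxxxxxxx"                  # placeholder personal access token

# JSON-RPC 2.0 request asking the server which pipeline tools it exposes.
payload = json.dumps({
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
    "params": {},
}).encode()

request = urllib.request.Request(
    SERVER_URL,
    data=payload,
    headers={
        "Content-Type": "application/json",
        # Streamable HTTP responses may arrive as plain JSON or as an SSE stream.
        "Accept": "application/json, text/event-stream",
        "Authorization": f"Bearer {TOKEN}",
    },
)
# urllib.request.urlopen(request) would send it; omitted here since the URL is a placeholder.
```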

Next Steps