Building with AI Agents

This guide is for developers who want to integrate AI agents or LLMs with the Blink API. It covers agent discovery, the machine-readable schema formats you can feed to your agent, and practical integration patterns.

For AI agents

If you are an AI agent, skip this page and follow the AI Agent API Playbook directly.

Agent Discovery with llms.txt

The site publishes an llms.txt file at its root, a lightweight, structured file that gives any agent the endpoints, canonical source URLs, and hard rules in a single fetch:

curl -s https://dev.blink.sv/llms.txt

Point your agent at this URL first. It contains everything needed to bootstrap: production/staging GraphQL endpoints, auth header name, prioritized source list, and safety rules.
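As an illustration, a bootstrap step could pull the source list out of llms.txt by scanning for markdown-style links. The sample content below is made up for the example; only the URLs that also appear elsewhere in this guide are real.

```python
import re

def extract_links(llms_txt: str) -> dict[str, str]:
    """Collect [title](url) markdown links from an llms.txt-style file."""
    return {title: url for title, url in re.findall(r"\[([^\]]+)\]\((\S+?)\)", llms_txt)}

# Illustrative sample; the real file's layout and contents may differ.
sample = """# Blink API
## Sources
- [AI Agent API Playbook](https://dev.blink.sv/api/agent-playbook)
- [LLM-friendly schema](https://dev.blink.sv/reference/graphql-api-for-llm.json)
"""
links = extract_links(sample)
```

The resulting dict maps source titles to URLs, which an agent can then fetch in priority order.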

Machine-Readable Schema Downloads

We provide the Blink GraphQL API schema in formats optimized for LLM consumption:

Enhanced LLM-friendly JSON (graphql-api-for-llm.json): best for most LLM apps and code generation.

OpenAPI specification (graphql-openapi.json): best for function calling and automated integrations.
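As a small sketch, the two downloads can be wired into code. The file names come from the list above; the /reference/ base path matches the curl example for the LLM-friendly JSON later in this guide, and is an assumption for the OpenAPI file.

```python
# File names from the list above. The /reference/ base path is confirmed for
# the LLM-friendly JSON; the same directory is assumed for the OpenAPI file.
SCHEMA_FILES = {
    "llm-json": "graphql-api-for-llm.json",  # most LLM apps, code generation
    "openapi": "graphql-openapi.json",       # function calling, automated integrations
}
BASE_URL = "https://dev.blink.sv/reference/"

def schema_url(kind: str) -> str:
    """Return the download URL for a schema format key."""
    return BASE_URL + SCHEMA_FILES[kind]
```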

Integration Patterns

Pattern 1: System prompt + schema (framework-agnostic)

The simplest approach — fetch the schema and playbook, then include them in the system prompt of any LLM:

# Fetch the schema and playbook once
curl -s https://dev.blink.sv/reference/graphql-api-for-llm.json -o blink-schema.json
curl -s https://dev.blink.sv/api/agent-playbook -o playbook.md

Then in your system prompt:

You are a Blink API assistant. Follow the playbook rules below strictly.

<playbook>
{contents of playbook.md}
</playbook>

<api_schema>
{contents of blink-schema.json}
</api_schema>

This works with any LLM provider — OpenAI, Anthropic, open-source models, etc.
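Assuming the two curl downloads above have run, the prompt can also be assembled programmatically. This sketch simply wraps the file contents in the tags shown:

```python
from pathlib import Path

def assemble_prompt(playbook_path: str, schema_path: str) -> str:
    """Wrap the downloaded playbook and schema in the prompt tags shown above."""
    playbook = Path(playbook_path).read_text()
    schema = Path(schema_path).read_text()
    return (
        "You are a Blink API assistant. Follow the playbook rules below strictly.\n\n"
        f"<playbook>\n{playbook}\n</playbook>\n\n"
        f"<api_schema>\n{schema}\n</api_schema>\n"
    )
```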

Pattern 2: Python with httpx

import httpx
import json

# Fetch schema and playbook at startup
schema = httpx.get("https://dev.blink.sv/reference/graphql-api-for-llm.json").json()
playbook = httpx.get("https://dev.blink.sv/llms.txt").text

system_prompt = f"""
You are a Blink API assistant.
Follow these rules:\n{playbook}

API schema:\n{json.dumps(schema, indent=2)}
"""

# Use with any LLM client
# client.chat(system=system_prompt, messages=[...])
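One practical wrinkle not covered above: a full GraphQL schema dump can exceed a model's context window. A hedged sketch of a guard, with an arbitrary character budget chosen for illustration:

```python
import json

def build_system_prompt(playbook: str, schema: dict, max_schema_chars: int = 200_000) -> str:
    """Assemble the system prompt, truncating the schema dump if it is too large."""
    schema_text = json.dumps(schema, indent=2)
    if len(schema_text) > max_schema_chars:
        schema_text = schema_text[:max_schema_chars] + "\n... (truncated)"
    return (
        "You are a Blink API assistant.\n"
        f"Follow these rules:\n{playbook}\n\n"
        f"API schema:\n{schema_text}\n"
    )
```

A character budget is a crude proxy for tokens; a tokenizer-based limit would be more precise if your LLM client exposes one.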

Pattern 3: Node.js

const schema = await fetch('https://dev.blink.sv/reference/graphql-api-for-llm.json').then(r => r.json());
const playbook = await fetch('https://dev.blink.sv/llms.txt').then(r => r.text());

const systemPrompt = `
You are a Blink API assistant.
Follow these rules:
${playbook}

API schema:
${JSON.stringify(schema, null, 2)}
`;

// Use with any LLM SDK
// e.g. OpenAI, Anthropic, etc.

Generating Updated Schemas

The API reference files are automatically updated when the GraphQL schema changes. To generate them manually:

./scripts/generate-api-reference-combined.sh

Additional Resources