
How to Expose a GraphQL API to ChatGPT Using MCPify

GraphQL engineers: make ChatGPT (GPT-5) or Claude query and mutate your GraphQL backend safely—without custom plugins or brittle glue code. Complete step-by-step guide.

Herman Sjøberg
AI Integration Expert
August 23, 2025 · 10 min read
GraphQL · ChatGPT · GPT-5 · Claude · API Integration · MCP

Key Takeaways

  • Expose GraphQL endpoints to AI models with zero code
  • Support for SDL and introspection schemas
  • Automatic field-level selection to minimize tokens
  • Type-safe queries and mutations
  • Works with ChatGPT, Claude, and any MCP-compatible AI


Who this is for: Backend/GraphQL engineers who want ChatGPT (GPT‑5) or Claude to query and mutate their GraphQL backend safely — without writing custom plugins or brittle glue code.

This step‑by‑step tutorial shows how to take a GraphQL endpoint and make it AI‑ready using MCPify. You’ll provide your schema (SDL or introspection), let MCPify generate typed tools the LLM can call, and then connect those tools to ChatGPT/GPT‑5 or Claude via the Model Context Protocol (MCP).


What you’ll build

We’ll MCPify a fictional ProjectBoard GraphQL API so an LLM can:

  • Query tasks with filters (tasks(status, limit))
  • Fetch a task by ID (task(id))
  • Create a task via mutation (createTask(input))
  • Only fetch exact fields needed to minimize tokens

At the end, you’ll have a shareable MCP endpoint in MCPify and a minimal client snippet that lets GPT‑5 use your GraphQL as a tool.


Prerequisites

  • A GraphQL endpoint (e.g., https://api.projectboard.local/graphql)
  • Schema available as SDL (schema.graphql) or introspection JSON
  • Credentials if required (API key, OAuth token, headers)
  • MCPify account: https://mcpify.org/auth/register
  • (Optional) OpenAI/Anthropic access to test from GPT‑5/Claude

Step 1 — Export your GraphQL schema (SDL or Introspection)

MCPify can ingest SDL (schema.graphql) or introspection JSON.

Option A: SDL snippet (schema.graphql)

schema {
  query: Query
  mutation: Mutation
}

type Query {
  tasks(status: TaskStatus, limit: Int = 10): [Task!]!
  task(id: ID!): Task
}

type Mutation {
  createTask(input: CreateTaskInput!): Task!
}

enum TaskStatus { OPEN IN_PROGRESS DONE }

type Task {
  id: ID!
  title: String!
  status: TaskStatus!
  due: String
  assignee: User
  createdAt: String!
}

type User { id: ID! name: String! }

input CreateTaskInput {
  title: String!
  assigneeId: ID
  due: String
}

Option B: Introspection JSON

Use whichever tool you prefer to export the schema:

  • npx get-graphql-schema https://api.projectboard.local/graphql > schema.graphql (SDL)
  • npx graphql-inspector introspect https://api.projectboard.local/graphql > schema.json (JSON)
  • Apollo: rover graph introspect https://api.projectboard.local/graphql > schema.graphql (Rover replaces the deprecated apollo client:download-schema)

Tip: For private endpoints, add --header "Authorization: Bearer <TOKEN>" as needed.
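If you'd rather not install a CLI, introspection is just a POST request. A minimal stdlib-only sketch of building that request (the query here is a trimmed-down introspection document, not the full spec query most tools send):

```python
import json

# Minimal introspection document -- enough to recover type and field names.
# Most tools (and MCPify) also accept the full spec introspection query.
INTROSPECTION_QUERY = """
query IntrospectSchema {
  __schema {
    queryType { name }
    mutationType { name }
    types { name kind }
  }
}
"""

def build_introspection_request(token=None):
    """Build headers + JSON body for POSTing an introspection query."""
    headers = {"Content-Type": "application/json"}
    if token:  # private endpoints usually expect a bearer token
        headers["Authorization"] = "Bearer " + token
    body = json.dumps({"query": INTROSPECTION_QUERY, "variables": {}})
    return {"headers": headers, "body": body}
```

POST the resulting body to your /graphql endpoint (curl, urllib, requests) and save the JSON response as schema.json.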


Step 2 — Describe the endpoint & auth (MCPify config)

You can point MCPify at your GraphQL endpoint and provide schema + auth in a small JSON config.

{
  "service": "projectboard-graphql",
  "type": "graphql",
  "endpoint": "https://api.projectboard.local/graphql",
  "headers": {
    "X-Org-ID": "acme"
  },
  "auth": {
    "type": "bearer",
    "token": "{{PROJECTBOARD_TOKEN}}"
  },
  "schema": {
    "sdl": "<<< paste the SDL from schema.graphql here >>>"
    /* or: "introspection": { ... } */
  },
  "operations": [
    {
      "name": "ListTasks",
      "query": "query ListTasks($status: TaskStatus, $limit: Int = 10) { tasks(status: $status, limit: $limit) { id title status due assignee { id name } } }"
    },
    {
      "name": "GetTask",
      "query": "query GetTask($id: ID!) { task(id: $id) { id title status due assignee { name } } }"
    },
    {
      "name": "CreateTask",
      "mutation": "mutation CreateTask($input: CreateTaskInput!) { createTask(input: $input) { id title status } }"
    }
  ]
}

Why declare operations? MCPify will still expose ad‑hoc GraphQL capability from the schema, but declaring common operations gives the LLM named, documented tools with examples — boosting first‑call success and reducing prompt tokens.
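The typing MCPify derives from the schema can be mirrored client-side. An illustrative sketch (not MCPify's actual code) of validating ListTasks variables against the enum and default from the Step 1 SDL:

```python
# Illustrative mirror of the typed variables derived from the SDL above --
# not MCPify's implementation. TaskStatus and the limit default come from
# the schema in Step 1.
TASK_STATUS = {"OPEN", "IN_PROGRESS", "DONE"}

def validate_list_tasks_vars(variables):
    """Validate ListTasks variables and apply the schema's defaults."""
    status = variables.get("status")
    if status is not None and status not in TASK_STATUS:
        raise ValueError("status must be one of %s" % sorted(TASK_STATUS))
    limit = variables.get("limit", 10)  # Int = 10 default in the SDL
    if not isinstance(limit, int) or limit < 1:
        raise ValueError("limit must be a positive Int")
    return {"status": status, "limit": limit}
```

This is exactly the kind of guardrail a declared, typed operation gives the model for free: invalid enum strings fail before they ever hit your backend.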


Step 3 — Upload to MCPify

  1. Sign in (or register): https://mcpify.org/auth/register
  2. New Service → Add GraphQL (from SDL or Introspection)
  3. Paste/upload your schema and config, then click Deploy
  4. Configure Auth (API key, Bearer token, OAuth) in the service’s Credentials tab

MCPify generates an MCP‑compliant toolset for each declared operation and exposes generic GraphQL query/mutation capabilities with strict typing from your schema.


Step 4 — Inspect the generated tools (typed & documented)

In Tools, you’ll see entries like:

  • projectboard.ListTasks(status?, limit?) → [Task]
  • projectboard.GetTask(id: ID!) → Task
  • projectboard.CreateTask(input: CreateTaskInput!) → Task
  • projectboard.graphql.query(document, variables?)
  • projectboard.graphql.mutation(document, variables?)

Each tool includes:

  • Human‑readable description and examples
  • Typed variables (enums, non‑null, defaults)
  • Field‑level selection guidance to keep responses small
  • Pagination hints (if your schema exposes cursors/limits)

How MCPify helps LLMs: It uses the schema to generate rich metadata (types, args, shapes). The model knows exactly what variables are allowed and which fields are returned, so it can craft precise queries and avoid trial‑and‑error.
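Under the hood, MCP tools advertise their inputs as JSON Schema. A hypothetical sketch of what a generated CreateTask definition could look like (field names mirror the SDL; the exact metadata MCPify emits may differ):

```python
# Hypothetical tool definition -- MCP tools describe inputs as JSON Schema;
# the exact shape MCPify generates may differ in detail.
create_task_tool = {
    "name": "projectboard.CreateTask",
    "description": "Create a task; returns id, title, status.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "input": {
                "type": "object",
                "properties": {
                    "title": {"type": "string"},       # title: String!
                    "assigneeId": {"type": "string"},  # assigneeId: ID (optional)
                    "due": {"type": "string"},         # due: String (optional)
                },
                "required": ["title"],  # only non-null SDL fields are required
            }
        },
        "required": ["input"],
    },
}
```

Because non-null (!) SDL fields map to required JSON Schema properties, the model cannot omit title but is free to skip assigneeId and due.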


Step 5 — Connect from ChatGPT/GPT‑5 (minimal example)

Below is a minimal Python example registering your MCPify service as a tool. Replace the URL with your Service MCP Endpoint from the MCPify dashboard.

from openai import OpenAI

client = OpenAI()

# Remote MCP servers attach via the Responses API's "mcp" tool type.
resp = client.responses.create(
    model="gpt-5",
    input="List 5 OPEN tasks assigned to Alex with due dates.",
    tools=[
        {
            "type": "mcp",
            "server_label": "projectboard",
            "server_url": "<YOUR_MCPIFY_SERVICE_URL>",
            "require_approval": "never"
        }
    ]
)

print(resp.output_text)

Also works with Claude (MCP servers are first‑class there) and any agent framework that supports MCP tools.
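For Claude Desktop, one common pattern is bridging a remote MCP endpoint through the mcp-remote adapter. An illustrative claude_desktop_config.json entry (server name and URL are placeholders for your own values):

```json
{
  "mcpServers": {
    "projectboard": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "<YOUR_MCPIFY_SERVICE_URL>"]
    }
  }
}
```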


Step 6 — Try helpful prompts

  • “Query ProjectBoard for my 10 most urgent OPEN tasks; include title, assignee.name, due.”
  • “Create a task ‘Ship v1.4 release notes’ assigned to user ID u_42, due next Friday.”
  • “Fetch task T-9321 and summarize blockers in one paragraph.”
  • “Show tasks by status with counts; only fetch id and status to keep tokens low.”

You’ll see the LLM compose a typed query/mutation, select minimal fields, and pass variables correctly — guided by MCPify’s metadata.


Performance & cost tips (GraphQL‑specific)

  • Select only fields you need. GraphQL makes this easy; smaller responses = fewer tokens.
  • Use fragments. Reuse common field sets to keep queries short and consistent.
  • Prefer variables over inline args. Clearer, more cache‑friendly, easier to retry.
  • Page through large lists. Expose first/after or limit/offset and let the model iterate.
  • Cache high‑read queries. MCPify’s response cache cuts latency and token spend.
  • Validate enums. Enforce TaskStatus etc. in tool schemas so the model won’t guess strings.
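The paging tip in practice: let the model (or your agent loop) walk cursors until the server reports no next page. A sketch with a stubbed fetch, standing in for the real GraphQL/MCP call:

```python
# Cursor-style paging sketch. fetch_page is a stand-in for the real
# GraphQL/MCP call; here it fakes two pages of task IDs.
def fetch_page(after=None, first=2):
    pages = {None: (["T-1", "T-2"], "c2"), "c2": (["T-3"], None)}
    ids, next_cursor = pages[after]
    return {"ids": ids[:first], "endCursor": next_cursor}

def all_task_ids():
    """Accumulate IDs page by page until endCursor is exhausted."""
    ids, cursor = [], None
    while True:
        page = fetch_page(after=cursor)
        ids.extend(page["ids"])
        cursor = page["endCursor"]
        if cursor is None:  # no next page -> done
            return ids

print(all_task_ids())  # -> ['T-1', 'T-2', 'T-3']
```

Small pages keep each tool response cheap in tokens, and the loop bounds how much the model ever sees at once.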

Troubleshooting

  • 401/403: Check Bearer/API‑Key, headers, and scopes in MCPify Credentials.
  • GraphQL errors: Inspect the errors array; verify variable names/types match schema.
  • Timeouts/large payloads: Reduce field selection, paginate, or add filters.
  • CORS/local dev: When testing locally, use server‑side calls or a proxy; MCPify itself calls your backend server‑to‑server.
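GraphQL reports failures in-band via the errors array (often alongside HTTP 200), so surface them explicitly. A small helper using the spec's standard message and path fields:

```python
def graphql_errors(response):
    """Summarize a GraphQL 'errors' array (spec fields: message, path)."""
    summaries = []
    for err in response.get("errors", []):
        path = ".".join(str(p) for p in err.get("path", [])) or "<request>"
        summaries.append(path + ": " + err.get("message", "unknown error"))
    return summaries

resp = {"data": None, "errors": [{"message": "Unknown argument 'stat'", "path": ["tasks"]}]}
print(graphql_errors(resp))  # -> ["tasks: Unknown argument 'stat'"]
```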

Who This Article Is For

Backend and GraphQL engineers wanting to integrate GraphQL APIs with AI assistants

About the Author

Herman Sjøberg

AI Integration Expert

Herman excels at assisting businesses in generating value through AI adoption. With expertise in cloud architecture (Azure Solutions Architect Expert), DevOps, and machine learning, he's passionate about making AI integration accessible to everyone through MCPify.

Connect on LinkedIn