API Documentation

SpareAPI is compatible with OpenAI, Anthropic, and Google Gemini API formats. Drop in your key and go.

Quick Start

Sign up, grab your API key from the dashboard, then use it with any OpenAI-compatible client.

1. Install the OpenAI SDK

pip install openai

2. Make your first request

from openai import OpenAI

client = OpenAI(
    api_key="sk_live_your_key_here",
    base_url="https://api.spareapi.ai/v1",
)

response = client.chat.completions.create(
    model="claude-sonnet-4-6",
    messages=[
        {"role": "user", "content": "Hello, world!"}
    ],
)

print(response.choices[0].message.content)

Authentication

All API requests require a Bearer token in the Authorization header. Get your API key from the dashboard.

Authorization: Bearer sk_live_your_key_here

Your key is tied to your account balance. Requests fail with 402 Payment Required if the balance is insufficient.
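Client-side, a 402 is worth treating as a distinct, non-retryable case: retrying won't help until the balance is topped up. A minimal sketch of one way to branch on status codes (the 401 and 429 mappings are common HTTP conventions, not confirmed by this API, and the helper name is hypothetical):

```python
def classify_error(status: int) -> str:
    """Map common HTTP error statuses to a suggested client action."""
    if status == 401:
        return "check API key"       # bad or missing Bearer token (convention)
    if status == 402:
        return "top up balance"      # insufficient account balance
    if status == 429:
        return "retry with backoff"  # rate limited (convention)
    if 500 <= status < 600:
        return "retry"               # transient server error
    return "inspect response"

print(classify_error(402))  # → top up balance
```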

Base URL

https://api.spareapi.ai/v1

Endpoints

POST /v1/chat/completions

OpenAI-compatible chat completions. Works with all models (Claude, GPT, Gemini).

cURL

curl https://api.spareapi.ai/v1/chat/completions \
  -H "Authorization: Bearer sk_live_your_key_here" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet-4-6",
    "messages": [
      {"role": "user", "content": "Hello!"}
    ]
  }'

POST /v1/messages

Anthropic-compatible Messages API. Use native Claude format.

cURL

curl https://api.spareapi.ai/v1/messages \
  -H "x-api-key: sk_live_your_key_here" \
  -H "anthropic-version: 2023-06-01" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet-4-6",
    "max_tokens": 1024,
    "messages": [
      {"role": "user", "content": "Hello!"}
    ]
  }'

GET /v1/models

List available models and their pricing. No authentication required.

cURL

curl https://api.spareapi.ai/v1/models
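The response can be parsed in a couple of lines. A sketch assuming the OpenAI-style {"data": [...]} list shape (implied by the endpoint being OpenAI-compatible, but verify against a live response; the payload below is a made-up example):

```python
import json

# Hypothetical /v1/models payload; the {"data": [{"id": ...}]} shape is an
# assumption based on OpenAI compatibility, not confirmed by this document.
body = '{"data": [{"id": "claude-sonnet-4-6"}, {"id": "gpt-5"}, {"id": "gemini-2.5-pro"}]}'

model_ids = [model["id"] for model in json.loads(body)["data"]]
print(model_ids)  # → ['claude-sonnet-4-6', 'gpt-5', 'gemini-2.5-pro']
```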

SDK Integrations

SpareAPI works with any OpenAI or Anthropic SDK. Here are the most common ones.

Python — OpenAI SDK

The most popular option. Works with all models through the OpenAI-compatible format.

from openai import OpenAI

client = OpenAI(
    api_key="sk_live_your_key_here",
    base_url="https://api.spareapi.ai/v1",
)

# Non-streaming
response = client.chat.completions.create(
    model="claude-sonnet-4-6",
    messages=[{"role": "user", "content": "Explain quantum computing"}],
)
print(response.choices[0].message.content)

# Streaming
stream = client.chat.completions.create(
    model="gpt-5",
    messages=[{"role": "user", "content": "Write a poem"}],
    stream=True,
)
for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")

TypeScript — OpenAI SDK

For Node.js and the browser. Works with all models.

import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "sk_live_your_key_here",
  baseURL: "https://api.spareapi.ai/v1",
});

const response = await client.chat.completions.create({
  model: "claude-sonnet-4-6",
  messages: [{ role: "user", content: "Hello!" }],
});

console.log(response.choices[0].message.content);

Python — Anthropic SDK

Use the native Anthropic SDK for Claude models. Note that the base URL here omits the /v1 suffix; the Anthropic SDK appends the versioned path itself.

from anthropic import Anthropic

client = Anthropic(
    api_key="sk_live_your_key_here",
    base_url="https://api.spareapi.ai",
)

message = client.messages.create(
    model="claude-opus-4-6",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Hello, Claude"}
    ],
)

print(message.content[0].text)

Python — Gemini Models

Gemini models are served through the same OpenAI-compatible endpoint, so use the OpenAI SDK directly; no separate google-genai setup is required.

# Install: pip install openai
# Gemini models work through OpenAI-compatible endpoint
from openai import OpenAI

client = OpenAI(
    api_key="sk_live_your_key_here",
    base_url="https://api.spareapi.ai/v1",
)

response = client.chat.completions.create(
    model="gemini-2.5-pro",
    messages=[{"role": "user", "content": "Hello, Gemini"}],
)
print(response.choices[0].message.content)

cURL — Any Language

Direct HTTP for quick testing.

# Chat completion (works with any model)
curl https://api.spareapi.ai/v1/chat/completions \
  -H "Authorization: Bearer sk_live_your_key_here" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-5",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'

# List available models
curl https://api.spareapi.ai/v1/models
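The same chat-completion request can also be built with nothing but the Python standard library, which is handy in environments where no SDK is installed. A sketch that constructs the request without sending it (uncomment the urlopen call to actually send; the response body is assumed to be OpenAI-format JSON):

```python
import json
import urllib.request

# Same request as the curl example above, built via the stdlib only.
req = urllib.request.Request(
    "https://api.spareapi.ai/v1/chat/completions",
    data=json.dumps({
        "model": "gpt-5",
        "messages": [{"role": "user", "content": "Hello!"}],
    }).encode("utf-8"),
    headers={
        "Authorization": "Bearer sk_live_your_key_here",
        "Content-Type": "application/json",
    },
    method="POST",
)

# with urllib.request.urlopen(req) as resp:
#     reply = json.load(resp)  # OpenAI-format response body
print(req.get_method(), req.full_url)
```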

CLI & Editor Integrations

Point your favorite coding assistants at SpareAPI to save 80% on every call.

Claude Code

Anthropic's official CLI for Claude. Set environment variables to route through SpareAPI.

Add to your shell profile (~/.zshrc, ~/.bashrc):

export ANTHROPIC_BASE_URL="https://api.spareapi.ai"
export ANTHROPIC_AUTH_TOKEN="sk_live_your_key_here"

# Then use Claude Code normally
claude "refactor this function"

Codex CLI

OpenAI's official Codex CLI. Supports custom providers via config.

Edit ~/.codex/config.toml:

model_provider = "spareapi"
model = "gpt-5"

[model_providers.spareapi]
name = "SpareAPI"
base_url = "https://api.spareapi.ai/v1"
wire_api = "chat"
env_key = "SPAREAPI_KEY"

Then set the key and run:

export SPAREAPI_KEY="sk_live_your_key_here"
codex "fix the bug in auth.ts"

Gemini CLI

Google's Gemini CLI. Override the endpoint via environment variables.

export GEMINI_API_KEY="sk_live_your_key_here"
export GEMINI_BASE_URL="https://api.spareapi.ai/v1"

gemini "summarize this repo"

OpenCode

Open-source AI coding assistant. Configure SpareAPI as a custom provider.

Edit ~/.config/opencode/opencode.json:

{
  "provider": {
    "spareapi": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "SpareAPI",
      "options": {
        "baseURL": "https://api.spareapi.ai/v1",
        "apiKey": "sk_live_your_key_here"
      },
      "models": {
        "claude-sonnet-4-6": {},
        "gpt-5": {},
        "gemini-2.5-pro": {}
      }
    }
  }
}

Cursor

AI-first code editor. Add SpareAPI as a custom OpenAI endpoint.

Open Settings → Models → OpenAI API Key, then:

  1. Check "Override OpenAI Base URL"
  2. Set base URL to: https://api.spareapi.ai/v1
  3. Set API Key to: sk_live_your_key_here
  4. Click "Verify" and save

Continue.dev

Open-source AI autocomplete for VS Code and JetBrains.

Edit ~/.continue/config.json:

{
  "models": [
    {
      "title": "SpareAPI Claude",
      "provider": "openai",
      "model": "claude-sonnet-4-6",
      "apiBase": "https://api.spareapi.ai/v1",
      "apiKey": "sk_live_your_key_here"
    },
    {
      "title": "SpareAPI GPT-5",
      "provider": "openai",
      "model": "gpt-5",
      "apiBase": "https://api.spareapi.ai/v1",
      "apiKey": "sk_live_your_key_here"
    }
  ]
}

Aider

AI pair programming in your terminal.

export OPENAI_API_BASE="https://api.spareapi.ai/v1"
export OPENAI_API_KEY="sk_live_your_key_here"

aider --model claude-sonnet-4-6