
Overview

Integrate CodeAlive with Continue - the first client to offer full support for all MCP features (Resources, Prompts, Tools, and Sampling). Continue is an open-source platform that lets you build custom AI code agents with any model, without vendor lock-in.
Quick install: Run npx @codealive/installer to automatically configure CodeAlive for Continue. See the Installation Guide for details.

Prerequisites

  • Continue extension installed in VS Code or JetBrains IDE
  • CodeAlive account with API key (Sign up here)
  • Indexed repositories in your CodeAlive dashboard

Configuring Continue with CodeAlive

Step 1: Install Continue

Install the Continue extension for VS Code or your JetBrains IDE (see Prerequisites)
Step 2: Open Continue Config

Access the Continue configuration:

VS Code:
  • Click Continue icon in sidebar
  • Click gear icon → “Open config.json”
JetBrains:
  • Open Continue panel
  • Settings → Configuration
Step 3: Add CodeAlive MCP

Continue has full MCP support. Update your ~/.continue/config.json:
{
  "models": [
    // Your existing models (OpenAI, Anthropic, Ollama, etc.)
  ],
  "mcpServers": {
    "codealive": {
      "type": "http",
      "url": "https://mcp.codealive.ai/api/",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}
Continue maps MCP features to its existing capabilities:
  • Resources → Context sources
  • Prompts → Slash commands
  • Tools → Tool integrations
  • Sampling → Model configurations
Replace YOUR_API_KEY_HERE with your actual CodeAlive API key
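If your config.json already contains models or other settings, you can merge the CodeAlive entry in without hand-editing. A minimal Python sketch (the path and entry mirror the example above; this is illustrative, not an official installer, and the npx installer mentioned earlier does this for you):

```python
import json
from pathlib import Path

# Path used in the steps above; adjust if your config lives elsewhere.
CONFIG_PATH = Path.home() / ".continue" / "config.json"

def add_codealive(config: dict, api_key: str) -> dict:
    """Return config with the CodeAlive MCP server entry added, other keys intact."""
    servers = config.setdefault("mcpServers", {})
    servers["codealive"] = {
        "type": "http",
        "url": "https://mcp.codealive.ai/api/",
        "headers": {"Authorization": f"Bearer {api_key}"},
    }
    return config

# Usage (uncomment to apply):
# config = json.loads(CONFIG_PATH.read_text()) if CONFIG_PATH.exists() else {}
# CONFIG_PATH.write_text(json.dumps(add_codealive(config, "YOUR_API_KEY_HERE"), indent=2))
```

Because the script only touches the mcpServers key, existing model definitions are left untouched.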
Step 4: Restart IDE

Restart your IDE to load the new configuration
Step 5: Verify Integration

Type @codealive in Continue chat to test:
  • “@codealive what repositories are available?”
  • “@codealive find authentication code”

Configuration Options

YAML Configuration

Continue also supports YAML configuration in ~/.continue/config.yaml; the example below shows both the contextProviders and mcpServers forms:
models:
  - model: gpt-4
    provider: openai
  - model: claude-3-opus
    provider: anthropic

contextProviders:
  - name: codealive
    provider: mcp
    config:
      serverUrl: https://mcp.codealive.ai/api/
      headers:
        Authorization: Bearer YOUR_API_KEY_HERE

mcpServers:
  - name: CodeAlive
    type: streamable-http
    url: https://mcp.codealive.ai/api/
    requestOptions:
      headers:
        Authorization: "Bearer YOUR_API_KEY_HERE"

Docker Setup

For local deployment:
# Run CodeAlive MCP locally
docker run -d \
  -p 8000:8000 \
  -e CODEALIVE_API_KEY=YOUR_API_KEY \
  --name codealive-mcp \
  ghcr.io/codealive-ai/codealive-mcp:main
Update Continue config to use local server:
{
  "mcpServers": {
    "codealive": {
      "type": "http",
      "url": "http://localhost:8000/api"
    }
  }
}
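Before restarting Continue, you can confirm the local container is answering. A standard-library sketch; it only checks that an HTTP server responds at the URL (any status code), not that the MCP endpoint itself is healthy:

```python
import urllib.request
import urllib.error

def server_reachable(url: str, timeout: float = 3.0) -> bool:
    """Return True if an HTTP server answers at `url` (any status), else False."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True   # server responded, just with an error status
    except (urllib.error.URLError, OSError):
        return False  # connection refused, DNS failure, timeout, ...

# Example: check the local Docker deployment from the snippet above
# print(server_reachable("http://localhost:8000/api"))
```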

Usage Patterns

Use @codealive to add context to your prompts:
@codealive find the user authentication flow
Then explain how to add OAuth support

Advanced Features

Custom Models with CodeAlive

Configure Continue to use CodeAlive with different models:
{
  "models": [
    {
      "title": "GPT-4 with CodeAlive",
      "model": "gpt-4",
      "provider": "openai",
      "contextProviders": ["codealive"]
    },
    {
      "title": "Claude with CodeAlive",
      "model": "claude-3-opus",
      "provider": "anthropic",
      "contextProviders": ["codealive"]
    },
    {
      "title": "Local Ollama with CodeAlive",
      "model": "codellama",
      "provider": "ollama",
      "contextProviders": ["codealive"]
    }
  ]
}

Repository Filtering

Limit CodeAlive searches to specific repositories:
{
  "contextProviders": [
    {
      "name": "codealive-backend",
      "provider": "mcp",
      "config": {
        "serverUrl": "https://mcp.codealive.ai/api/",
        "headers": {
          "Authorization": "Bearer YOUR_API_KEY"
        },
        "filters": {
          "repositories": ["backend", "api"]
        }
      }
    }
  ]
}

Custom Prompts

Create templates that leverage CodeAlive:
{
  "customPrompts": [
    {
      "name": "Code Review",
      "prompt": "@codealive find similar code to:\n\n{{{input}}}\n\nReview for best practices and suggest improvements"
    },
    {
      "name": "Test Generation",
      "prompt": "@codealive analyze:\n\n{{{input}}}\n\nGenerate comprehensive unit tests"
    }
  ]
}
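For illustration, here is roughly how the {{{input}}} placeholder gets filled with your selected code. This is a hypothetical sketch of the substitution, not Continue's actual templating engine:

```python
def render_prompt(template: str, user_input: str) -> str:
    """Substitute the {{{input}}} placeholder, as in the custom prompts above.
    Illustrative only: Continue's real templating may handle escaping differently."""
    return template.replace("{{{input}}}", user_input)

template = "@codealive analyze:\n\n{{{input}}}\n\nGenerate comprehensive unit tests"
print(render_prompt(template, "def login(user): ..."))
```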

IDE-Specific Setup

Additional VS Code settings:
{
  "continue.telemetry": false,
  "continue.enableTabAutocomplete": true,
  "continue.contextProviders.codealive.enabled": true
}
JetBrains-specific configuration:
{
  "jetbrains": {
    "enabled": true,
    "port": 65432
  },
  "mcpServers": {
    "codealive": {
      "type": "http",
      "url": "https://mcp.codealive.ai/api/",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY"
      }
    }
  }
}
For Neovim with Continue:
-- In your Neovim config
require('continue').setup({
  mcp_servers = {
    codealive = {
      type = 'http',
      url = 'https://mcp.codealive.ai/api/',
      headers = {
        Authorization = 'Bearer YOUR_API_KEY'
      }
    }
  }
})

Troubleshooting

CodeAlive not appearing in Continue:
  1. Check that config.json syntax is valid
  2. Restart the IDE completely
  3. Verify the contextProviders section exists
  4. Check Continue logs for errors
Queries return no results:
  1. Verify the API key is valid
  2. Check that repositories are indexed in your CodeAlive dashboard
  3. Test the MCP server URL directly
  4. Review Continue's debug output
Responses are slow:
  1. Use more specific queries
  2. Limit the repository scope
  3. Consider the local Docker deployment
  4. Check network latency
For more solutions, see the Troubleshooting Guide.

Best Practices

Context Usage

Use @codealive to pull in repository context on demand instead of pasting large files, reducing token usage

Model Selection

Pair faster models with CodeAlive: the retrieved repository context lets smaller, quicker models answer accurately

Caching

Enable Continue’s cache for repeated queries

Privacy

Use the Docker or self-hosted deployment for sensitive code