Overview
Integrate CodeAlive with Continue - the first client to offer full support for all MCP features (Resources, Prompts, Tools, and Sampling). Continue is an open-source platform that lets you build custom AI code agents with any model, without vendor lock-in.

Prerequisites
- Continue extension installed in VS Code or JetBrains IDE
- CodeAlive account with API key (Sign up here)
- Indexed repositories in your CodeAlive dashboard
Configuring Continue with CodeAlive
Open Continue Config
Access the Continue configuration:

VS Code:
- Click the Continue icon in the sidebar
- Click the gear icon → “Open config.json”

JetBrains:
- Open the Continue panel
- Settings → Configuration
Add CodeAlive MCP
Continue has full MCP support. Update your ~/.continue/config.json.

Continue maps MCP features to its existing capabilities:
- Resources → Context sources
- Prompts → Slash commands
- Tools → Tool integrations
- Sampling → Model configurations
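Putting this together, a config.json entry registering the CodeAlive MCP server might look like the sketch below. The `experimental.modelContextProtocolServers` field, the `uvx codealive-mcp` launcher, and the environment variable name are assumptions; verify them against your Continue version's MCP documentation and your CodeAlive dashboard.

```json
{
  "experimental": {
    "modelContextProtocolServers": [
      {
        "transport": {
          "type": "stdio",
          "command": "uvx",
          "args": ["codealive-mcp"],
          "env": {
            "CODEALIVE_API_KEY": "<your-codealive-api-key>"
          }
        }
      }
    ]
  }
}
```

Restart the Continue extension after saving so the server is picked up.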
Configuration Options
YAML Configuration
Continue also supports YAML configuration in ~/.continue/config.yaml:
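A YAML sketch of the same registration; the shape of the `mcpServers` block and the `codealive-mcp` launcher name are assumptions to check against Continue's config.yaml reference:

```yaml
mcpServers:
  - name: CodeAlive
    command: uvx
    args:
      - codealive-mcp
    env:
      CODEALIVE_API_KEY: <your-codealive-api-key>
```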
Docker Setup
For local deployment:

Usage Patterns
- Context Provider
- Slash Commands
- Auto-Context
Use @codealive to add context to your prompts, for example: “@codealive How is user authentication implemented across our services?”

Advanced Features
Custom Models with CodeAlive
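As a sketch, a `models` entry in config.json can pair a chat model with CodeAlive-provided context; the provider, model, and title values below are illustrative:

```json
{
  "models": [
    {
      "title": "Claude + CodeAlive context",
      "provider": "anthropic",
      "model": "claude-3-5-sonnet-latest",
      "apiKey": "<your-anthropic-api-key>"
    }
  ]
}
```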
Configure Continue to use CodeAlive with different models; any provider Continue supports can be paired with CodeAlive's context.

Repository Filtering
Limit CodeAlive searches to specific repositories to keep results focused.

Custom Prompts
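For custom prompts, Continue's `customCommands` can embed a CodeAlive-backed template; the command name and prompt text below are illustrative:

```json
{
  "customCommands": [
    {
      "name": "explain-arch",
      "description": "Explain a component using CodeAlive context",
      "prompt": "Using @codealive context, explain how {{{ input }}} fits into the overall architecture."
    }
  ]
}
```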
Create prompt templates that leverage CodeAlive.

IDE-Specific Setup
VS Code
Additional VS Code settings:
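An illustrative settings.json fragment; setting names should be checked against the contributed settings of your installed Continue extension version:

```json
{
  "continue.enableTabAutocomplete": true
}
```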
JetBrains
JetBrains-specific configuration:
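The JetBrains plugin reads the same ~/.continue/config.json as VS Code, so the MCP registration does not need to be repeated per IDE. As an illustration, config-wide fields such as `allowAnonymousTelemetry` apply to both:

```json
{
  "allowAnonymousTelemetry": false
}
```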
Vim/Neovim
For Neovim with Continue:
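Continue has no official Neovim client, so the following is a hypothetical lazy.nvim spec; `example/continue.nvim` is a placeholder, not a published plugin. Check the Continue community for a maintained integration:

```lua
-- Hypothetical lazy.nvim plugin spec.
-- "example/continue.nvim" is a placeholder, not a published plugin.
{
  "example/continue.nvim",
  opts = {
    -- Reuse the same Continue config file as the other IDEs
    config_path = vim.fn.expand("~/.continue/config.json"),
  },
}
```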
Troubleshooting
@codealive not recognized
Solutions:
- Check config.json syntax is valid
- Restart IDE completely
- Verify contextProviders section exists
- Check Continue logs for errors
No context returned
Solutions:
- Verify API key is valid
- Check repositories are indexed
- Test MCP server URL directly
- Review Continue debug output
Slow performance
Solutions:
- Use more specific queries
- Limit repository scope
- Consider Docker deployment
- Check network latency
Best Practices
Context Usage
Use @codealive for large context needs, reducing token usage
Model Selection
Pair faster models with CodeAlive for better performance
Caching
Enable Continue’s cache for repeated queries
Privacy
Use Docker or self-hosted for sensitive code
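The local deployment described under Docker Setup might be sketched as a compose file; the image name, port, and environment variable are assumptions to verify against the CodeAlive documentation:

```yaml
# docker-compose.yml (hypothetical image name and port)
services:
  codealive-mcp:
    image: codealive/mcp-server
    environment:
      CODEALIVE_API_KEY: ${CODEALIVE_API_KEY}
    ports:
      - "8080:8080"
```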