A powerful code indexing tool with multi-platform support
This plugin is not yet in any themed marketplace. To install it, you'll need to add it from GitHub directly.
This plugin uses advanced features that require additional trust:
Only install plugins from repositories you trust. Review the source code before installation.
Choose your preferred installation method below
A marketplace is a collection of plugins. Every plugin gets an auto-generated marketplace JSON for individual installation, plus inclusion in category and themed collections. Add a marketplace once (step 1), then install any plugin from it (step 2).
One-time setup for access to all plugins
When to use: If you plan to install multiple plugins now or later
Step 1: Add the marketplace (one-time)
/plugin marketplace add https://claudepluginhub.com/marketplaces/all.json
Run this once to access all plugins
Step 2: Install this plugin
/plugin install context-please@all
Use this plugin's auto-generated marketplace JSON for individual installation
When to use: If you only want to try this specific plugin
Step 1: Add this plugin's marketplace
/plugin marketplace add https://claudepluginhub.com/marketplaces/plugins/context-please.json
Step 2: Install the plugin
/plugin install context-please@context-please
Note: This is a fork of claude-context by Zilliz, maintained by PleaseAI with additional features and improvements.
Extensions Status: Chrome and VSCode extensions are currently TBD (To Be Determined) and not yet available in this fork.
Context Please is an MCP plugin that adds semantic code search to Claude Code and other AI coding agents, giving them deep context from your entire codebase.
Your Entire Codebase as Context: Claude Context uses semantic search to find all relevant code across millions of lines, with no multi-round discovery needed, and brings the results straight into Claude's context.
Cost-Effective for Large Codebases: Instead of loading entire directories into Claude for every request, which can be very expensive, Claude Context stores your codebase in a vector database and puts only the related code into context, keeping your costs manageable.
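The retrieval idea above can be sketched in a few lines: embed the query, compare it against pre-embedded code chunks, and keep only the closest matches. This is a conceptual sketch with hand-made two-dimensional vectors, not the actual implementation (which uses OpenAI embeddings and Milvus); all names here are illustrative.

```typescript
// Conceptual sketch only: tiny hand-made vectors stand in for embedded chunks.
type Chunk = { file: string; vector: number[] };

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Only the k most similar chunks enter the model's context.
function topK(query: number[], chunks: Chunk[], k: number): Chunk[] {
  return [...chunks]
    .sort((x, y) => cosine(query, y.vector) - cosine(query, x.vector))
    .slice(0, k);
}

const chunks: Chunk[] = [
  { file: "auth.ts", vector: [0.9, 0.1] },
  { file: "billing.ts", vector: [0.1, 0.9] },
];
console.log(topK([1, 0], chunks, 1)[0].file); // prints: auth.ts
```

Because only the nearest chunks are sent, the context stays small no matter how large the indexed codebase grows.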
Model Context Protocol (MCP) allows you to integrate Claude Context with your favorite AI coding assistants, e.g. Claude Code.
Claude Context needs a vector database. You can sign up on Zilliz Cloud to get an API key.
Copy your Personal Key to replace your-zilliz-cloud-api-key in the configuration examples.
You need an OpenAI API key for the embedding model. You can get one by signing up at OpenAI.
Your API key always starts with sk-. Copy it and use it in the configuration examples below as your-openai-api-key.
System Requirements:
Claude Context is not compatible with Node.js 24.0.0 or later; if your Node version is 24 or higher, downgrade it before installing.
Use the command line interface to add the Claude Context MCP server:
claude mcp add context-please \
-e OPENAI_API_KEY=sk-your-openai-api-key \
-e MILVUS_TOKEN=your-zilliz-cloud-api-key \
-- npx @pleaseai/context-please-mcp@latest
See the Claude Code MCP documentation for more details about MCP server management.
Codex CLI uses TOML configuration files:
Create or edit the ~/.codex/config.toml file.
Add the following configuration:
# IMPORTANT: the top-level key is `mcp_servers` rather than `mcpServers`.
[mcp_servers.context-please]
command = "npx"
args = ["@pleaseai/context-please-mcp@latest"]
env = { "OPENAI_API_KEY" = "your-openai-api-key", "MILVUS_TOKEN" = "your-zilliz-cloud-api-key" }
# Optional: override the default 10s startup timeout
startup_timeout_ms = 20000
Gemini CLI requires manual configuration through a JSON file:
Create or edit the ~/.gemini/settings.json file and add the following configuration:
{
"mcpServers": {
"context-please": {
"command": "npx",
"args": ["@pleaseai/context-please-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
Create or edit the ~/.qwen/settings.json file and add the following configuration:
{
"mcpServers": {
"context-please": {
"command": "npx",
"args": ["@pleaseai/context-please-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_ADDRESS": "your-zilliz-cloud-public-endpoint",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
Go to: Settings -> Cursor Settings -> MCP -> Add new global MCP server
Pasting the following configuration into your ~/.cursor/mcp.json file is the recommended approach. You may also install in a specific project by creating .cursor/mcp.json in your project folder. See the Cursor MCP docs for more info.
{
"mcpServers": {
"context-please": {
"command": "npx",
"args": ["-y", "@pleaseai/context-please-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_ADDRESS": "your-zilliz-cloud-public-endpoint",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
Go to: Settings -> MCP -> Add MCP Server
Add the following configuration to your Void MCP settings:
{
"mcpServers": {
"context-please": {
"command": "npx",
"args": ["-y", "@pleaseai/context-please-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_ADDRESS": "your-zilliz-cloud-public-endpoint",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
Add to your Claude Desktop configuration:
{
"mcpServers": {
"context-please": {
"command": "npx",
"args": ["@pleaseai/context-please-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_ADDRESS": "your-zilliz-cloud-public-endpoint",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
Windsurf supports MCP configuration through a JSON file. Add the following configuration to your Windsurf MCP settings:
{
"mcpServers": {
"context-please": {
"command": "npx",
"args": ["-y", "@pleaseai/context-please-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_ADDRESS": "your-zilliz-cloud-public-endpoint",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
The Claude Context MCP server can be used with VS Code through MCP-compatible extensions. Add the following configuration to your VS Code MCP settings:
{
"mcpServers": {
"context-please": {
"command": "npx",
"args": ["-y", "@pleaseai/context-please-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_ADDRESS": "your-zilliz-cloud-public-endpoint",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
Cherry Studio allows for visual MCP server configuration through its settings interface. While it doesn't directly support manual JSON configuration, you can add a new server via the GUI:
Name: context-please
Type: STDIO
Command: npx
Arguments: ["@pleaseai/context-please-mcp@latest"]
Environment variables:
OPENAI_API_KEY: your-openai-api-key
MILVUS_ADDRESS: your-zilliz-cloud-public-endpoint
MILVUS_TOKEN: your-zilliz-cloud-api-key
Cline uses a JSON configuration file to manage MCP servers. To integrate the provided MCP server configuration:
Open Cline and click on the MCP Servers icon in the top navigation bar.
Select the Installed tab, then click Advanced MCP Settings.
In the cline_mcp_settings.json file, add the following configuration:
{
"mcpServers": {
"context-please": {
"command": "npx",
"args": ["@pleaseai/context-please-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_ADDRESS": "your-zilliz-cloud-public-endpoint",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
To configure Claude Context MCP in Augment Code, you can use either the graphical interface or manual configuration.
Click the hamburger menu.
Select Settings.
Navigate to the Tools section.
Click the + Add MCP button.
Enter the following command:
npx @pleaseai/context-please-mcp@latest
Name the MCP: Context Please.
Click the Add button.
To configure manually, add the server to the mcpServers array in the augment.advanced object of your settings:
"augment.advanced": {
"mcpServers": [
{
"name": "context-please",
"command": "npx",
"args": ["-y", "@pleaseai/context-please-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_ADDRESS": "your-zilliz-cloud-public-endpoint",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
]
}
Roo Code utilizes a JSON configuration file for MCP servers:
Open Roo Code and navigate to Settings -> MCP Servers -> Edit Global Config.
In the mcp_settings.json file, add the following configuration:
{
"mcpServers": {
"context-please": {
"command": "npx",
"args": ["@pleaseai/context-please-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_ADDRESS": "your-zilliz-cloud-public-endpoint",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
Zencoder offers support for MCP tools and servers in both its JetBrains and VS Code plugin versions.
Go to: Tools -> Add Custom MCP
Add the name (e.g., Context Please) and the server configuration from below, and make sure to hit the Install button:
{
"command": "npx",
"args": ["@pleaseai/context-please-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_ADDRESS": "your-zilliz-cloud-public-endpoint",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
For LangChain/LangGraph integration examples, see this example.
The server uses stdio transport and follows the standard MCP protocol. It can be integrated with any MCP-compatible client by running:
npx @pleaseai/context-please-mcp@latest
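Under stdio transport, client and server exchange newline-delimited JSON-RPC 2.0 messages over the server process's stdin/stdout. The sketch below only builds such a message; a real client would spawn the npx command above and pipe messages into it. The protocolVersion value and client name are illustrative assumptions.

```typescript
// Sketch: build a newline-delimited JSON-RPC 2.0 request as used by MCP stdio.
function buildRequest(id: number, method: string, params: object): string {
  return JSON.stringify({ jsonrpc: "2.0", id, method, params }) + "\n";
}

// An initialize request like this opens the MCP handshake (values are assumptions).
const init = buildRequest(1, "initialize", {
  protocolVersion: "2024-11-05",
  capabilities: {},
  clientInfo: { name: "example-client", version: "0.1.0" },
});
process.stdout.write(init);
```

In practice you would use an MCP client library rather than framing messages by hand; the sketch is only meant to show what travels over the pipe.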
Open Claude Code
cd your-project-directory
claude
Index your codebase:
Index this codebase
Check indexing status:
Check the indexing status
Start searching:
Find functions that handle user authentication
That's it! You now have semantic code search in Claude Code.
For more detailed MCP environment variable configuration, see our Environment Variables Guide.
To configure custom embedding models (e.g., text-embedding-3-large for OpenAI, voyage-code-3 for VoyageAI), see the MCP Configuration Examples for detailed setup instructions for each provider.
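As an illustration, a custom model is typically selected through environment variables in the same MCP config. The variable names EMBEDDING_PROVIDER and EMBEDDING_MODEL below are assumptions carried over from upstream claude-context; confirm the exact names against the MCP Configuration Examples before relying on them.

```json
{
  "mcpServers": {
    "context-please": {
      "command": "npx",
      "args": ["@pleaseai/context-please-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "MILVUS_TOKEN": "your-zilliz-cloud-api-key",
        "EMBEDDING_PROVIDER": "OpenAI",
        "EMBEDDING_MODEL": "text-embedding-3-large"
      }
    }
  }
}
```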
For detailed explanation of file inclusion and exclusion rules, and how to customize them, see our File Inclusion & Exclusion Rules.
index_codebase
Index a codebase directory for hybrid search (BM25 + dense vector).
search_code
Search the indexed codebase using natural language queries with hybrid search (BM25 + dense vector).
clear_index
Clear the search index for a specific codebase.
get_indexing_status
Get the current indexing status of a codebase. Shows progress percentage for actively indexing codebases and completion status for indexed codebases.
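To give a feel for how hybrid search can combine BM25 and dense-vector rankings, here is an illustrative sketch using reciprocal rank fusion (RRF). This is not the actual merging logic used by the tools above, and all file names are hypothetical.

```typescript
// Illustrative sketch: merge two ranked result lists with reciprocal rank fusion.
// Each item's fused score is the sum of 1 / (k + rank + 1) over the lists it appears in.
function rrfFuse(rankings: string[][], k = 60): string[] {
  const scores = new Map<string, number>();
  for (const ranking of rankings) {
    ranking.forEach((id, rank) => {
      scores.set(id, (scores.get(id) ?? 0) + 1 / (k + rank + 1));
    });
  }
  return [...scores.entries()]
    .sort((a, b) => b[1] - a[1])
    .map(([id]) => id);
}

// "auth.ts" ranks highly in both lists, so it tops the fused ranking.
const bm25 = ["auth.ts", "login.ts", "db.ts"];
const dense = ["session.ts", "auth.ts", "login.ts"];
console.log(rrfFuse([bm25, dense])[0]); // prints: auth.ts
```

The point of fusing the two rankings is that keyword matches (BM25) and semantic matches (dense vectors) each catch results the other misses.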
Our controlled evaluation demonstrates that Claude Context MCP achieves roughly 40% token reduction at equivalent retrieval quality. This translates into significant cost and time savings in production, and it also means that under a limited context-length budget, Claude Context yields better retrieval and answer results.
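The reported ~40% reduction can be put in concrete terms with a back-of-envelope calculation; the token count and price below are hypothetical, only the reduction figure comes from the evaluation.

```typescript
// Back-of-envelope sketch; token counts and price are hypothetical.
const baselineTokens = 1_000_000; // tokens consumed without semantic retrieval
const reductionPct = 40;          // ~40% token reduction at equal retrieval quality
const pricePerMTok = 3;           // hypothetical $ per 1M input tokens

const retrievedTokens = baselineTokens * (100 - reductionPct) / 100;
const savedDollars = (baselineTokens - retrievedTokens) * pricePerMTok / 1_000_000;
console.log(retrievedTokens, savedDollars.toFixed(2)); // prints: 600000 1.20
```

The savings scale linearly with request volume, which is why the reduction matters most for large codebases queried frequently.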
For detailed evaluation methodology and results, see the evaluation directory.
Context Please is a monorepo containing two main packages:
@pleaseai/context-please-core: Core indexing engine with embedding and vector database integration
@pleaseai/context-please-mcp: Model Context Protocol server for AI agent integration
While MCP is the recommended way to use Claude Context with AI assistants, you can also use it directly or through the VSCode extension.
The @pleaseai/context-please-core package provides the fundamental functionality for code indexing and semantic search.
import { Context, MilvusVectorDatabase, OpenAIEmbedding } from '@pleaseai/context-please-core';
// Initialize embedding provider
const embedding = new OpenAIEmbedding({
apiKey: process.env.OPENAI_API_KEY || 'your-openai-api-key',
model: 'text-embedding-3-small'
});
// Initialize vector database
const vectorDatabase = new MilvusVectorDatabase({
address: process.env.MILVUS_ADDRESS || 'your-zilliz-cloud-public-endpoint',
token: process.env.MILVUS_TOKEN || 'your-zilliz-cloud-api-key'
});
// Create context instance
const context = new Context({
embedding,
vectorDatabase
});
// Index your codebase with progress tracking
const stats = await context.indexCodebase('./your-project', (progress) => {
console.log(`${progress.phase} - ${progress.percentage}%`);
});
console.log(`Indexed ${stats.indexedFiles} files, ${stats.totalChunks} chunks`);
// Perform semantic search
const results = await context.semanticSearch('./your-project', 'vector database operations', 5);
results.forEach(result => {
console.log(`File: ${result.relativePath}:${result.startLine}-${result.endLine}`);
console.log(`Score: ${(result.score * 100).toFixed(2)}%`);
console.log(`Content: ${result.content.substring(0, 100)}...`);
});
Note: VSCode extension is currently TBD (To Be Determined) and not yet available in this fork. Please use the original claude-context VSCode extension for now.
# Clone repository
git clone https://github.com/pleaseai/context-please.git
cd context-please
# Install dependencies
pnpm install
# Build all packages
pnpm build
# Start development mode
pnpm dev
On Windows, ensure you have pnpm installed:
npm install -g pnpm
# Windows PowerShell/Command Prompt
git clone https://github.com/pleaseai/context-please.git
cd context-please
# Configure git line endings (recommended)
git config core.autocrlf false
# Install dependencies
pnpm install
# Build all packages (uses cross-platform scripts)
pnpm build
# Start development mode
pnpm dev
# Build all packages (cross-platform)
pnpm build
# Build specific package
pnpm build:core
pnpm build:vscode
pnpm build:mcp
# Performance benchmarking
pnpm benchmark
Context Please includes comprehensive integration tests for both the core indexing engine and MCP server.
# Run all tests (unit + integration)
pnpm test
# Run only integration tests (123 tests)
pnpm test:integration
# Run tests for specific package
cd packages/core && pnpm test # All core tests
cd packages/core && pnpm test:integration # Core integration tests only
cd packages/mcp && pnpm test:integration # MCP integration tests only
# Run specific test file
pnpm test packages/core/test/integration/indexing-workflow.integration.test.ts
Test Coverage:
For detailed testing guidelines, see docs/develop/TESTING.md.
# Development with file watching
cd examples/basic-usage
pnpm dev
Check the /examples directory for complete usage examples:
Common Questions:
For detailed answers and more troubleshooting tips, see our FAQ Guide.
Encountering issues? Visit our Troubleshooting Guide for step-by-step solutions.
Need more help? Check out our complete documentation for detailed guides and troubleshooting tips.
We welcome contributions! Please see our Contributing Guide for details on how to get started.
Package-specific contributing guides:
This project is licensed under the MIT License - see the LICENSE file for details.
1.0.0