Build a Powerful MCP Server | Supercharge Your C++ AI Coding Agent


MCP Server — though it may sound highly technical — is quietly changing how we interact with AI.
Think back to your first AI coding assistant: it could write elegant code and explain complex ideas, but when you asked, “Can this code actually run?” the answer was always, “Please compile and test it yourself.”

MCP Server bridges that gap. It transforms AI from a static knowledge source into an active partner that can take real action — compiling code, running tests, analyzing data, or even fetching information from the web.
It’s not just a protocol; it’s the link that lets AI truly engage with the real world.


What is an MCP Server?

An MCP Server is a service built on the Model Context Protocol (MCP) that allows AI assistants — like ChatGPT, Claude, or Gemini — to safely interact with external tools, local files, and your system environment.

In simple terms, it acts as a bridge between AI and the real world.
Instead of being limited to generating text based on its training data, AI can now:

  • Read and modify project files
  • Execute system commands or compile code
  • Query databases or APIs
  • Run tests and analyze results

Essentially, an MCP Server turns your AI assistant from a static “knowledge library” into a hands-on coding or automation partner that can actively participate in your workflow.

Core Operation Model of MCP Server

The core operation of an MCP Server is essentially a bridge between AI and external tools. Its main workflow can be divided into the following steps:

  • Tool Registration
  • Transport Layer
  • Request Handling
  • Tool Execution
  • Response Delivery
  • Security and Permission Control (Optional)
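Independent of the SDK, the registration, request-handling, execution, and response steps above boil down to a dispatch from tool names to handler functions. Here is a minimal, SDK-free sketch of that pattern (the `echo` tool and `handleRequest` names are illustrative, not part of the MCP SDK):

```javascript
// SDK-free sketch of the dispatch pattern behind an MCP Server.
// Tool Registration: map tool names to handler functions.
const toolHandlers = {
  echo: (args) => `echo: ${args.text}`,
};

// Request Handling + Tool Execution + Response Delivery.
function handleRequest(request) {
  const handler = toolHandlers[request.name];
  if (!handler) {
    // Unknown tools are rejected rather than silently ignored.
    return { error: `Unknown tool: ${request.name}` };
  }
  return { result: handler(request.arguments) };
}

console.log(handleRequest({ name: "echo", arguments: { text: "hi" } }));
// → { result: 'echo: hi' }
```

The real server later in this article does the same thing, except that requests arrive as MCP messages over a transport and responses follow the protocol's content schema.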

Why Do C++ Developers Need an MCP Server?

For C++ developers, the benefits of an MCP Server are clear:

| Feature | Traditional AI Assistant | With MCP Server |
|---|---|---|
| Read project files | ❌ Must manually copy and paste code | ✅ Automatically reads specified folders |
| Compile & build | ❌ Cannot execute commands | ✅ Can call g++ or cmake |
| Unit testing | ❌ Cannot run tests | ✅ Runs tests directly and reports results |
| Error analysis | ❌ Only syntax-based guesses | ✅ Analyzes actual compiler error messages |
| Auto-fix code | ❌ Passive suggestions only | ✅ Actively modifies code and recompiles |

Development Environment

Before setting up your own MCP Server, you need a local C++ toolchain (a compiler such as g++, and optionally a build tool such as cmake; make sure they are in your system PATH) and a development tool. We recommend VS Code with the Continue extension, which provides syntax completion, code navigation, and editing convenience.

The basic steps to create an MCP Server project are as follows:

Install Node.js

Node.js is the core runtime environment for MCP Server. It’s recommended to install the LTS (Long Term Support) version to ensure stability.

Official download: Node.js

After installing Node.js, check the installation by running in the terminal:

node -v
npm -v
npx -v

If all three commands return version numbers (for example, v18.x.x, v10.x.x, etc.), Node.js, npm, and npx have been installed successfully.

Create a Project Folder

It’s recommended to create a separate folder for your project, for example:

mkdir cpp-mcp
cd cpp-mcp

This helps organize your code and dependencies.

Initialize a Node Project

npm init -y

This command generates a package.json file, which manages your project’s dependencies, versions, and scripts.

Install the MCP SDK

npm install @modelcontextprotocol/sdk

This installs the official MCP Server SDK, which allows you to build tools and transports so your AI assistant can interact with your environment.
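One detail worth noting: the server code below uses ES module `import` syntax, and Node.js only treats `.js` files as ES modules when `package.json` opts in. Add `"type": "module"` to the generated `package.json`, or `node server.js` will fail with "Cannot use import statement outside a module". For example (the version numbers here are illustrative):

```json
{
  "name": "cpp-mcp",
  "version": "1.0.0",
  "type": "module",
  "dependencies": {
    "@modelcontextprotocol/sdk": "^1.0.0"
  }
}
```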

Getting Started: Building Your C++ MCP Server

Create server.js in your project directory:

import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { CallToolRequestSchema, ListToolsRequestSchema } from "@modelcontextprotocol/sdk/types.js";
import fs from "fs";
import path from "path";
import { execSync } from "child_process";

console.error("🚀 Starting C++ MCP Server...");

// Define available tools
const tools = [
  {
    name: "write_file_proxy",
    description: "Write content to file",
    inputSchema: {
      type: "object",
      properties: {
        path: { type: "string" },
        content: { type: "string" },
        mode: { 
          type: "string", 
          enum: ["overwrite", "append", "error"],
          default: "overwrite" 
        },
      },
      required: ["path", "content"],
    },
  },
  {
    name: "compile_and_run",
    description: "Compile and run a C++ file",
    inputSchema: {
      type: "object",
      properties: {
        path: { type: "string" },
      },
      required: ["path"],
    },
  }
];

// Create MCP server instance
const server = new Server(
  {
    name: "cpp-ai-proxy",
    version: "1.0.0",
  },
  {
    capabilities: {
      tools: {},
    },
  }
);

// Handle tool listing requests
server.setRequestHandler(ListToolsRequestSchema, async () => {
  console.error("📋 Listing tools...");
  return {
    tools: tools
  };
});

// Handle tool execution requests
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  console.error(`🔧 Tool called: ${request.params.name}`);
  const { name, arguments: args } = request.params;
  
  if (name === "write_file_proxy") {
    const { path: filePath, content, mode = "overwrite" } = args;
    const absPath = path.resolve(filePath);
    
    try {
      // Check if file exists and handle based on mode
      if (fs.existsSync(absPath) && mode === "error") {
        return {
          content: [{ type: "text", text: `❌ File exists: ${absPath}` }]
        };
      }
      
      // Create directory if it doesn't exist
      const dir = path.dirname(absPath);
      if (!fs.existsSync(dir)) {
        fs.mkdirSync(dir, { recursive: true });
      }
      
      // Write or append content based on mode
      if (fs.existsSync(absPath) && mode === "append") {
        fs.appendFileSync(absPath, content, "utf8");
      } else {
        fs.writeFileSync(absPath, content, "utf8");
      }
      
      return {
        content: [{ type: "text", text: `✅ File written: ${absPath}` }]
      };
    } catch (error) {
      return {
        content: [{ type: "text", text: `❌ Error: ${error.message}` }]
      };
    }
  }
  
  if (name === "compile_and_run") {
    const { path: filePath } = args;
    const absPath = path.resolve(filePath);
    
    try {
      // Check if source file exists
      if (!fs.existsSync(absPath)) {
        return {
          content: [{ type: "text", text: `❌ File not found: ${absPath}` }]
        };
      }
      
      // Compile C++ code
      const exePath = absPath.replace(/\.cpp$/, "");
      console.error(`🛠️ Compiling: ${absPath}`);
      execSync(`g++ "${absPath}" -o "${exePath}"`, { stdio: 'pipe' });
      
      // Execute the compiled program
      console.error(`🚀 Running: ${exePath}`);
      const output = execSync(`"${exePath}"`, { encoding: "utf8", stdio: 'pipe' });
      
      return {
        content: [{ type: "text", text: `✅ Output:\n${output}` }]
      };
    } catch (error) {
      // Extract meaningful error information
      const errorOutput = error.stderr ? error.stderr.toString() : error.message;
      return {
        content: [{ type: "text", text: `❌ Compilation or execution failed:\n${errorOutput}` }]
      };
    }
  }
  
  // Handle unknown tools
  throw new Error(`Unknown tool: ${name}`);
});

// Global error handler for the server
server.onerror = (error) => {
  console.error('❌ Server error:', error);
};

// Connection close handler
server.onclose = () => {
  console.error('🔌 Server connection closed');
};

// Main server startup function
async function main() {
  try {
    console.error("📡 Connecting to transport...");
    const transport = new StdioServerTransport();
    await server.connect(transport);
    console.error("✅ C++ MCP Server running and connected!");
  } catch (error) {
    console.error('💥 Failed to start server:', error);
    process.exit(1);
  }
}

// Start the server
main().catch(console.error);

Configuring VS Code with the Continue Extension

In your project's .continue/mcpServers directory (e.g., cpp-mcp/.continue/mcpServers), create a new file called new-mcp-server.yaml:

name: New MCP server
version: 0.0.1
schema: v1
mcpServers:
  - name: New MCP server
    command: node
    args:
      - Your/cpp-mcp/server.js

Project Structure

C++ MCP Server project structure:

cpp-mcp/
├── .continue/          # Continue configuration directory
├── node_modules/       # Node.js dependencies (auto-generated)
├── package.json        # Project configuration and definitions
├── package-lock.json   # Dependency version lock file
├── server.js           # Main MCP Server script
├── test                # Compiled executable files
└── test.cpp            # C++ test source file

Testing the C++ MCP Server

Start the server:

node server.js

Testing in VS Code:

Open the Continue chat panel and try the following prompt:

Please create a simple Hello World C++ program, then compile and run it.

The AI will:
1. Call the write_file_proxy tool to create a C++ source file
2. Call the compile_and_run tool to compile the code and run the program
3. Return the complete execution output

Explaining server.js

server.js acts as a bridge that allows the AI to directly manipulate files and execute C++ code.

It has two main functionalities:

  • File Writing Tool
write_file_proxy({
    path: "test.cpp",
    content: "C++ code",
    mode: "overwrite" // overwrite / append / error
})

Creates or modifies files and automatically creates directories if they don’t exist.

  • Compile and Run Tool
compile_and_run({
    path: "test.cpp"
})

Automatically compiles the C++ file, runs the executable, and returns the output.

Example AI Usage
"Please create a file hello.cpp with a Hello World program"
→ Calls the write_file_proxy tool

"Please compile and run hello.cpp"
→ Calls the compile_and_run tool

Receives the result: "✅ Output: Hello World!"

Conclusion

The advent of the MCP Server transforms AI from a passive code generator into an intelligent partner that truly participates in the development workflow. For C++ developers, it not only understands code semantics but can also directly operate compilers, run tests, and analyze errors—turning development from conversation into concrete action.

By building your own MCP Server and integrating it with VS Code’s Continue extension, your AI assistant can directly interact with your project environment. From compilation and debugging to error analysis, the entire workflow becomes automated and responsive, delivering a smoother, more realistic engineering experience.

And this is just the beginning. In the future, you can extend your MCP Server with additional tools—such as unit testing, static analysis, or performance monitoring—making AI a core contributor to your development team, a tireless C++ companion that's always ready to help.