
Src MCP

Model Context Protocol reference for src.do - Source code management and hosting


Overview

The Model Context Protocol (MCP) provides AI models with direct access to src.do through a standardized interface.

Installation

pnpm add @modelcontextprotocol/sdk

Configuration

Add to your MCP server configuration:

{
  "mcpServers": {
    "src": {
      "command": "npx",
      "args": ["-y", "@dotdo/mcp-server"],
      "env": {
        "DO_API_KEY": "your-api-key"
      }
    }
  }
}

Tools

src/invoke

Main tool for src.do operations.

{
  "name": "src/invoke",
  "description": "Source code management and hosting",
  "inputSchema": {
    "type": "object",
    "properties": {
      "operation": {
        "type": "string",
        "description": "Operation to perform"
      },
      "parameters": {
        "type": "object",
        "description": "Operation parameters"
      }
    },
    "required": ["operation"]
  }
}
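
For example, a tools/call request conforming to this schema might carry the following arguments (the operation name and parameters are illustrative):

```json
{
  "name": "src/invoke",
  "arguments": {
    "operation": "process",
    "parameters": {
      "input": "user-data"
    }
  }
}
```

Only `operation` is required; `parameters` can be omitted for operations that take no input.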

Usage in AI Models

Claude Desktop

// ~/Library/Application Support/Claude/claude_desktop_config.json
{
  "mcpServers": {
    "src": {
      "command": "npx",
      "args": ["-y", "@dotdo/mcp-server", "--tool=src"],
      "env": {
        "DO_API_KEY": "your-api-key"
      }
    }
  }
}

OpenAI GPTs

# Custom GPT configuration
tools:
  - type: mcp
    server: src
    operations:
      - invoke
      - query
      - status

Custom Integration

import { Client } from '@modelcontextprotocol/sdk/client/index.js'
import { StdioClientTransport } from '@modelcontextprotocol/sdk/client/stdio.js'

const transport = new StdioClientTransport({
  command: 'npx',
  args: ['-y', '@dotdo/mcp-server', '--tool=src'],
})

const client = new Client(
  {
    name: 'src-client',
    version: '1.0.0',
  },
  {
    capabilities: {},
  }
)

await client.connect(transport)

// Call tool
const result = await client.callTool({
  name: 'src/invoke',
  arguments: {
    operation: 'process',
    parameters: {},
  },
})

Tool Definitions

Available Tools

{
  "tools": [
    {
      "name": "src/invoke",
      "description": "Invoke src.do",
      "inputSchema": {
        /* ... */
      }
    },
    {
      "name": "src/query",
      "description": "Query src.do resources",
      "inputSchema": {
        /* ... */
      }
    },
    {
      "name": "src/status",
      "description": "Check src.do status",
      "inputSchema": {
        /* ... */
      }
    }
  ]
}
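
A client can fetch this listing at runtime with a tools/list request. As a sketch, extracting the tool names from a response shaped like the one above (descriptions abbreviated):

```typescript
// Response shape from a tools/list request, matching the listing above.
const toolList = {
  tools: [
    { name: 'src/invoke', description: 'Invoke src.do' },
    { name: 'src/query', description: 'Query src.do resources' },
    { name: 'src/status', description: 'Check src.do status' },
  ],
}

// Discover which src tools the server exposes before calling them.
const toolNames = toolList.tools.map((tool) => tool.name)
console.log(toolNames) // ['src/invoke', 'src/query', 'src/status']
```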

Resources

Available Resources

{
  "resources": [
    {
      "uri": "src://config",
      "name": "Src Configuration",
      "mimeType": "application/json"
    },
    {
      "uri": "src://docs",
      "name": "Src Documentation",
      "mimeType": "text/markdown"
    }
  ]
}
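
Reading one of these, e.g. via a resources/read request for src://config, returns a contents array in the standard MCP shape; a response might look like this (the configuration values are illustrative):

```json
{
  "contents": [
    {
      "uri": "src://config",
      "mimeType": "application/json",
      "text": "{ \"endpoint\": \"https://src.do\" }"
    }
  ]
}
```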

Prompts

Pre-configured Prompts

{
  "prompts": [
    {
      "name": "src-quick-start",
      "description": "Quick start guide for src.do",
      "arguments": []
    },
    {
      "name": "src-best-practices",
      "description": "Best practices for src.do",
      "arguments": []
    }
  ]
}
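
Requesting one of these via prompts/get returns a message list the model can use directly; a response might look like this (the message text is illustrative):

```json
{
  "description": "Quick start guide for src.do",
  "messages": [
    {
      "role": "user",
      "content": {
        "type": "text",
        "text": "Walk me through getting started with src.do."
      }
    }
  ]
}
```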

Examples

Basic Usage

// AI model calls tool via MCP
mcp call src/invoke

With Parameters

// Call with parameters
await mcp.callTool('src/invoke', {
  operation: 'process',
  parameters: {
    // Operation-specific parameters
  },
  options: {
    timeout: 30000,
  },
})

Error Handling

try {
  const result = await mcp.callTool('src/invoke', {
    operation: 'process',
  })
  return result
} catch (error) {
  if (error.code === 'TOOL_NOT_FOUND') {
    console.error('Src tool not available')
  } else {
    throw error
  }
}

AI Integration Patterns

Agentic Workflows

// AI agent uses src.do in workflow
const workflow = {
  steps: [
    {
      tool: 'src/invoke',
      operation: 'analyze',
      input: 'user-data',
    },
    {
      tool: 'src/invoke',
      operation: 'transform',
      input: 'analysis-result',
    },
  ],
}
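
A minimal sketch of executing such a workflow sequentially, with a stubbed callTool standing in for the real MCP client (runWorkflow and the stub are illustrative, not part of the SDK):

```typescript
type Step = { tool: string; operation: string; input: string }

// Run each step in order, collecting results; a real agent would thread
// one step's output into the next step's parameters.
async function runWorkflow(
  steps: Step[],
  callTool: (name: string, args: { operation: string; input: string }) => Promise<string>
): Promise<string[]> {
  const results: string[] = []
  for (const step of steps) {
    results.push(await callTool(step.tool, { operation: step.operation, input: step.input }))
  }
  return results
}

// Stub transport for illustration; swap in client.callTool for real use.
const out = await runWorkflow(
  [
    { tool: 'src/invoke', operation: 'analyze', input: 'user-data' },
    { tool: 'src/invoke', operation: 'transform', input: 'analysis-result' },
  ],
  async (name, args) => `${name}:${args.operation}`
)
console.log(out) // ['src/invoke:analyze', 'src/invoke:transform']
```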

Chain of Thought

AI models can reason about src.do operations:

User: "I need to process this data"

AI: "I'll use the src tool to:
1. Validate the data format
2. Process it through src.do
3. Return the results

Let me start..."

[Calls: mcp call src/invoke]

Server Implementation

Custom MCP Server

import { Server } from '@modelcontextprotocol/sdk/server/index.js'
import { StdioServerTransport } from '@modelcontextprotocol/sdk/server/stdio.js'
import { CallToolRequestSchema } from '@modelcontextprotocol/sdk/types.js'

const server = new Server(
  {
    name: 'src-server',
    version: '1.0.0',
  },
  {
    capabilities: {
      tools: {},
      resources: {},
      prompts: {},
    },
  }
)

// Register tool handler (handleSrcOperation is a placeholder for your own logic)
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  if (request.params.name === 'src/invoke') {
    const result = await handleSrcOperation(request.params.arguments)
    return {
      content: [
        {
          type: 'text',
          text: JSON.stringify(result),
        },
      ],
    }
  }
  throw new Error(`Unknown tool: ${request.params.name}`)
})

const transport = new StdioServerTransport()
await server.connect(transport)

Best Practices

  1. Tool Design - Keep tools focused and single-purpose
  2. Error Messages - Provide clear, actionable errors
  3. Documentation - Include examples in tool descriptions
  4. Rate Limiting - Implement appropriate limits
  5. Security - Validate all inputs from AI models
  6. Monitoring - Track tool usage and errors
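
Best practice 5 (validate all inputs) can be sketched as a hand-rolled guard for the src/invoke schema shown earlier; a production server might use a schema-validation library instead:

```typescript
// Validate untrusted tool arguments against the src/invoke input schema:
// "operation" is required and must be a string; "parameters" is an optional object.
function validateInvokeArgs(args: unknown): { operation: string; parameters?: object } {
  if (typeof args !== 'object' || args === null) {
    throw new Error('arguments must be an object')
  }
  const { operation, parameters } = args as Record<string, unknown>
  if (typeof operation !== 'string' || operation.length === 0) {
    throw new Error('"operation" is required and must be a non-empty string')
  }
  if (parameters !== undefined && (typeof parameters !== 'object' || parameters === null)) {
    throw new Error('"parameters" must be an object when present')
  }
  return { operation, parameters: parameters as object | undefined }
}

console.log(validateInvokeArgs({ operation: 'process' }).operation) // 'process'
```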