8 changes: 8 additions & 0 deletions .changeset/moody-terms-teach.md
@@ -0,0 +1,8 @@
---
'@srcbook/components': patch
'@srcbook/api': patch
'@srcbook/web': patch
'srcbook': patch
---

Srcbook has an MCP client that can use the sequential-thinking MCP server.
8 changes: 8 additions & 0 deletions .changeset/violet-gorillas-attack.md
@@ -0,0 +1,8 @@
---
'@srcbook/components': patch
'@srcbook/api': patch
'@srcbook/web': patch
'srcbook': patch
---

This change gives Srcbook an MCP client, enabling it to connect to a wide array of MCP servers that provide the LLM with tools and resources. For now, this only runs in local dev; local production/Docker will be addressed in a follow-up PR.
4 changes: 4 additions & 0 deletions .dockerignore
@@ -0,0 +1,4 @@
node_modules
.turbo
.pnpm-store
.git
11 changes: 10 additions & 1 deletion .gitignore
@@ -42,4 +42,13 @@ srcbook/lib/**/*
# Aide
*.code-workspace

vite.config.ts.timestamp-*.mjs
vite.config.ts.timestamp-*.mjs

# MCP Testing
packages/api/test/mcp_tests

# MCP Config
packages/api/srcbook_mcp_config.json

## PR scratch
packages/api/PR_MARKUP.md
4 changes: 4 additions & 0 deletions Dockerfile
@@ -8,6 +8,7 @@ COPY package.json pnpm-lock.yaml pnpm-workspace.yaml ./
COPY packages packages/
COPY srcbook srcbook/
COPY turbo.json ./
COPY srcbook_mcp_config.json ./

# Install dependencies
RUN pnpm install
@@ -18,6 +19,9 @@ RUN pnpm build
# Create necessary directories for volumes
RUN mkdir -p /root/.srcbook /root/.npm

# Set container environment variable
ENV CONTAINER=true

# Source code will be mounted at runtime
CMD [ "pnpm", "start" ]

137 changes: 137 additions & 0 deletions MCP_README.md
@@ -0,0 +1,137 @@
# Model Context Protocol (MCP) Integration

Srcbook now includes a [Model Context Protocol](https://modelcontextprotocol.io) (MCP) client, enabling secure, standardized interactions between Srcbook and external tools. Currently, MCP is integrated primarily with the AI-assisted app building functionality.

## Overview

MCP allows Srcbook to:
- Enhance AI code generation with sequential, o1-style thinking
- Access local files securely (when configured)
- Connect to various utility servers

> **Note**: MCP integration is currently focused on the app builder functionality, with notebook integration planned for future releases.

## Getting Started

### Prerequisites

- Srcbook running locally
- Node.js 20.x (important: Node 22.x runs into issues with Rollup and MCP)
- Access to your local filesystem

### Configuration

1. Navigate to `packages/api/` in your Srcbook installation directory
2. Locate `srcbook_mcp_config.example.json`
3. Create a new file named `srcbook_mcp_config.json` based on the example
4. Configure the filesystem paths:

```json
{
"mcpServers": {
"memory": {
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-memory"]
},
"filesystem": {
"command": "npx",
"args": [
"-y",
"@modelcontextprotocol/server-filesystem",
"/PATH/TO/YOUR/DESKTOP",
"/PATH/TO/YOUR/DOWNLOADS"
]
},
"puppeteer": {
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-puppeteer"]
},
"sequential-thinking": {
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-sequential-thinking"]
},
"everything": {
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-everything"]
},
"mcp-installer": {
"command": "npx",
"args": ["@anaisbetts/mcp-installer"]
}
}
}
```

> **Important**: Replace `/PATH/TO/YOUR/DESKTOP` and `/PATH/TO/YOUR/DOWNLOADS` with the actual paths to your Desktop and Downloads folders.

5. To ensure you're running Node 20.x, use the following commands in your terminal:

```bash
nvm install 20
nvm use 20
```
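If you want to sanity-check the config before launching Srcbook, the expected shape can be validated in a few lines. This is an illustrative sketch, not code shipped in this PR; the field names (`mcpServers`, `command`, `args`) come from the example config above.

```typescript
// Sketch: validate the shape of srcbook_mcp_config.json.
// Each server entry needs a string "command" and an "args" array.
type ServerEntry = { command: string; args: string[] };
type McpConfig = { mcpServers: Record<string, ServerEntry> };

function validateConfig(raw: string): McpConfig {
  const parsed = JSON.parse(raw) as McpConfig;
  if (!parsed.mcpServers || typeof parsed.mcpServers !== 'object') {
    throw new Error('missing "mcpServers" object');
  }
  for (const [name, entry] of Object.entries(parsed.mcpServers)) {
    if (typeof entry.command !== 'string' || !Array.isArray(entry.args)) {
      throw new Error(`server "${name}" needs a string "command" and an "args" array`);
    }
  }
  return parsed;
}

const example = `{"mcpServers":{"memory":{"command":"npx","args":["-y","@modelcontextprotocol/server-memory"]}}}`;
console.log(Object.keys(validateConfig(example).mcpServers)); // prints [ 'memory' ]
```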

## Available Servers

Srcbook comes with several pre-configured MCP servers that don't require API keys:

- **memory**: Basic memory operations ([source](https://github.com/modelcontextprotocol/servers/tree/main/src/memory))
- **filesystem**: Secure file system access ([source](https://github.com/modelcontextprotocol/servers/tree/main/src/filesystem)) - **Note**: This server is not operational in Srcbook yet.
- **puppeteer**: Browser automation capabilities ([source](https://github.com/modelcontextprotocol/servers/tree/main/src/puppeteer))
- **sequential-thinking**: Enhanced, o1-style reasoning ([source](https://github.com/modelcontextprotocol/servers/tree/main/src/sequentialthinking))
- **everything**: Test server for builders of MCP clients ([source](https://github.com/modelcontextprotocol/servers/tree/main/src/everything))
- **mcp-installer**: MCP server installation utility

## Using MCP in the App Builder

### Sequential Thinking

The primary MCP integration currently available is the sequential thinking feature in the app builder:

1. Open the app builder
2. Toggle sequential thinking on/off using the interface
3. When enabled, the AI code editing process will utilize the sequential-thinking server
4. You can verify server usage by checking your terminal output
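For reference, the request the app builder sends to the `sequentialthinking` tool carries four fields; the field names below are taken from `streamEditAppSequential` in `packages/api/ai/generate.mts`, while the `nextThought` helper is a hypothetical illustration of how a multi-step exchange could advance the counters:

```typescript
// Parameter shape sent to the sequential-thinking server in this PR.
interface ThoughtParams {
  thought: string;
  nextThoughtNeeded: boolean;
  thoughtNumber: number;
  totalThoughts: number;
}

// Hypothetical helper (not in the PR): advance to the next thought,
// switching nextThoughtNeeded off once the final step is reached.
function nextThought(prev: ThoughtParams, thought: string): ThoughtParams {
  const thoughtNumber = prev.thoughtNumber + 1;
  return {
    thought,
    thoughtNumber,
    totalThoughts: prev.totalThoughts,
    nextThoughtNeeded: thoughtNumber < prev.totalThoughts,
  };
}

const first: ThoughtParams = {
  thought: 'Analyze request from the user to see if we need more steps',
  nextThoughtNeeded: true,
  thoughtNumber: 1,
  totalThoughts: 2,
};
console.log(nextThought(first, 'Apply the edit plan').nextThoughtNeeded); // prints false
```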

## Troubleshooting

### Common Issues

1. **Server Configuration**
- Verify your `srcbook_mcp_config.json` exists and is properly formatted
- Check that filesystem paths are correct and accessible
- Ensure your Node.js version is 20.x (see Prerequisites)

2. **Sequential Thinking Issues**
- Check terminal output for server connection status
- Verify the server is properly installed via npx
- Restart Srcbook if server connection fails
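Two quick checks cover most of the issues above. This is a sketch that assumes `node` is on your PATH; to check your real config, read `packages/api/srcbook_mcp_config.json` instead of the inline example:

```shell
# Confirm the Node major version (this PR pins "engines" to 20.x)
node --version

# Confirm the config parses as JSON; to test your real file, replace the
# echo with:  node -e "..." < packages/api/srcbook_mcp_config.json
echo '{"mcpServers":{}}' | node -e "JSON.parse(require('fs').readFileSync(0,'utf8')); console.log('valid JSON')"
```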

### Known Limitations

1. **Notebook Integration**
- MCP is not currently integrated with notebook functionality
- Future releases will expand MCP support to notebooks

2. **File Access**
- Limited to the directories listed in `srcbook_mcp_config.json` (Desktop and Downloads in the example)
- Paths must be explicitly configured in the JSON config file

## Getting Help

- Join our [Discord Community](https://discord.gg/shDEGBSe2d)
- File issues on [GitHub](https://github.com/srcbookdev/srcbook)

## Future Development

Planned expansions of MCP functionality include:

1. Notebook integration for code cells
2. Additional server integrations
3. Enhanced file system capabilities
4. Expanded AI assistance features

## Contributing

We welcome contributions to improve MCP integration in Srcbook. Please check our [Contributing Guidelines](CONTRIBUTING.md) before submitting changes.
1 change: 1 addition & 0 deletions README.md
@@ -33,6 +33,7 @@ Srcbook is open-source (apache2) and runs locally on your machine. You'll need t
- Create, edit and run web apps
- Use AI to generate the boilerplate, modify the code, and fix things
- Edit the app with a hot-reloading web preview
- [Model Context Protocol (MCP)](MCP_README.md) integration for enhanced AI capabilities and secure file access

<picture>
<source media="(prefers-color-scheme: dark)" srcset="https://i.imgur.com/lLJPZOs.png">
5 changes: 4 additions & 1 deletion package.json
@@ -25,10 +25,13 @@
"typescript": "5.6.2"
},
"dependencies": {
"@modelcontextprotocol/sdk": "^1.1.0",
"drizzle-kit": "^0.24.2",
"electron": "^33.3.1",
"turbo": "^2.1.1"
},
"packageManager": "pnpm@9.12.1",
"engines": {
"node": ">=18"
"node": "20.x"
}
}
78 changes: 76 additions & 2 deletions packages/api/ai/generate.mts
@@ -14,6 +14,7 @@ import { PROMPTS_DIR } from '../constants.mjs';
import { encode, decodeCells } from '../srcmd.mjs';
import { buildProjectXml, type FileContent } from '../ai/app-parser.mjs';
import { logAppGeneration } from './logger.mjs';
import mcpHubInstance from '../mcp/mcphub.mjs';

const makeGenerateSrcbookSystemPrompt = () => {
return readFileSync(Path.join(PROMPTS_DIR, 'srcbook-generator.txt'), 'utf-8');
@@ -33,14 +34,17 @@
return readFileSync(Path.join(PROMPTS_DIR, 'app-editor.txt'), 'utf-8');
};

const makeAppEditorUserPrompt = (projectId: string, files: FileContent[], query: string) => {
const makeAppEditorUserPrompt = (projectId: string, files: FileContent[], query: string, sequentialthinking: {}) => {
const projectXml = buildProjectXml(files, projectId);
const userRequestXml = `<userRequest>${query}</userRequest>`;
const sequentialthinkingXml = `<sequentialthinking>${sequentialthinking}</sequentialthinking>`;
return `Following below are the project XML and the user request.

${projectXml}

${userRequestXml}

${sequentialthinkingXml}
`.trim();
};

@@ -267,7 +271,7 @@
const model = await getModel();

const systemPrompt = makeAppEditorSystemPrompt();
const userPrompt = makeAppEditorUserPrompt(projectId, files, query);
const userPrompt = makeAppEditorUserPrompt(projectId, files, query, '');

let response = '';

@@ -294,3 +298,73 @@

return result.textStream;
}

export async function streamEditAppSequential(
projectId: string,
files: FileContent[],
query: string,
appId: string,
planId: string,
mcpHub: typeof mcpHubInstance
) {
console.log('[MCP] Using sequential logic for editing app', appId, 'plan:', planId);

// 1. Call the MCP server to gather chain-of-thought or other specialized data.
const serverName = 'sequential-thinking';
const toolName = 'sequentialthinking';
const toolParams = {
thought: 'Analyze request from the user to see if we need more steps',
nextThoughtNeeded: true,
thoughtNumber: 1,
totalThoughts: 2
};

let sequentialthinking = ''; // default to an empty string so a failed tool call doesn't interpolate as [object Object]
console.log('[MCP] Invoking tool:', toolName, 'on server:', serverName, 'with params:', toolParams);
try {
const result = await mcpHub.callTool(serverName, toolName, toolParams);
console.log('[MCP] Tool invocation result:', JSON.stringify(result, null, 2));
const textBlock = result.content?.[0]?.text ?? 'No content returned from sequentialthinking tool.';
sequentialthinking = textBlock;
} catch (error) {
console.error(`[MCP] Error calling ${serverName} tool:`, error);
}

// 2. Build your systemPrompt, injecting chain-of-thought data from the MCP server:
const basePrompt = makeAppEditorSystemPrompt();
const systemPrompt = `You are a helpful AI that uses a sequential chain-of-thought approach.
Here is additional context from the sequential-thinking server:
${sequentialthinking}
${basePrompt}`;

// 3. Construct your user prompt
const userPrompt = makeAppEditorUserPrompt(projectId, files, query, sequentialthinking);

let response = '';

// 4. Continue with your usual streaming logic
const result = await streamText({
model: await getModel(),
system: systemPrompt,
prompt: userPrompt,
onChunk: (chunk) => {
if (chunk.chunk.type === 'text-delta') {
response += chunk.chunk.textDelta;
console.log('[MCP] Streaming response:', response);
}
},
onFinish: async () => {
if (process.env.SRCBOOK_DISABLE_ANALYTICS !== 'true') {
logAppGeneration({
appId,
planId,
llm_request: { model: await getModel(), system: systemPrompt, prompt: userPrompt },
llm_response: response,
});
}
},
});

// 5. Return the streaming body
return result.textStream;
}
39 changes: 39 additions & 0 deletions packages/api/ai/tool-execution.mts
@@ -0,0 +1,39 @@
import { MCPHub } from '../mcp/mcphub.mjs';
import type { CallToolResultSchema } from '@modelcontextprotocol/sdk/types.js';
import { CallToolRequestSchema } from '../mcp/types.mjs';
import { z } from 'zod';

export class ToolExecutionService {
constructor(private mcpHub: MCPHub) {}

async executeToolStream(request: z.infer<typeof CallToolRequestSchema>): Promise<ReadableStream> {
return new ReadableStream({
start: async (controller) => {
try {
const result: z.infer<typeof CallToolResultSchema> = await this.mcpHub.callTool(
request.serverName,
request.toolName,
request.params
);

controller.enqueue(
JSON.stringify({
type: 'result',
data: result,
}) + '\n'
);

controller.close();
} catch (error: any) {
controller.enqueue(
JSON.stringify({
type: 'error',
data: { message: error.message },
}) + '\n'
);
controller.error(error);
}
},
});
}
}
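The stream above emits newline-delimited JSON objects of shape `{type, data}`. A consumer might parse it like this — a sketch of client-side handling, not code from the PR:

```typescript
// Sketch: parse the newline-delimited JSON emitted by
// ToolExecutionService.executeToolStream ({type: 'result' | 'error', data}).
interface StreamEvent {
  type: 'result' | 'error';
  data: unknown;
}

function parseEvents(chunk: string): StreamEvent[] {
  return chunk
    .split('\n')
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line) as StreamEvent);
}

const raw = JSON.stringify({ type: 'result', data: { content: [] } }) + '\n';
console.log(parseEvents(raw)[0].type); // prints result
```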
1 change: 1 addition & 0 deletions packages/api/config.mts
@@ -23,6 +23,7 @@ async function init() {
aiConfig: { provider: 'openai', model: 'gpt-4o' } as const,
aiProvider: 'openai',
aiModel: 'gpt-4o',
mcpServers: {},
};
console.log();
console.log('Initializing application with the following configuration:\n');
1 change: 1 addition & 0 deletions packages/api/constants.mts
@@ -15,3 +15,4 @@ export const APPS_DIR = path.join(SRCBOOK_DIR, 'apps');
export const DIST_DIR = _dirname;
export const PROMPTS_DIR = path.join(DIST_DIR, 'prompts');
export const IS_PRODUCTION = process.env.NODE_ENV === 'production';
export const PROJECT_DIR = '@/srcbook_mcp_config.json';
2 changes: 1 addition & 1 deletion packages/api/db/index.mts
@@ -16,4 +16,4 @@ const DB_PATH = `${HOME_DIR}/.srcbook/srcbook.db`;
fs.mkdirSync(SRCBOOKS_DIR, { recursive: true });

export const db = drizzle(new Database(DB_PATH), { schema });
migrate(db, { migrationsFolder: drizzleFolder });
migrate(db, { migrationsFolder: drizzleFolder });