Conversation

@ServeurpersoCom (Collaborator) commented Nov 25, 2025

Make sure to read the contributing guidelines before submitting a PR

  • multi-transport MCP client
  • full agentic orchestrator
  • isolated, idempotent singleton initialization
  • typed SSE client
  • normalized tool-call accumulation pipeline (see the sketch after this description)
  • integrated reasoning, timings, previews, and turn-limit handling
  • complete UI section for MCP configuration
  • dedicated controls for relevant parameters
  • opt-in ChatService integration that does not interfere with existing flows

TODO: tighter coupling with the UI for structured tool-call result rendering, including integrated display components and support for sending out-of-context images (persistence/storage still to be defined).
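For context, "tool-call accumulation" is the step that merges the partial tool-call deltas an OpenAI-compatible streaming endpoint emits into complete calls before dispatching them to an MCP server. Below is a minimal Python sketch of that general pattern; the PR's actual implementation is TypeScript inside the webui, and the simplified chunk shape here is illustrative, not its real data model.

```python
# Minimal sketch of the tool-call accumulation pattern used with
# OpenAI-compatible streaming APIs. The chunk shape is simplified for
# illustration; this is NOT the PR's actual TypeScript implementation.
def accumulate_tool_calls(chunks):
    """Merge streamed tool-call deltas into complete calls, keyed by index."""
    calls = {}
    for chunk in chunks:
        for delta in chunk.get("tool_calls", []):
            idx = delta["index"]
            call = calls.setdefault(idx, {"id": "", "name": "", "arguments": ""})
            if delta.get("id"):
                call["id"] = delta["id"]
            fn = delta.get("function", {})
            call["name"] += fn.get("name") or ""
            # Argument JSON arrives in fragments and is only parseable once
            # the stream for that call has finished.
            call["arguments"] += fn.get("arguments") or ""
    return [calls[i] for i in sorted(calls)]
```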

Demo video: llama-webui-mcp-client.mp4

@allozaur (Collaborator) commented Dec 1, 2025

@ServeurpersoCom this PR needs updating after #17470

@ServeurpersoCom marked this pull request as ready for review December 2, 2025 12:14
@github-actions bot added the server label Dec 2, 2025
@banyan-god commented

Can't wait for this to be merged. In addition, are there any other ways to add tool calls? Maybe like how custom GPTs do?

@ServeurpersoCom (Collaborator, Author) commented

To answer your question: MCP is flexible enough to cover pretty much any tool-calling scenario you can think of. You can build RAG frontends, agentic development sandboxes, query personal databases or local APIs, or even hook up stable-diffusion.cpp behind an MCP server: the protocol doesn't care what's on the other side.

That said, this implementation is a pure client-side approach (browser). If you want to reach MCP servers beyond 127.0.0.1, you'll need either a backend proxy or a small home server setup to handle CORS.
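If you go the proxy route, here is a rough sketch of a local CORS-forwarding proxy using aiohttp; the upstream address, ports, and permissive header policy are placeholders to adapt, not part of this PR.

```python
# Rough sketch of a CORS-forwarding proxy (assumes: pip install aiohttp).
# UPSTREAM and the listen port below are hypothetical examples.
import aiohttp
from aiohttp import web

UPSTREAM = "http://192.168.1.50:8000"  # hypothetical MCP server on your LAN

CORS_HEADERS = {
    "Access-Control-Allow-Origin": "*",
    "Access-Control-Allow-Methods": "GET, POST, DELETE, OPTIONS",
    "Access-Control-Allow-Headers": "*",
}
HOP_BY_HOP = {"connection", "transfer-encoding", "content-length", "keep-alive"}

async def proxy(request: web.Request) -> web.StreamResponse:
    # Answer CORS preflight requests directly.
    if request.method == "OPTIONS":
        return web.Response(headers=CORS_HEADERS)
    async with aiohttp.ClientSession() as session:
        async with session.request(
            request.method,
            UPSTREAM + request.path_qs,
            headers={k: v for k, v in request.headers.items()
                     if k.lower() not in HOP_BY_HOP | {"host"}},
            data=await request.read(),
        ) as upstream:
            headers = {k: v for k, v in upstream.headers.items()
                       if k.lower() not in HOP_BY_HOP}
            headers.update(CORS_HEADERS)
            resp = web.StreamResponse(status=upstream.status, headers=headers)
            await resp.prepare(request)
            # Stream the body chunk by chunk so SSE responses pass through live.
            async for chunk in upstream.content.iter_any():
                await resp.write(chunk)
            await resp.write_eof()
            return resp

app = web.Application()
app.router.add_route("*", "/{tail:.*}", proxy)
web.run_app(app, host="127.0.0.1", port=9000)
```

You'd run this next to the browser and point the webui's MCP server URL at http://127.0.0.1:9000.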

As for the merge timeline: I need to do a few more code review passes myself and validate the architecture with @allozaur before this is ready. Patience :)

@jacekpoplawski (Contributor) commented

Would it be a good idea to add some sample server to make it easy to start using the tools? (If I understand correctly, it could be a simple Python script?)

@ServeurpersoCom (Collaborator, Author) commented

Yes, we'll be able to do lots of fun things in different languages, including Python, and there are already plenty of ready-made MCP server examples on GitHub. Basically, anything that runs on 127.0.0.1 with Streamable-HTTP or WebSocket will work.
With a small Python script running locally, you can wire up pretty much anything an LLM might find useful.
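As a concrete starting point, here is a minimal sketch assuming the official MCP Python SDK (modelcontextprotocol/python-sdk, installed as the `mcp` package); the server name, tools, and port are illustrative, and the host/port settings follow that SDK's FastMCP options as I understand them.

```python
# Minimal local MCP server sketch (assumes: pip install "mcp[cli]").
# The tool names and port are illustrative examples only.
from datetime import datetime

from mcp.server.fastmcp import FastMCP

# Serve on 127.0.0.1 so the browser-based client in this PR can reach it.
mcp = FastMCP("demo-tools", host="127.0.0.1", port=8000)

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b

@mcp.tool()
def local_time() -> str:
    """Return the machine's current local time as an ISO-8601 string."""
    return datetime.now().isoformat(timespec="seconds")

if __name__ == "__main__":
    # Streamable-HTTP is one of the transports the new client supports.
    mcp.run(transport="streamable-http")
```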
