# http4k
g
Hi Team, I see the MCP Client in http4k currently doesn't have a way to connect to an LLM. Do I understand it right? What I understand is that it's currently only a means to call an MCP capability directly. Isn't that different from the standard MCP client, which takes natural language input from the user and uses an LLM to figure out the right tool to call?
d
Yes. The client is for connecting to and invoking MCP servers only. If you want to connect to an LLM then you need to use a library to do that (there are various connect modules, or you can use langchain4j with an http4k HTTP client plugged in).
When you say "MCP standard client" what exactly are you referring to?
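To make the distinction concrete, here is a minimal sketch of "direct" invocation. The `McpToolClient` interface below is hypothetical, made up purely for illustration rather than taken from the http4k API:

```kotlin
// Hypothetical interface, purely for illustration, not the actual http4k MCP client API.
interface McpToolClient {
    fun callTool(name: String, arguments: Map<String, Any?>): String
}

// "Direct" invocation: the calling code already knows which tool it wants and builds the
// arguments itself. No LLM is involved in deciding what to call.
fun fetchWeather(mcp: McpToolClient, city: String): String =
    mcp.callTool("get_weather", mapOf("city" to city))
```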
g
I am referring to MCP client query processing as per the spec (For Client Developers - Model Context Protocol). Do we have anything similar in http4k?
d
There is currently no bridge from LLM tool use to MCP. We are considering adding this mechanism though - we need to create a common shim for LLM requests/responses first, and then it should be pretty easy. Kind of like the Langchain4j Chat model.
👍🏼 1
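To give a rough idea of the bridge being described above, here is a sketch of the loop using hypothetical types. `ChatModel`, `ToolSpec`, `ToolCall` and `McpToolClient` are stand-ins for the "common shim" mentioned, not http4k or langchain4j APIs:

```kotlin
// All of these types are hypothetical stand-ins, sketched purely to show the shape of the loop.
data class ToolSpec(val name: String, val description: String)
data class ToolCall(val name: String, val arguments: Map<String, Any?>)

// A "common shim" for LLM requests/responses: given a prompt and the available tools,
// the model either answers directly or asks for a tool to be invoked.
interface ChatModel {
    fun complete(prompt: String, tools: List<ToolSpec>): Reply

    sealed interface Reply {
        data class Answer(val text: String) : Reply
        data class UseTool(val call: ToolCall) : Reply
    }
}

// Something that can list and invoke MCP tools (the earlier hypothetical McpToolClient,
// extended with tool listing).
interface McpToolClient {
    fun listTools(): List<ToolSpec>
    fun callTool(name: String, arguments: Map<String, Any?>): String
}

// The bridge: natural language in, the LLM picks a tool, the MCP client executes it,
// and the tool output is fed back to the LLM to produce the final answer.
// (No iteration limit or error handling in this sketch.)
fun ask(model: ChatModel, mcp: McpToolClient, question: String): String =
    when (val reply = model.complete(question, mcp.listTools())) {
        is ChatModel.Reply.Answer -> reply.text
        is ChatModel.Reply.UseTool -> {
            val output = mcp.callTool(reply.call.name, reply.call.arguments)
            ask(model, mcp, "$question\n\nTool ${reply.call.name} returned: $output")
        }
    }
```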
a
Related to this, @dave: https://www.http4k.org/ecosystem/http4k/reference/mcp/http4k-mcp-desktop <- this link is currently broken
d
@Arnab thanks - where is that link from? 🙂
d
Just an update on the MCP stuff. We've got quite a lot of ideas in the pipeline for this bridge and it's the next thing to work on immediately. In the meantime, the new version 6.12.0.0 of http4k (just released) has a couple of breaking changes:
1. The modules are now published under http4k-ai-mcp-sdk (instead of the old http4k-mcp-sdk). This is to align with broader changes we're going to be making to the project, consolidation of concepts etc.
2. In line with this, the package names have also changed: `org.http4k.mcp` is now `org.http4k.ai.mcp`. A simple "replace all" will fix the code right up.
:http4k: 1
👍🏼 1
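For anyone upgrading, the dependency side of the change looks roughly like this (a sketch in Gradle Kotlin DSL, assuming the standard org.http4k group id and the artifact name given in the message above):

```kotlin
// build.gradle.kts (sketch only)
dependencies {
    // pre-6.12.0.0 coordinate:
    // implementation("org.http4k:http4k-mcp-sdk:<old version>")

    // 6.12.0.0 onwards:
    implementation("org.http4k:http4k-ai-mcp-sdk:6.12.0.0")
}
```

In source, only the package prefix changes (`org.http4k.mcp` to `org.http4k.ai.mcp`), so a project-wide find-and-replace on imports is enough.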