# koog-agentic-framework
m
Hi, is there a plan to make the LLMClient path configurable? e.g. https://github.com/JetBrains/koog/blob/317ab9c522c6b871fb4cab8f95004e72b1fbd168/pr[…]otlin/ai/koog/prompt/executor/clients/openai/OpenAILLMClient.kt The DEFAULT_MESSAGE_PATH is not overridable right now. Should I create an issue?
d
What's your use-case for changing the API endpoint? Currently you can already change the base URL via `settings.baseUrl`.
m
It is a custom Azure OpenAI company deployment that simply has a slightly different chat-completions endpoint
d
Ah, yes, you will have to open an issue for that. Or maybe you can ask your IT to set up the endpoints to match the standard? Because I believe you would have the same problem with every tool and framework that uses the standard endpoints, right?
m
In Spring AI, for example, you can configure that as well. I think the solution should be on the tooling side. But it might be an opportunity for me to contribute 🙂
👍 1
v
Hi! Thanks for your request. Sure, please feel free to create an issue for that, or open an MR. It can actually be a part of the
`private val settings: OpenAIClientSettings`
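Making the chat-completions path part of the client settings could look roughly like the sketch below. Note this is a hypothetical, simplified stand-in, not Koog's actual API: the `ClientSettings` class, the `chatCompletionsPath` parameter, and the `endpointUrl` helper are all illustrative names, with the default path kept as the current hard-coded value so existing users would be unaffected.

```kotlin
// Hypothetical stand-in for a client settings class (not Koog's real API).
// The default chatCompletionsPath mirrors today's fixed constant, so the
// change would be backward compatible; Azure-style deployments could
// override it.
data class ClientSettings(
    val baseUrl: String = "https://api.openai.com",
    val chatCompletionsPath: String = "/v1/chat/completions",
)

// Builds the full request URL, tolerating stray slashes on either side.
fun endpointUrl(settings: ClientSettings): String =
    settings.baseUrl.trimEnd('/') + "/" + settings.chatCompletionsPath.trimStart('/')
```

A caller with a non-standard deployment would then just pass a different path, e.g. `ClientSettings(baseUrl = "https://myco.example", chatCompletionsPath = "/openai/deployments/my-gpt/chat/completions")`.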
m
Hi, thanks Vadim, I was thinking the same 🙂
I opened a PR and created an issue.
🙏 1
K 1