# koog-agentic-framework
d
If first impressions matter, my first with Koog was a fail. I just wanted to point it at my OpenAI-compatible endpoint (hosted by LM Studio) for inference, but the `LLModel` approach seems disappointingly inflexible... please just let me provide an OpenAI API endpoint URL, not only choose from a preset list of providers 😞 ...like just about every other AI tool in existence 🤷 Did I miss something?
...already lost me, going back to openai-kotlin
I just can't understand why you'd design it like that, hiding the important stuff like endpoint used. Really annoying dev experience TBH
kodee angry
Ok, so it's possible, but way less than obvious. The current model seems to 'hide' this: by trying to be too easy to use, it becomes super developer-unfriendly.
a
How would you suggest making it more developer friendly?
d
`simpleOpenAIExecutor(baseUrl: String = "<OpenAI Default>", apiToken: String)`
☝️ That would have worked for me: having the OpenAI executor optionally take a custom endpoint.
That's one step away from Koog's initial 'Hello World' prompt example, and it would have been within the API space I searched before getting upset.
I mean, it's possible I'm just having a bad day and Koog is fine, but I'd suggest a lot of devs will approach Koog with 'I'm running my own local OpenAI-compatible endpoint, let's hook it up', and unless I missed something the current docs & examples don't support that approach.
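To make the suggestion concrete, here's a minimal sketch of what that optional-`baseUrl` signature could look like. This is a hypothetical illustration, not Koog's actual API: the executor type is a stand-in stub, and the default URL is an assumption.

```kotlin
// Hypothetical sketch only — these names are stand-ins, not Koog's actual API.
// The idea: simpleOpenAIExecutor takes an optional baseUrl, defaulting to OpenAI's
// public endpoint, so pointing at a self-hosted OpenAI-compatible server
// (LM Studio, llama.cpp, vLLM, ...) becomes a one-argument change.

const val OPENAI_DEFAULT_BASE_URL = "https://api.openai.com/v1" // assumed default

// Stand-in for whatever executor type simpleOpenAIExecutor actually returns.
data class StubExecutor(val baseUrl: String, val apiToken: String)

fun simpleOpenAIExecutor(
    apiToken: String,
    baseUrl: String = OPENAI_DEFAULT_BASE_URL, // override for self-hosted endpoints
): StubExecutor = StubExecutor(baseUrl, apiToken)
```

With that shape, hosted OpenAI stays a one-liner (`simpleOpenAIExecutor(apiToken = "sk-...")`), while a local LM Studio server is just `simpleOpenAIExecutor(apiToken = "lm-studio", baseUrl = "http://localhost:1234/v1")`.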
a
It's in the API docs that you can construct an `OpenAILLMClient` with a `settings` object that contains your custom `baseUrl`:
https://api.koog.ai/prompt/prompt-executor/prompt-executor-clients/prompt-executor[…]rompt.executor.clients.openai/-open-a-i-l-l-m-client/index.html
But I agree it's not clearly documented, and supporting it in `simpleOpenAIExecutor` would be convenient.
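For anyone landing here, a sketch of that workaround based on a reading of the linked API docs. The import paths, class names, and parameter names (`OpenAIClientSettings`, `baseUrl`, `SingleLLMPromptExecutor`) are taken from the docs and may need adjusting against the current Koog version:

```kotlin
// Sketch based on the linked API reference — verify names against api.koog.ai.
import ai.koog.prompt.executor.clients.openai.OpenAILLMClient
import ai.koog.prompt.executor.clients.openai.OpenAIClientSettings
import ai.koog.prompt.executor.llms.SingleLLMPromptExecutor

// Point the OpenAI client at a self-hosted OpenAI-compatible server (e.g. LM Studio).
val client = OpenAILLMClient(
    apiKey = "lm-studio", // many local servers accept any non-empty key
    settings = OpenAIClientSettings(baseUrl = "http://localhost:1234"),
)

// Wrap the client in an executor to use it with agents/prompts.
val executor = SingleLLMPromptExecutor(client)
```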
👍 2
d
Thanks.
v
Hi! Thanks for sharing your feedback; we hope we can improve on this experience. We discussed it with the team and created an issue, https://youtrack.jetbrains.com/issue/KG-190, for adding a `baseUrl` param to all `simple*` executors, and we'll also add a dedicated documentation page describing how to connect to a self-hosted/custom LLM, covering all the possibilities. Please let us know if there's anything else you find missing or that could be improved; user opinions are valuable to us.
❤️ 1