# koog-agentic-framework
d
EDIT: Awesome - true native Structured Output support was merged two days ago here kodee loving kodee happy ❤️ 🚀 Redundant post about the lack of Structured Output support follows 🙃

I was surprised to see that
PromptExecutor.executeStructured
takes the naive approach of 'asking the LLM nicely' to output according to a JSON Schema - injecting it in text form into the user prompt. I can guess this approach was implemented first because it works in the general provider case, where we don't know the capabilities of the underlying inference engine. However, it is a weak approach for obvious reasons - LLMs will frequently add 'fluff' to their output, requiring fixing. ("Sure, here's your JSON!..." - fail 🙅 )

OpenAI endpoints (at least) support first-class specification of a JSON Schema, which is fed into the inference engine and puts hard grammatical constraints on the next token that can be predicted - completely robust and more performant (fewer input tokens, no retries, perhaps even faster next-token prediction with fewer choices). This OpenAI page has an excellent, detailed explanation of the problem and their first-class solution.

Bringing it back to K: support for this feature was added to the popular Kotlin library
aalam/openai-kotlin
here, which I previously tested as working against both OpenAI proper and the compatible endpoint offered by LM Studio for locally hosted models (yes, local inference engines also support constrained grammars properly! kodee happy). I hope the choice to derive a JSON Schema was made with adding this support in mind - but I can't see any mention of it in the GitHub issues kodee sad

💡 For me, this feature is the most important addition to Koog, above anything else. For programmatic use-cases, the difference in speed and accuracy with this feature enabled is night and day.
🙌 3
K 3
🙌🏽 1
❤️ 3
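For reference, OpenAI's native structured output is requested through the `response_format` field of the chat completions request body. A minimal sketch of the wire format (the `weather_report` schema here is an illustrative example, not something from this thread; `strict: true` and `"additionalProperties": false` are what enable the hard grammar constraints):

```json
{
  "model": "gpt-4o-2024-08-06",
  "messages": [
    { "role": "user", "content": "What's the weather in Oslo?" }
  ],
  "response_format": {
    "type": "json_schema",
    "json_schema": {
      "name": "weather_report",
      "strict": true,
      "schema": {
        "type": "object",
        "properties": {
          "city": { "type": "string" },
          "temp_c": { "type": "number" }
        },
        "required": ["city", "temp_c"],
        "additionalProperties": false
      }
    }
  }
}
```

With this in place, the schema constrains decoding inside the inference engine itself, so the response body's message content is guaranteed to be a single JSON object conforming to the schema.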
s
@Anastasiia Zarechneva The PR mentions that it includes an update to "OpenAI and Google clients to support native structured output". Any idea whether this just comes as standard on AWS Bedrock?
e
Awesome! This will help if we need to decode the JSON, for example to show a nice UI response in an agent app.
l
I took JSON Schema support as a given when I first saw the Koog announcement. I imagine we all want to be able to annotate our functions with some annotation that then automatically prepares the types and parses the response. This was so far missing in aalam's LLM library, and no one else seemed to tackle it. When Koog announced that they would handle everything LLM-related in the nice Kotlin way we all love, I was really excited; I thought that was surely what Koog is about. But after trying it out, I was a bit disappointed. It seems mostly focused on the "agent" part and seems to skip the basics. I'm curious, by the way: does anyone use agents in Koog? I wonder who asked for them. I don't mean to sound negative - I really hope Koog will be the best LLM library out there for Kotlin. I think agents, memory, and context-compaction strategies are important and great; we just need to also focus on simple usage first. K
💯 1
2
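To illustrate the "simple usage" being asked for above: once the model is guaranteed to emit schema-conforming JSON, the decoding side reduces to a plain kotlinx.serialization round-trip. A minimal sketch, assuming a hypothetical `WeatherReport` target type (this is illustrative, not Koog's API; it requires the kotlinx.serialization compiler plugin):

```kotlin
import kotlinx.serialization.Serializable
import kotlinx.serialization.json.Json

// Hypothetical target type. With constrained decoding on the server side,
// the model's output is guaranteed to parse into this shape, so no
// retry or "strip the fluff" logic is needed.
@Serializable
data class WeatherReport(val city: String, val tempC: Double)

fun main() {
    // Stand-in for a structured-output model response: pure JSON, no preamble.
    val raw = """{"city": "Oslo", "tempC": 4.5}"""
    val report = Json.decodeFromString<WeatherReport>(raw)
    println(report) // WeatherReport(city=Oslo, tempC=4.5)
}
```

The annotation-driven workflow the poster describes would essentially generate the JSON Schema from such a `@Serializable` class and attach it to the request automatically.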
d
⚠️ I definitely think the Structured Output merge (above) is worth a Koog
0.4.0
release all by itself - this is critical functionality for programmatic use-cases (most of them, I would have thought?). The Koog team knows priorities best, but from the outside, it'd be really nice to see this released soon! 🙏
❤️ 2
👌 1
o
As soon as I saw Koog, I tried playing with structured outputs, but after fiddling around in the source code and seeing the initial approach, I felt the immediate urgency to let the team know about this and opened the GitHub issue "Json Schema / Response Schema Support" - and the rest is history. Thanks to the great team, and specifically to @Andrey Bragin, for quickly opening a new branch and starting to work on a fix. I truly can't wait to use this great piece of functionality!
blob ty sign 2
🙌 2
K 5
🙌🏽 1
s
Ah darn, this is amazing!