darkmoon_uk
08/13/2025, 1:28 PM
PromptExecutor.executeStructured takes the naive approach of 'asking the LLM nicely' to output according to a JSON Schema, injecting it in text form into the user prompt.
I'd guess this approach was implemented first because it works in the general provider case, where we don't know the capabilities of the underlying inference engine.
However, it is a weak approach for obvious reasons: LLMs will frequently add 'fluff' to their output that then needs fixing. ("Sure, here's your JSON!..." - fail 🙅)
OpenAI endpoints (at least) support first-class specification of a JSON Schema, which is fed into the inference engine and puts hard grammatical constraints on the next token that can be predicted - completely robust and more performant (fewer input tokens, no retries, perhaps even faster next-token prediction given fewer choices). This OpenAI page has an excellent, detailed explanation of the problem and their first-class solution.
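To make that concrete, here's a minimal sketch of what the first-class request looks like on the wire. The response_format shape is OpenAI's documented API; the "weather" schema, model choice, and Ktor plumbing are just my illustrative assumptions:

```kotlin
import io.ktor.client.*
import io.ktor.client.request.*
import io.ktor.client.statement.*
import io.ktor.http.*
import kotlinx.coroutines.runBlocking

// Sketch: POST a chat completion whose output is constrained by a JSON Schema
// via the first-class response_format field - no schema text in the prompt.
// Note: strict mode requires every property in "required" and
// "additionalProperties": false.
fun main() = runBlocking {
    val client = HttpClient() // needs a Ktor engine on the classpath, e.g. ktor-client-cio
    val response = client.post("https://api.openai.com/v1/chat/completions") {
        header(HttpHeaders.Authorization, "Bearer ${System.getenv("OPENAI_API_KEY")}")
        contentType(ContentType.Application.Json)
        setBody(
            """
            {
              "model": "gpt-4o-2024-08-06",
              "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
              "response_format": {
                "type": "json_schema",
                "json_schema": {
                  "name": "weather",
                  "strict": true,
                  "schema": {
                    "type": "object",
                    "properties": {
                      "city":  {"type": "string"},
                      "tempC": {"type": "number"}
                    },
                    "required": ["city", "tempC"],
                    "additionalProperties": false
                  }
                }
              }
            }
            """.trimIndent()
        )
    }
    // The returned message content is guaranteed to parse against the schema.
    println(response.bodyAsText())
}
```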
Bringing it back to Koog: support for this feature was added to the popular Kotlin library aallam/openai-kotlin here, which I previously tested as working against both OpenAI proper and the compatible endpoint offered by LM Studio for locally hosted models (yes, local inference engines also support constrained grammar properly! :kodee-happy:).
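And here's roughly how I'd expect it to look through aallam/openai-kotlin. Treat the responseFormat construction as an assumption on my part - I'm going from memory, so check the linked PR for the exact type names:

```kotlin
import com.aallam.openai.api.chat.*
import com.aallam.openai.api.model.ModelId
import com.aallam.openai.client.OpenAI
import kotlinx.coroutines.runBlocking
import kotlinx.serialization.json.*

fun main() = runBlocking {
    val openAI = OpenAI(token = System.getenv("OPENAI_API_KEY"))

    // Same illustrative "weather" schema as above, built as a kotlinx JsonObject.
    val weatherSchema = buildJsonObject {
        put("type", "object")
        putJsonObject("properties") {
            putJsonObject("city") { put("type", "string") }
            putJsonObject("tempC") { put("type", "number") }
        }
        putJsonArray("required") { add("city"); add("tempC") }
        put("additionalProperties", false)
    }

    val request = ChatCompletionRequest(
        model = ModelId("gpt-4o-2024-08-06"),
        messages = listOf(
            ChatMessage(role = ChatRole.User, content = "What's the weather in Paris?")
        ),
        // ASSUMED API: the structured-output support added in the linked PR
        // exposes something along these lines; the actual builder/type names
        // may differ, so verify against the PR before copying this.
        responseFormat = ChatResponseFormat.jsonSchema(
            JsonSchema(name = "weather", schema = weatherSchema, strict = true)
        )
    )

    val completion = openAI.chatCompletion(request)
    println(completion.choices.first().message.content) // valid JSON per the schema
}
```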
I hope the choice of deriving a JSON Schema was done with adding this support in mind - but I can't see any mention of it in the GitHub Issues :kodee-sad:
💡 For me, this feature is the most important addition to Koog, above anything else. For programmatic use-cases, the difference in speed and accuracy with this feature enabled is night and day.

Sam
08/13/2025, 2:14 PM

Eduardo Ruesta
08/13/2025, 3:55 PM

Lukáš Kúšik
08/13/2025, 9:39 PM

darkmoon_uk
08/14/2025, 1:14 AM
Worth a 0.4.0 release all by itself - this is critical functionality for programmatic use-cases (most of them, I would have thought?).
The Koog team know their priorities best, but from the outside, it'd be really nice to see this released soon! 🙏

Ofek Teken
08/15/2025, 9:21 AM

Simone
08/15/2025, 10:26 AM