Ofek Teken
06/28/2025, 10:28 PM
Regarding the json_schema response format that OpenAI and other providers document: libraries like LangChain4j essentially let you pass the response_format to these providers. As far as I can see, this isn't the current approach in Koog, which seems rather more error-prone to my understanding (hence the rising need for the fixingModel technique in PromptExecutor.executeStructured).
Is there any specific reason for not sending the response_format for models that support it, e.g. via Koog's OpenAIRequest (OpenAI response_schema reference)? I can only assume it would remove the need for fixingModel and similar techniques, allowing a reliable type-safe response for models that indeed support it.
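For reference, this is roughly what the raw request body looks like with OpenAI's structured outputs (a kotlinx.serialization sketch of the REST payload with a made-up example schema, not Koog's OpenAIRequest):

```kotlin
import kotlinx.serialization.json.*

// Rough shape of an OpenAI /v1/chat/completions request body using structured
// outputs. Field names follow the OpenAI REST docs; the weather schema itself
// is only an illustrative example.
val requestBody = buildJsonObject {
    put("model", "gpt-4o")
    putJsonArray("messages") {
        addJsonObject {
            put("role", "user")
            put("content", "What's the weather in Paris?")
        }
    }
    putJsonObject("response_format") {
        put("type", "json_schema")
        putJsonObject("json_schema") {
            put("name", "weather_report")
            put("strict", true) // ask the provider to enforce the schema
            putJsonObject("schema") {
                put("type", "object")
                putJsonObject("properties") {
                    putJsonObject("city") { put("type", "string") }
                    putJsonObject("temperatureCelsius") { put("type", "number") }
                }
                putJsonArray("required") {
                    add("city")
                    add("temperatureCelsius")
                }
                put("additionalProperties", false)
            }
        }
    }
}
```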
One more question, out of pure interest and because I didn't catch that in the awesome KotlinConf talk - was this project open-sourced after being used internally in JetBrains? If not, are there any future plans to use Koog in JetBrains products?
Boris Zubarev
06/29/2025, 6:22 PM
Ofek Teken
06/29/2025, 7:25 PM
Andrey Bragin
06/30/2025, 11:43 AM
> I can only assume it would remove the need for fixingModel and similar techniques, allowing a reliable type-safe response for models that indeed support it.
AFAIK (based on the experience of some of our colleagues) there's still a chance to get a malformed response, even in "strict mode", so it would be nice to keep fixingModel to offer more flexibility and reliability. And there's already PromptExecutor.executeStructuredOneShot, which offers a simplified approach without fixingModel, essentially assuming that the structured response will be valid after the first try.
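Roughly, the two entry points look like this (just a sketch - the method names are the ones discussed here, but the surrounding setup and parameter names such as structure, retries and fixingModel are assumptions and may differ from the actual API):

```kotlin
// Assumes promptExecutor, prompt, weatherForecastStructure and fixingModel were
// created earlier with Koog's builders; those names are placeholders.

// Resilient variant: if the main model returns malformed JSON, the raw response
// is handed to fixingModel, which tries to repair it to match the expected schema.
val forecast = promptExecutor.executeStructured(
    prompt = prompt,
    structure = weatherForecastStructure,
    retries = 3,
    fixingModel = fixingModel
)

// One-shot variant: assumes the structured response is valid on the first try,
// so no fixing/retry machinery is involved.
val forecastOneShot = promptExecutor.executeStructuredOneShot(
    prompt = prompt,
    structure = weatherForecastStructure
)
```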
Andrey Bragin
06/30/2025, 11:47 AM
> One more question, out of pure interest and because I didn't catch that in the awesome KotlinConf talk - was this project open-sourced after being used internally in JetBrains?
Yes, it was developed initially as an internal SDK to help us integrate AI features (AI agents especially) into our products.
Ofek Teken
06/30/2025, 3:19 PM
The fixingModel approach has its place and can keep being used in PromptExecutor.executeStructured, and the ones that "trust" these models may simply use PromptExecutor.executeStructuredOneShot as you've suggested.
Definitely waiting on this one 👀