# koog-agentic-framework
d
Hey 👋, running `SingleLLMPromptExecutor(client).execute(prompt, model, listOf())` / `agent.runAndGetResult("some prompt")` with an `OpenAILLMClient` returns “Error from OpenAI API: 400 Bad Request: JSON format issue: expected string at field messages.content got instead array”. It appears that the user message is serialized as an object rather than as a string, unlike the system prompt. Is this by design, or am I using it incorrectly for a simple prompt?
f
Could you share more of the code surrounding the issue? I use `SingleLLMPromptExecutor` and `runAndGetResult` all the time with OpenAI with no issues, and I couldn't reproduce the bug with a minimal example based on the code you gave. Sharing the prompt would probably be most helpful.
```kotlin
val client = OpenAILLMClient(API_KEY)
val prompt = prompt("test") {
    user("Finish this sentence. I am a...")
}
val responses = SingleLLMPromptExecutor(client).execute(prompt, OpenAIModels.Chat.GPT4_1, emptyList())
println(responses)
```
d
Hi, thanks for the response. I think the problem might be that Koog expects the more advanced API that supports multimedia input. I am running against an OpenAI-compatible API hosted by IONOS that supports chat completions.
```kotlin
val client = OpenAILLMClient(apiKey = apiKey, settings = settings)
val prompt = prompt("test_prompt", LLMParams()) {
    user("What's the capital of Germany?")
}
val model = LLModel(LLMProvider.OpenAI, "meta-llama/Llama-3.3-70B-Instruct", listOf(LLMCapability.Completion))

val response = SingleLLMPromptExecutor(client).execute(prompt, model, listOf())
```
The default OpenAI chat completion API (https://platform.openai.com/docs/api-reference/chat/create) uses the messages format where `content` is just a string, but when I debug the request it becomes:
```json
"messages" : [ {
    "role" : "system",
    "content" : "...."
  }, {
    "role" : "user",
    "content" : [ {
      "text" : "...",
      "type" : "text"
    } ]
  } ]
```
I looked into the code and found that the system message becomes a `Content.Text`, unlike the user message, which becomes a `ContentPart` and ends up in a `content` array. I would expect it to use the plain messages format when I am just passing a text prompt.
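For comparison, this is the plain-string shape the chat completions endpoint accepts for text-only messages (field values are placeholders, not from the actual request):

```json
"messages" : [ {
    "role" : "system",
    "content" : "...."
  }, {
    "role" : "user",
    "content" : "..."
  } ]
```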
f
Ahh yeah, that makes sense... maybe try the Ollama client? I haven't tried this or looked at it yet, but I know Ollama supports nothing but text, so maybe its client has better support for converting to the older format?
d
The Ollama message conversion would be the correct one, but using Ollama is not an option for us. Since the official OpenAI docs support the default format for `v1/chat/completions`, I don't see any reason not to support that format for simple text prompts and reserve the richer format for media input etc. The only option for now is to abuse the system message for user input, as it's rendered with a content string instead of a list.
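A minimal sketch of that workaround, assuming the Koog prompt DSL exposes a `system()` builder alongside `user()` (names taken from the snippets above; untested against the IONOS endpoint):

```kotlin
// Workaround sketch: route the user's text through the system message,
// since system content is serialized as a plain string rather than a
// ContentPart array. This loses the role distinction, so treat it as a
// temporary hack until the serialization fix lands.
val prompt = prompt("workaround_prompt", LLMParams()) {
    system("What's the capital of Germany?") // user text sent as system content
}
val response = SingleLLMPromptExecutor(client).execute(prompt, model, listOf())
```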
a
Hi, thanks for noticing this. I see the cause of this behavior in our code; I'll create an issue to fix it.
d
Thank you for addressing this so quickly 🙂 I appreciate it and am looking forward to the fix.