Priyanshu Jain
05/29/2025, 12:32 PM

Priyanshu Jain
05/29/2025, 12:33 PM
FATAL EXCEPTION: main
Process: com.test.app, PID: 8654
io.ktor.serialization.JsonConvertException: Illegal input: Fields [model, done] are required for type with serial name 'ai.koog.prompt.executor.ollama.client.dto.OllamaChatResponseDTO', but they were missing at path: $
at io.ktor.serialization.kotlinx.KotlinxSerializationConverter.deserialize(KotlinxSerializationConverter.kt:77)
at io.ktor.serialization.ContentConverterKt$deserialize$$inlined$map$1$2.emit(Emitters.kt:51)
at kotlinx.coroutines.flow.FlowKt__BuildersKt$asFlow$$inlined$unsafeFlow$3.collect(SafeCollector.common.kt:111)
at io.ktor.serialization.ContentConverterKt$deserialize$$inlined$map$1.collect(SafeCollector.common.kt:109)
at kotlinx.coroutines.flow.FlowKt__ReduceKt.firstOrNull(Reduce.kt:247)
at kotlinx.coroutines.flow.FlowKt.firstOrNull(Unknown Source:1)
at io.ktor.serialization.ContentConverterKt.deserialize(ContentConverter.kt:99)
at io.ktor.client.plugins.contentnegotiation.ContentNegotiationKt.ContentNegotiation$lambda$13$convertResponse(ContentNegotiation.kt:234)
at io.ktor.client.plugins.contentnegotiation.ContentNegotiationKt.access$ContentNegotiation$lambda$13$convertResponse(ContentNegotiation.kt:1)
at io.ktor.client.plugins.contentnegotiation.ContentNegotiationKt$ContentNegotiation$2$2.invokeSuspend(ContentNegotiation.kt:249)
at io.ktor.client.plugins.contentnegotiation.ContentNegotiationKt$ContentNegotiation$2$2.invoke(Unknown Source:19)
at io.ktor.client.plugins.contentnegotiation.ContentNegotiationKt$ContentNegotiation$2$2.invoke(Unknown Source:10)
at io.ktor.client.plugins.api.TransformResponseBodyHook$install$1.invokeSuspend(KtorCallContexts.kt:105)
at io.ktor.client.plugins.api.TransformResponseBodyHook$install$1.invoke(Unknown Source:11)
at io.ktor.client.plugins.api.TransformResponseBodyHook$install$1.invoke(Unknown Source:6)
at io.ktor.util.pipeline.DebugPipelineContext.proceedLoop(DebugPipelineContext.kt:79)
at io.ktor.util.pipeline.DebugPipelineContext.proceed(DebugPipelineContext.kt:57)
at io.ktor.client.HttpClient$4.invokeSuspend(HttpClient.kt:1379)
at io.ktor.client.HttpClient$4.invoke(Unknown Source:11)
at io.ktor.client.HttpClient$4.invoke(Unknown Source:6)
at io.ktor.util.pipeline.DebugPipelineContext.proceedLoop(DebugPipelineContext.kt:79)
at io.ktor.util.pipeline.DebugPipelineContext.proceed(DebugPipelineContext.kt:57)
at io.ktor.client.plugins.logging.ReceiveHook$Context.proceed(Logging.kt:290)
at io.ktor.client.plugins.logging.LoggingKt$Logging$2$3.invokeSuspend(Logging.kt:209)
at io.ktor.client.plugins.logging.LoggingKt$Logging$2$3.invoke(Unknown Source:13)
at io.ktor.client.plugins.logging.LoggingKt$Logging$2$3.invoke(Unknown Source:6)
at io.ktor.client.plugins.logging.ReceiveHook$install$1.invokeSuspend(Logging.kt:298)
at io.ktor.client.plugins.logging.ReceiveHook$install$1.invoke(Unknown Source:11)
at io.ktor.client.plugins.logging.ReceiveHook$install$1.invoke(Unknown Source:6)
at io.ktor.util.pipeline.DebugPipelineContext.proceedLoop(DebugPipelineContext.kt:79)
at io.ktor.util.pipeline.DebugPipelineContext.proceed(DebugPipelineContext.kt:57)
at io.ktor.client.plugins.ReceiveError$install$1.invokeSuspend(HttpCallValidator.kt:149)
at io.ktor.client.plugins.ReceiveError$install$1.invoke(Unknown Source:11)
at io.ktor.client.plugins.ReceiveError$install$1.invoke(Unknown Source:6)
at io.ktor.util.pipeline.DebugPipelineContext.proceedLoop(DebugPipelineContext.kt:79)
at io.ktor.util.pipeline.DebugPipelineContext.proceed(DebugPipelineContext.kt:57)
at io.ktor.util.pipeline.DebugPipelineContext.execute$ktor_utils(DebugPipelineContext.kt:63)
at io.ktor.util.pipeline.Pipeline.execute(Pipeline.kt:79)
at io.ktor.client.call.HttpClientCall.bodyNullable(HttpClientCall.kt:86)
at ai.koog.prompt.executor.ollama.client.OllamaClient.execute(OllamaClient.kt:171)
at ai.koog.prompt.executor.ollama.client.OllamaClient$execute$1.invokeSuspend(Unknown Source:15)
Priyanshu Jain
05/29/2025, 12:35 PM
class MainActivity : ComponentActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        enableEdgeToEdge()
        super.onCreate(savedInstanceState)
        setContent {
            val viewModel by viewModels<MainViewModel>()
            Box(modifier = Modifier.fillMaxSize()) {
                viewModel.runQuery("Hey!")
            }
        }
    }
}
And,
class MainViewModel : ViewModel() {
    private val agent = simpleSingleRunAgent(
        executor = simpleOllamaAIExecutor("http://10.0.2.2:11434"),
        systemPrompt = """
            You are Beacon, a helpful assistant.
        """,
        llmModel = OllamaModels.Meta.LLAMA_3_2
    )

    fun runQuery(query: String) {
        viewModelScope.launch {
            agent.runAndGetResult(query)
        }
    }
}
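One side note on the Activity above, separate from the crash: calling viewModel.runQuery("Hey!") directly in the composable body re-executes it on every recomposition. A minimal sketch of the conventional fix, assuming the same MainViewModel, wraps the call in LaunchedEffect so it runs once when the composable enters composition:

```kotlin
// Sketch, not the code from this thread: LaunchedEffect(Unit) runs its
// block a single time when this composable enters the composition,
// instead of on every recomposition.
setContent {
    val viewModel by viewModels<MainViewModel>()
    Box(modifier = Modifier.fillMaxSize()) {
        LaunchedEffect(Unit) {
            viewModel.runQuery("Hey!")
        }
    }
}
```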
I am using 10.0.2.2:11434 since I am testing this on an emulator and Ollama is running on the host system.

Vadim Briliantov
05/29/2025, 12:50 PM

Vadim Briliantov
05/29/2025, 1:17 PM
package ai.koog.agents.example.simpleapi
import ai.koog.agents.ext.agent.simpleSingleRunAgent
import ai.koog.prompt.executor.llms.all.simpleOllamaAIExecutor
import ai.koog.prompt.llm.OllamaModels
import kotlinx.coroutines.runBlocking

fun main() {
    val agent = simpleSingleRunAgent(
        executor = simpleOllamaAIExecutor("http://localhost:11434"),
        systemPrompt = """
            You are Beacon, a helpful assistant.
        """,
        llmModel = OllamaModels.Meta.LLAMA_3_2
    )

    runBlocking {
        println(agent.runAndGetResult("Hey!"))
    }
}
…
15:13:15.213 [main] INFO ai.koog.agents.core.agent.AIAgent - Sending final result [single_run, 23390380-b943-4945-adbe-9d05860c7dfa]
Hello! It's nice to meet you. Is there anything I can help you with or would you like to chat?
BUILD SUCCESSFUL in 3s
Also tried to run it from a separate repository with the published 0.1.0 version of Koog; it also works:
SLF4J(W): See https://www.slf4j.org/codes.html#noProviders for further details.
Hello! How can I assist you today? Do you have any questions or need help with something specific?
Process finished with exit code 0
Vadim Briliantov
05/29/2025, 1:17 PM

Vadim Briliantov
05/29/2025, 1:22 PM
╰─➤ ollama -v
ollama version is 0.6.8
Vadim Briliantov
05/29/2025, 1:24 PM
In ai.koog.prompt.executor.ollama.client.OllamaClient, before .body<OllamaChatResponseDTO>(), what would the response be if you evaluate the expression .bodyAsText() instead? Let's check what data your Ollama version responds with here:

Priyanshu Jain
05/29/2025, 1:46 PM

caelum19
05/29/2025, 1:55 PM

Priyanshu Jain
05/29/2025, 2:13 PM
{
  "error": "model \"llama3.2\" not found, try pulling it first"
}
This is because of how Ollama tags models when pulling. If I pull using
ollama pull llama3.2:3b
Ollama saves it as llama3.2:3b, but if I pull using
ollama pull llama3.2
it pulls the same model under the tag llama3.2:latest, after which this is what I get:
{
  "model": "llama3.2",
  "created_at": "2025-05-29T14:13:01.195511Z",
  "message": {
    "role": "assistant",
    "content": "Hello! How can I assist you today? Do you have any questions or topics you'd like to discuss? I'm all ears (or in this case, all text)!"
  },
  "done_reason": "stop",
  "done": true,
  "total_duration": 1240903250,
  "load_duration": 26085459,
  "prompt_eval_count": 38,
  "prompt_eval_duration": 545364750,
  "eval_count": 36,
  "eval_duration": 668195541
}
So, it was a tag issue 🤦♂️.
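For reference, the tag behavior described here can be checked directly from the CLI (assuming Ollama is installed locally; the comments reflect the behavior reported in this thread):

```shell
ollama pull llama3.2:3b   # stored under the explicit tag llama3.2:3b
ollama pull llama3.2      # stored under the default tag llama3.2:latest
ollama list               # shows which tags actually exist locally
```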
But this does mean that the error case isn't being deserialized properly, which causes a crash.
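A hedged sketch of how that error body could be tolerated at the DTO level, using kotlinx.serialization. The type and field names below mirror the DTO from the stack trace and Ollama's /api/chat response, but they are assumptions for illustration, not Koog's actual definitions: making the required fields optional and modeling error lets a client surface Ollama's message instead of throwing JsonConvertException at $.

```kotlin
import kotlinx.serialization.Serializable
import kotlinx.serialization.json.Json

// Hypothetical DTO: field names follow Ollama's /api/chat response, but
// `model` and `done` are optional here so an error body still decodes.
@Serializable
data class OllamaChatResponseSketch(
    val model: String? = null,
    val done: Boolean? = null,
    val error: String? = null,
)

private val json = Json { ignoreUnknownKeys = true }

// Decode the body, then turn Ollama's error payload into a readable failure
// instead of a deserialization crash on the missing required fields.
fun parseOllamaResponse(body: String): OllamaChatResponseSketch {
    val dto = json.decodeFromString(OllamaChatResponseSketch.serializer(), body)
    check(dto.error == null) { "Ollama returned an error: ${dto.error}" }
    return dto
}
```

An alternative design is a sealed hierarchy with a success and an error variant, but a single DTO with an optional error field matches the flat shape Ollama actually sends.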
05/29/2025, 2:17 PMVadim Briliantov
05/29/2025, 3:21 PM