# koog-agentic-framework
Hey guys! Question on the new response tokens support. I'm trying to implement an observability feature to log/categorize all of my LLM requests. I keep getting null for `inputTokensCount` and `outputTokensCount` using the `OpenAIModels.CostOptimized` model family. Is there any flag I need to toggle to have the LLMClient return the tokens?
I didn't see anything in the PR for the feature, but I'm not understanding why I don't get tokens for any of these responses. Happy to share the full …
```kotlin
pipeline.interceptAfterLLMCall(this, featureImpl) { prompt, tools, model, responses, sessionId ->
    val records = responses.map {
        // Fall back to 0 when the client didn't report usage for a response
        val inputTokens = it.metaInfo.inputTokensCount ?: run {
            logger.warn { "No inputTokens found for usage record." }
            0
        }
        val outputTokens = it.metaInfo.outputTokensCount ?: run {
            logger.warn { "No outputTokens found for usage record." }
            0
        }

        UsageRecord(
            userId = config.userId!!,
            sessionId = sessionId.toString(),
            featureUsed = config.feature!!,
            inputTokens = inputTokens,
            outputTokens = outputTokens,
            modelName = model.id,
            estimatedCost = null,
            timestamp = it.metaInfo.timestamp
        )
    }
    config.handleRecords(records)
}
```
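Side note on the `estimatedCost = null` field above: once the token counts do come through, a small lookup keyed by `model.id` is enough to fill it in. A minimal sketch, where the price table and its values are placeholders rather than real provider rates:

```kotlin
// Hypothetical helper: estimate request cost from token counts.
// Prices are per million tokens and are placeholder values only;
// substitute your provider's actual rates.
val pricesPerMTok: Map<String, Pair<Double, Double>> = mapOf(
    "gpt-4o-mini" to (0.15 to 0.60) // (input USD, output USD) -- placeholders
)

fun estimateCost(modelId: String, inputTokens: Int, outputTokens: Int): Double? =
    pricesPerMTok[modelId]?.let { (inputPrice, outputPrice) ->
        (inputTokens * inputPrice + outputTokens * outputPrice) / 1_000_000
    }
```

Then the record construction can use `estimatedCost = estimateCost(model.id, inputTokens, outputTokens)`, which stays null for models missing from the table.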