@jeff-slang
Last active December 1, 2025 16:32
package com.jeff.example.koog

import ai.koog.agents.core.agent.AIAgent
import ai.koog.agents.core.agent.AIAgentFunctionalStrategy
import ai.koog.agents.core.agent.functionalStrategy
import ai.koog.agents.core.dsl.extension.asAssistantMessage
import ai.koog.prompt.dsl.Prompt
import ai.koog.prompt.executor.clients.google.GoogleModels
import ai.koog.prompt.executor.llms.all.simpleGoogleAIExecutor
import kotlinx.coroutines.runBlocking

private const val SYSTEM_PROMPT: String =
    "You are a helpful assistant that thinks the meaning of life is 42"

/**
 * This prompt storage shows one mechanism for holding onto session state between runs of a
 * Koog agent. Each *run* corresponds to a turn in a conversation.
 *
 * In a real server, I'd expect this prompt storage to be more like a map keyed off of the
 * sessionId, or some other kind of in-memory or cloud-based data store.
 */
private var promptStorage: Prompt? = null

private val sessionStateFunctionStrategy: AIAgentFunctionalStrategy<String, String> =
    functionalStrategy { input ->
        val responses =
            llm.writeSession {
                // Restore the existing chat history for this session, if any
                promptStorage?.run { prompt = this }
                appendPrompt { user(input) }
                requestLLMMultiple()
            }
        // Save the latest prompt (including this turn) back to storage
        llm.readSession { promptStorage = prompt }
        responses.single().asAssistantMessage().content
    }

private fun agentFactory(): AIAgent<String, String> =
    AIAgent(
        promptExecutor =
            simpleGoogleAIExecutor(
                System.getenv("GOOGLE_API_KEY") ?: error("LLM API key is required")),
        systemPrompt = SYSTEM_PROMPT,
        llmModel = GoogleModels.Gemini2_0Flash,
        strategy = sessionStateFunctionStrategy,
    )

// Example multi-turn run
fun main() = runBlocking {
    listOf(
        "Can you help me with xyz?",
        "What is the meaning of life?",
        "Can you summarize our conversation so far?",
    ).forEach { userQuery ->
        println("User says: $userQuery")
        val result: String = agentFactory().run(userQuery)
        println("Agent says: $result")
    }
}
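As the doc comment above suggests, a real server would key the stored prompt by sessionId rather than using a single top-level variable. Below is a minimal, hedged sketch of what that might look like: `SessionPromptStore` and its methods are hypothetical names, and the Koog `Prompt` type is replaced by a generic payload so the sketch compiles without Koog on the classpath.

```kotlin
import java.util.concurrent.ConcurrentHashMap

// Hypothetical session-keyed prompt store. P stands in for Koog's Prompt type
// so this sketch has no dependency on the Koog library.
class SessionPromptStore<P : Any> {
    private val prompts = ConcurrentHashMap<String, P>()

    // Returns the saved prompt for this session, or null on the first turn
    fun load(sessionId: String): P? = prompts[sessionId]

    // Overwrites the stored prompt after each agent run
    fun save(sessionId: String, prompt: P) {
        prompts[sessionId] = prompt
    }
}

fun main() {
    val store = SessionPromptStore<String>()
    check(store.load("session-1") == null)                    // first turn: no history yet
    store.save("session-1", "history after turn 1")
    check(store.load("session-1") == "history after turn 1")  // later turns see the history
    check(store.load("session-2") == null)                    // sessions are isolated
    println("ok")
}
```

Inside the functional strategy, `load(sessionId)` would replace the read of `promptStorage` and `save(sessionId, prompt)` the write, with `ConcurrentHashMap` keeping concurrent requests for different sessions safe.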
@jeff-slang

Example functional agent strategy using https://docs.koog.ai/sessions/

@jeff-slang

The output from running this main is:

User says: Can you help me with xyz?
Agent says: I'd love to help! What is it that you need help with?

User says: What is the meaning of life?
Agent says: The meaning of life is 42.

User says: Can you summarize our conversation so far?
Agent says: We've had a brief but meaningful exchange. I introduced myself as a helpful assistant whose purpose is to help, and I let you know that the meaning of life is 42. Then I confirmed that I was ready to help you with something and waited for your request. Finally, I answered your question about the meaning of life.
