Update dependency ai.koog:koog-agents to v0.3.0 #157


Open: wants to merge 1 commit into main

Conversation

renovate[bot] (Contributor) commented Jul 15, 2025

This PR contains the following updates:

Package: ai.koog:koog-agents
Change: 0.2.1 -> 0.3.0
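
For a project consuming this library, the update amounts to a one-line version bump. A sketch of the relevant Gradle Kotlin DSL block, assuming the dependency is declared directly (a version-catalog setup would change the `libs.versions.toml` entry instead):

```kotlin
dependencies {
    // before: implementation("ai.koog:koog-agents:0.2.1")
    implementation("ai.koog:koog-agents:0.3.0")
}
```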

Release Notes

JetBrains/koog (ai.koog:koog-agents)

v0.3.0

Compare Source

Published 15 Jul 2025

Major Features

  • Agent Persistency and Checkpoints: Save and restore agent state to local disk or memory, or easily integrate with
    any cloud storage or database. Agents can now roll back to any prior state on demand or automatically restore from
    the latest checkpoint (#​305)
  • Vector Document Storage: Store embeddings and documents in persistent storage for retrieval-augmented generation (
    RAG), with in-memory and local file implementations (#​272)
  • OpenTelemetry Support: Native integration with OpenTelemetry for unified tracing across AI agents (#​369, #​401,
    #​423, #​426)
  • Content Moderation: Built-in support for moderation models, enabling AI agents to automatically review and filter
    outputs for safety and compliance (#​395)
  • Parallel Node Execution: Parallelize different branches of your agent graph with a MapReduce-style API to speed up
    agent execution or to choose the best of the parallel attempts (#​220, #​404)
  • Spring Integration: Ready-to-use Spring Boot starter with auto-configured LLM clients and beans (#​334)
  • AWS Bedrock Support: Native support for Amazon Bedrock provider covering several crucial models and services (
    #​285, #​419)
  • WebAssembly Support: Full support for compiling AI agents to WebAssembly (WASM) for browser deployment (#​349)
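
The Parallel Node Execution feature above follows a MapReduce-style pattern: fan branches out concurrently, then reduce to the best result. A minimal, framework-free Kotlin sketch of that idea, where `bestOf`, `score`, and the branch lambdas are hypothetical names for illustration and not Koog's actual API:

```kotlin
import java.util.concurrent.Callable
import java.util.concurrent.Executors

// Run every branch concurrently ("map"), then keep the
// highest-scoring result ("reduce").
fun <T> bestOf(branches: List<() -> T>, score: (T) -> Int): T {
    require(branches.isNotEmpty()) { "need at least one branch" }
    val pool = Executors.newFixedThreadPool(branches.size)
    try {
        val futures = branches.map { branch -> pool.submit(Callable { branch() }) }
        return futures.map { it.get() }.maxByOrNull(score)!!
    } finally {
        pool.shutdown()
    }
}
```

In the release's joke-generation example, each branch would call a different LLM and the score function would rank the candidate outputs.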

Improvements

  • Multimodal Data Support: Seamlessly integrate and reason over diverse data types such as text, images, and audio (
    #​277)
  • Arbitrary Input/Output Types: More flexibility over how agents receive data and produce responses (#​326)
  • Improved History Compression: Enhanced fact-retrieval history compression for better context management (#​394,
    #​261)
  • ReAct Strategy: Built-in support for ReAct (Reasoning and Acting) agent strategy, enabling step-by-step reasoning
    and dynamic action taking (#​370)
  • Retry Component: Robust retry mechanism to enhance agent resilience (#​371)
  • Multiple Choice LLM Requests: Generate or evaluate responses using structured multiple-choice formats (#​260)
  • Azure OpenAI Integration: Support for Azure OpenAI services (#​352)
  • Ollama Enhancements: Native image input support for agents running with Ollama-backed models (#​250)
  • Customizable LLM in fact search: Support providing custom LLM for fact retrieval in the history (#​289)
  • Tool Execution Improvements: Better support for complex parameters in tool execution (#​299, #​310)
  • Agent Pipeline enhancements: More handlers and context available in AIAgentPipeline (#​263)
  • Default support for mixed tool calls and messages: Simple single-run strategy variants that handle multiple
    messages and parallel tool calls (#​344)
  • ResponseMetaInfo Enhancement: Add additionalInfo field to ResponseMetaInfo (#​367)
  • Subgraph Customization: Support custom LLModel and LLMParams in subgraphs, make nodeUpdatePrompt a
    pass-through node (#​354)
  • Attachments API simplification: Remove additional content builder from MessageContentBuilder, introduce
    TextContentBuilderBase (#​331)
  • Nullable MCP parameters: Added support for nullable MCP tool parameters (#​252)
  • ToolSet API enhancement: Add missing tools(ToolSet) convenience method for ToolRegistry builder (#​294)
  • Thinking support in Ollama: Add THINKING capability and its serialization for Ollama API 0.9 (#​248)
  • kotlinx.serialization version update: Update kotlinx-serialization version to 1.8.1
  • Host settings in FeatureMessageRemoteServer: Allow configuring custom host in FeatureMessageRemoteServer (#​256)
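
The Retry Component above is about resilience against transient failures. A generic, framework-free Kotlin sketch of the underlying pattern, where `retry` and its parameters are illustrative names rather than the koog-agents API:

```kotlin
// Call block up to `times` times, returning the first successful
// result; if every attempt throws, rethrow the last error.
fun <T> retry(times: Int, block: (attempt: Int) -> T): T {
    var lastError: Throwable? = null
    repeat(times) { attempt ->
        try {
            return block(attempt)   // success: return immediately
        } catch (e: Exception) {
            lastError = e           // failure: remember and try again
        }
    }
    throw lastError ?: IllegalArgumentException("times must be positive")
}
```

A real agent-facing version would typically add backoff between attempts and restrict which exception types count as retryable.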

Bug Fixes

  • Make CachedPromptExecutor and PromptCache timestamp-insensitive to enable correct caching (#​402)
  • Fix requestLLMWithoutTools generating tool calls (#​325)
  • Fix Ollama function schema generation from ToolDescriptor (#​313)
  • Fix OpenAI and OpenRouter clients to produce simple text user message when no attachments are present (#​392)
  • Fix input/output token counts for OpenAILLMClient (#​370)
  • Use the correct Ollama LLM provider for the Ollama llama4 model (#​314)
  • Fixed an issue where structured data examples were prompted incorrectly (#​325)
  • Correct mistaken model IDs in DEFAULT_ANTHROPIC_MODEL_VERSIONS_MAP (#​327)
  • Remove possibility of calling tools in structured LLM request (#​304)
  • Fix prompt update in subgraphWithTask (#​304)
  • Removed suspend modifier from LLMClient.executeStreaming (#​240)
  • Fix requestLLMWithoutTools to work properly across all providers (#​268)

Examples

  • W&B Weave Tracing example
  • LangFuse Tracing example
  • Moderation example: Moderating iterative joke-generation conversation
  • Parallel Nodes Execution example: Generating jokes using 3 different LLMs in parallel, and choosing the funniest one
  • Snapshot and Persistency example: Taking agent snapshots and restoring its state example

Configuration

📅 Schedule: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).

🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.

Rebasing: Whenever the PR becomes conflicted, or you tick the rebase/retry checkbox.

🔕 Ignore: Close this PR and you won't be reminded about this update again.


  • If you want to rebase/retry this PR, check this box

This PR was generated by Mend Renovate. View the repository job log.
