All notable changes to this project will be documented in this file.
The format is based on Keep a Changelog, adheres to Semantic Versioning, and is generated by Changie.
- Tool use
- Support for Model Context Protocol
- Hotfix: context providers no longer hidden when not in edit mode
- Hotfix for Ollama onboarding
- OpenAI predicted outputs support
- Improve codebase retrieval with BM25
- Support for Grok from xAI
- Chat enhancements, including pinning the input box to the bottom
- New UI for cmd+I in sidebar
- Support for Nebius LLM provider
- Support for Ask Sage LLM provider
- Improved reference for config.json
- New @web context provider
- Updates for llama3.2
- .continuerules file to set system prompt
- .prompt files v2
- Dedicated UI for docs indexing
- Clickable code symbols in chat
- Use clipboard content as autocomplete context
- Improved @docs crawler
- Many improvements to make autocomplete more eager
- Near complete type definition retrieval for TypeScript autocomplete
- Remove footer from chat sidebar
- Brought back the Apply button for all code blocks
- Automatically update codebase index on removed files
- New Edit mode in sidebar (cmd/ctrl+I)
- Significantly faster and more accurate docs crawler by default
- Web context provider
- Cerebras inference provider
- Automatic descriptions for previous chats
- Discord context provider
- Improved full screen UI
- Easier way to accept/reject/re-prompt after cmd/ctrl+I
- Hotfix: throttle transformers.js embeddings provider
- Improved loading/accept/reject UI for apply
- Polished chat sidebar, especially context dropdown
- Further prompt caching with Anthropic
- Updated tutorial file
- Improved styling on "More" page
- Fixed an auth bug in Continue for Teams
- Fixed a number of apply bugs
- Fixed autoscrolling behavior
- Use Chromium only as a fallback after asking user
- Redesigned onboarding flow
- Fixed CRLF bug causing diff streams to treat every line as changed on Windows
- Hotfix for ability to use more than one inline context provider
- Hotfix: submenu context providers
- Improved indexing progress UI
- Improved @codebase using repomap
- Repo map context provider
- Many small UI improvements
- Fixed `db.search is not a function` error
- Use headless browser for crawling to get better results
- TTS support in the chat window
- Improved support for WatsonX models
- Fixed several small indexing bugs
- New /onboard slash command
- Fixed problem loading config.ts
- Fixed bug causing duplicate indexing work
- Support for Llama 3.1 and gpt-4o-mini
- Support for WatsonX+Granite models
- Significant improvements to indexing performance
- Improved @codebase quality by more accurately searching over file names and paths
- Improved @codebase accuracy
- Further improvements to indexing performance
- Improved docs indexing and management
- Fixed Gemini embeddings provider
- Improved indexing reliability and testing
- Quick Actions: use CodeLens to quickly take common actions like adding docstrings
- Support for Gemini 1.5 Pro
- Link to code in the sidebar when using codebase retrieval
- Smoother onboarding experience
- .prompt files, a way of saving and sharing slash commands
- Support for Claude 3.5 Sonnet, Deepseek Coder v2, and other new models
- Support for comments in config.json
- Specify multiple autocomplete models and switch between them
- Improved bracket matching strategy reduces noisy completions
- Numerous reliability upgrades to codebase indexing
- Support for improved retrieval models (Voyage embeddings/reranking)
- New @code context provider
- Personal usage analytics
- Tab-autocomplete in beta
- Image support
- Full-text search index for retrieval
- Docs context provider
- CodeLlama-70b support
- config.ts now runs only in Node.js, not in the browser
- Fixed proxy setting in config.json
- Added codellama and gemini to the free trial, using the new server
- Local codebase syncing and embeddings using LanceDB
- Improved VS Code theme matching
- Updates to packaging to download native modules for current platform (lancedb, sqlite, onnxruntime, tree-sitter wasms)
- Context providers now run from the extension side (in Node.js instead of browser JavaScript)
- disableSessionTitles option in config.json
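As a sketch of how the new option is used (exact placement in the schema may differ), `disableSessionTitles` is set as a top-level boolean in config.json:

```json
{
  "models": [
    {
      "title": "GPT-4o",
      "provider": "openai",
      "model": "gpt-4o"
    }
  ],
  "disableSessionTitles": true
}
```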
- Use Ollama /chat endpoint instead of raw completions by default, and /show endpoint to gather model parameters like context length and stop tokens
- Support for .continuerc.json in the root of the workspace to override config.json
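For example, a workspace-level .continuerc.json can override individual config.json settings for just that workspace; the key shown here (`disableIndexing`) is illustrative:

```json
{
  "disableIndexing": true
}
```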
- Inline context providers
- cmd+shift+L with new diff streaming UI for edits
- Allow certain LLM servers to handle templating
- Context items are now kept around as a part of past messages, instead of staying at the main input
- No more Python server; Continue now runs entirely in TypeScript
- Migrated to .json config file format
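A minimal config.json under the new format might look like the following (the model name and provider are illustrative, not a required default):

```json
{
  "models": [
    {
      "title": "CodeLlama",
      "provider": "ollama",
      "model": "codellama"
    }
  ]
}
```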
- Full screen mode
- StackOverflow slash command to augment with web search
- VS Code context menus: right click to add code to context, debug the terminal, or share your Continue session
- Reliability improvements to JetBrains by bringing up-to-date with the socket.io refactor
- Codebase Retrieval: Use /codebase or cmd+enter and Continue will automatically gather the most important context
- Switch from Websockets to Socket.io