Changelog

0.34.x

VS Code 1.89, New Cursor Prediction UI, Gemini 1.5 Flash, Copilot++ partial completion

  • Merges VS Code 1.89 into Cursor
  • New Cursor Prediction UI
  • Gemini 1.5 Flash is available in long-context mode
  • Accept partial completions with Copilot++
  • Better performance of Copilot++ on linter errors
  • Toggleable rerankers on codebase search
  • GPT-4o in Interpreter Mode

UPDATE (0.34.1-0.34.6): Fixes long-context models in the model toggle, an empty AI Review tab, Copilot++ preview bugs, the Mac icon size, and remote SSH issues.

0.33.x

Networking Stability, Command-K Autoselect

  • Stability: This build fixes a connection error that was consistently affecting some users. It should also improve Cursor's performance on spotty internet connections.
  • Command-K Autoselect: We've also added automatic selection for Command-K! This means you can now press Command-K, and it will automatically select the region you're working on, though you can still select manually if you prefer.

UPDATE (0.33.1-0.33.3): Fixes to settings toggles and Copilot++ diffbox performance, plus onboarding tweaks.

0.32.x

Improved Copilot++ UX, New GPT-4 Model

  • Copilot++ UX: Suggestion previews now have syntax highlighting, which we find makes it much easier to quickly understand the changes.
  • Cursor Help Pane (Beta): You can also ask Cursor about Cursor! The Cursor Help Pane has information about features, keyboard shortcuts, and much more. You can enable it in Settings > Beta.
  • New GPT-4 model: As of a couple of days ago, you can try out gpt-4-turbo-2024-04-09 in Cursor by toggling it on in Settings > Models.
  • .cursorrules: You can write down repo-level rules for the AI by creating a .cursorrules file in the root of your repository. You might use this to give context on what you're building, style guidelines, or info on commonly used methods (see the example after this list).
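
To illustrate, a minimal .cursorrules file might look like the sketch below. The contents are purely illustrative, not defaults that ship with Cursor, and the paths and helper names are made up for the example:

    # Project context
    This is a TypeScript monorepo: a REST API in apps/api and a React frontend in apps/web.

    # Style
    - Prefer functional React components and hooks over class components.
    - Use async/await rather than raw promise chains.

    # Commonly used helpers
    - Use the shared fetchJson helper in src/lib/http.ts for network calls instead of calling fetch directly.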

UPDATE (0.32.1-0.32.7): Fixes a performance issue with the new Copilot++ syntax highlighting, disables AI Notes by default, renames the main Copilot++ model to legacy, fixes Copilot++ being slower over SSH, and fixes issues with the Copilot++ preview box.

0.31.x

Long Context Chat Beta

  • Long Context Chat (Beta): This is a new experimental feature that lets you talk with lots of files! To enable it, head to Settings > Beta. Then, select "Long Context Chat" in the top right of a new chat and try @'ing a folder or the entire codebase.
  • Fixes: This release patches a bug where empty / partial responses are shown in chat.

UPDATE (0.31.1-0.31.3): Adds back in AI Review (alpha), fixes the "Cursor Settings" menu item, and fixes a bug where @web doesn't return a response.

0.30.x

Faster Copilot++, Claude

  • Faster Copilot++: We've made Copilot++ ~2x faster! This speed bump comes from a new model and faster inference. ~50% of users are already on this model, and it will roll out to everyone over a few days. If you'd like to enable the new model immediately, you can switch models in the bottom bar of the editor.
  • Stable Claude Support: All the newest Claude models are available for Pro and API key users. Head to Settings > Models to toggle them on. Pro users get 10 requests / day for free and can keep using Claude at API-key prices for subsequent requests.
  • Team invites: We made it a bit easier for you to invite your colleagues to your Cursor team. You can send these from the editor's settings or at cursor.com/settings.
  • Admin improvements: Team admins can now mark themselves as unpaid users and can see the last time team members used the product.
  • New Settings: We moved all our settings to be accessible via the gear icon in the top right. No more "More" tab!