JetBrains


<Note>This example uses IntelliJ; the same steps apply to other JetBrains IDEs (e.g., PyCharm).</Note>

Install

Install IntelliJ.

Usage with Ollama

<Note> To use **Ollama**, you will need a [JetBrains AI Subscription](https://www.jetbrains.com/ai-ides/buy/?section=personal&billing=yearly). </Note>
  1. In IntelliJ, click the chat icon located in the right sidebar
  2. Select the current model in the sidebar, then click **Set up Local Models**
  3. Under **Third Party AI Providers**, choose **Ollama**
  4. Confirm the Host URL is `http://localhost:11434`, then click **OK**
  5. Once connected, select a model under **Local models by Ollama**
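Before connecting, you can confirm from a terminal that the Ollama server is running at the default Host URL and that at least one model is available locally. This is a minimal sketch; `llama3.2` is only an example model name, and any model you have pulled will appear under Local models by Ollama.

```shell
# Check that the Ollama server is reachable at the default Host URL
curl http://localhost:11434/api/tags

# Pull a model so it shows up in the IDE's model list
# (llama3.2 is an example; substitute any model you prefer)
ollama pull llama3.2
```

If the `curl` request fails, start the server with `ollama serve` and retry before configuring the IDE.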