
A JetBrains extension providing access to state-of-the-art LLMs, such as GPT-4, Claude 3, Code Llama, and others, all for free
CodeGPT is your go-to AI coding assistant, offering assistance throughout your entire software development journey while keeping privacy in mind. Access state-of-the-art large language models from leading providers such as OpenAI, Anthropic, Azure, Mistral, and others, or connect to a locally hosted model for a completely offline and transparent development experience.
Leveraging large language models, CodeGPT offers a wide range of features to enhance your coding experience, including, but not limited to:
Receive single-line or whole-function autocomplete suggestions as you type.
Note: Currently supported only on GPT-3.5 and locally-hosted models.
Get instant coding advice through a ChatGPT-like interface. Ask questions, seek explanations, or get guidance on your projects without leaving your IDE.
CodeGPT also supports vision models and image understanding, allowing you to attach images for more context-aware assistance. It can detect new screenshots automatically, saving you time by eliminating the need to manually upload images each time you take a screenshot.
CodeGPT can generate meaningful commit messages based on the changes made in your codebase. It analyzes the diff of your staged changes and suggests concise and descriptive commit messages, saving you time and effort.
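In other words, the material it summarizes is the same staged diff you could review yourself before committing:
git diff --staged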
CodeGPT allows you to reference specific files or documentation during your chat sessions, ensuring that responses are always relevant and accurate.
Stuck on naming a method or variable? CodeGPT offers context-aware suggestions, helping you adhere to best practices and maintain readability in your codebase.
CodeGPT supports a completely offline development workflow by allowing you to connect to a locally hosted language model. This ensures that your code and data remain private and secure within your local environment, eliminating the need for an internet connection or sharing sensitive information with third-party servers.
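As one possible local setup (a sketch, not a plugin requirement: Ollama, the codellama model, and the default port 11434 are just illustrative choices), you could serve a model with Ollama and point the custom OpenAI-compatible provider at its local endpoint:
ollama pull codellama
ollama serve
curl http://localhost:11434/v1/chat/completions -H "Content-Type: application/json" -d '{"model": "codellama", "messages": [{"role": "user", "content": "Hello"}]}'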
- Download the Plugin
- Choose Your Preferred Service:
  a) OpenAI - Requires authentication via OpenAI API key.
  b) Azure - Requires authentication via Active Directory or API key.
  c) Custom OpenAI-compatible service - Choose between multiple providers, such as Together, Anyscale, Groq, Ollama, and many more (see the request sketch after this list).
  d) Anthropic - Requires authentication via API key.
  e) You.com - A free, web-connected service with an optional upgrade to You⚡Pro for enhanced features.
  f) LLaMA C/C++ Port - A reasonably powerful machine is recommended to handle the computational requirements of running inference.
     Note: Currently supported only on Linux and macOS.
- Start Using the Features
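The services in option c) all expose roughly the same OpenAI-style chat completions API, so switching providers mostly means changing the base URL, model name, and API key. A minimal sketch with placeholder values (none of these are defaults shipped by the plugin):
curl https://YOUR_PROVIDER_BASE_URL/v1/chat/completions \
  -H "Authorization: Bearer $PROVIDER_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "MODEL_NAME", "messages": [{"role": "user", "content": "Hello"}]}'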
The plugin is available from the JetBrains Marketplace. You can install it directly from your IDE via the File | Settings/Preferences | Plugins screen. On the Marketplace tab, simply search for codegpt and select the CodeGPT suggestion.
After successful installation, configure your API key. Navigate to the plugin's settings via File | Settings/Preferences | Tools | CodeGPT, paste your OpenAI API key into the field, and click Apply/OK.
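If you'd like to sanity-check the key before saving it, you can query the OpenAI API directly from a terminal (this assumes the key is exported as OPENAI_API_KEY; the call simply lists the models the key can access):
export OPENAI_API_KEY="sk-..."
curl https://api.openai.com/v1/models -H "Authorization: Bearer $OPENAI_API_KEY"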
For Azure OpenAI services, you'll need to input three additional fields:
- Resource name: The name of your Azure OpenAI Cognitive Services resource. It's the first part of the URL you're provided to use the service: "https://my-resource-name.openai.azure.com/". You can find it on your Azure Cognitive Services page, under Resource Management → Keys and Endpoints.
- Deployment ID: The name of your deployment. You can find it in Azure AI Studio, under Management → Deployments, in the Deployment Name column of the table.
- API version: The most recent non-preview version.
In addition to these, you need to input one of the two API keys provided, found alongside the Resource name.
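To see how these three values fit together, an Azure OpenAI chat completion request is addressed roughly like this (the resource name, deployment ID, and API version below are placeholders; substitute the ones from your own Azure resource):
curl "https://my-resource-name.openai.azure.com/openai/deployments/my-deployment-id/chat/completions?api-version=2024-02-01" \
  -H "Content-Type: application/json" \
  -H "api-key: $AZURE_OPENAI_API_KEY" \
  -d '{"messages": [{"role": "user", "content": "Hello"}]}'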
You.com is a search engine that summarizes the best parts of the internet for you, with private ads and privacy options.
You⚡Pro: Use the CodeGPT coupon for a free month of unlimited GPT-4 usage.
Check out the full feature list for more details.
Note: Currently supported only on Linux and macOS.
The main goal of llama.cpp is to run the LLaMA model using 4-bit integer quantization on a MacBook.
- Select the Model: Depending on your hardware capabilities, choose the appropriate model from the provided list. Once selected, click on the Download Model link. A progress bar will appear, indicating the download progress.
- Start the Server: After successfully downloading the model, initiate the server by clicking on the Start Server button. A status message will be displayed, indicating that the server is starting up.
- Apply Settings: With the server running, you can now apply the settings to start using the features. Click on the Apply/OK button to save your settings and start using the application.
Note: If you're already running a server and wish to configure the plugin against it, simply select the port and click Apply/OK.
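For reference, a manually started llama.cpp server might look roughly like this (the binary name, model file, and port are illustrative and depend on how you built llama.cpp and which model you downloaded):
./llama-server -m models/codellama-7b.Q4_K_M.gguf -c 4096 --port 8080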
Linux or macOS
git clone https://github.com/carlrobertoh/CodeGPT.git
cd CodeGPT
git submodule update --init
./gradlew runIde
Windows ARM64
./gradlew runIde -Penv=win-arm64
Tailing logs
tail -f build/idea-sandbox/system/log/idea.log
Your data stays yours. CodeGPT does not collect or store any kind of sensitive information.
However, with users' consent, we do collect anonymous usage data, which we use to understand how users interact with the extension, including the most-used features and preferred providers.
Your input helps us grow. Reach out through:
Apache 2.0 © Carl-Robert Linnupuu
If you found this project interesting, kindly rate it on the marketplace and don't forget to give it a star. Thanks again!