This repository was archived by the owner on Jul 23, 2025. It is now read-only.

Commit 61c7023

danbarry and yrobla authored
Add docs for Aider support (#30)
Co-authored-by: Yolanda Robla Mota <[email protected]>
1 parent: 9d6985d

File tree

6 files changed: +219 −14 lines changed

docs/about/changelog.md

Lines changed: 7 additions & 0 deletions

@@ -11,6 +11,13 @@ Major features and changes are noted here. To review all updates, see the
 :::
 
+Related: [Upgrade CodeGate](../how-to/install.md#upgrade-codegate)
+
+- **Aider support** - 13 Jan, 2025\
+  CodeGate version 0.1.6 adds support for [Aider](https://aider.chat/), an LLM
+  pair programmer in your terminal. See the
+  [how-to guide](../how-to/use-with-aider.mdx) to learn more.
+
 - **Semantic versioning for container image** - 8 Jan, 2025\
   Starting with v0.1.4, the CodeGate container image is published with semantic
   version tags corresponding to

docs/about/faq.md

Lines changed: 4 additions & 3 deletions

@@ -10,7 +10,8 @@ sidebar_position: 10
 No, CodeGate works _with_ your AI code assistant, as a local intermediary
 between your client and the LLM it's communicating with.
 
-### Does CodeGate work with any plugins other than Copilot and Continue?
+### Does CodeGate work with any other IDE plugins or coding assistants?
 
-Currently, CodeGate works with GitHub Copilot and Continue. We are actively
-exploring additional integrations based on user feedback.
+We are actively exploring additional integrations based on user feedback.
+[Join the community on Discord](https://discord.gg/stacklok) to let us know
+about your favorite AI coding tool!

docs/how-to/install.md

Lines changed: 3 additions & 1 deletion

@@ -42,7 +42,7 @@ application settings, see [Configure CodeGate](./configure.md)
 ### Alternative run commands {#examples}
 
-Run with minimal functionality for use with **Continue**:
+Run with minimal functionality for use with **Continue** or **Aider**:
 
 ```bash
 docker run -d -p 8989:8989 -p 9090:9090 --restart unless-stopped ghcr.io/stacklok/codegate:latest
 ```

@@ -152,6 +152,7 @@ Now that CodeGate is running, proceed to configure your IDE integration.
 - [Use CodeGate with GitHub Copilot](./use-with-copilot.mdx)
 - [Use CodeGate with Continue](./use-with-continue.mdx)
+- [Use CodeGate with Aider](./use-with-aider.mdx)
 
 ## Remove CodeGate
 
@@ -160,3 +161,4 @@ integration:
 - [Remove CodeGate - GitHub Copilot](./use-with-copilot.mdx#remove-codegate)
 - [Remove CodeGate - Continue](./use-with-continue.mdx#remove-codegate)
+- [Remove CodeGate - Aider](./use-with-aider.mdx#remove-codegate)
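For the "persistent volume" case referenced by the removal steps, the minimal run command can be extended with a named volume. This is only a sketch: the volume name `codegate_volume` comes from the uninstall steps in this commit, while the in-container mount path `/app/codegate_volume` and the `RUN_CODEGATE` guard variable are assumptions for illustration.

```shell
# Sketch: the minimal run command with a named volume added so CodeGate's
# database survives container re-creation. The mount path is an assumption.
# Built as an array so it can be inspected; it only executes on opt-in.
cmd=(docker run -d --name codegate
  -p 8989:8989 -p 9090:9090
  -v codegate_volume:/app/codegate_volume
  --restart unless-stopped
  ghcr.io/stacklok/codegate:latest)
echo "${cmd[@]}"

# RUN_CODEGATE=1 is a hypothetical opt-in guard, not a CodeGate setting.
if [ "${RUN_CODEGATE:-0}" = "1" ]; then
  "${cmd[@]}"
fi
```

Printing the command before running it makes it easy to double-check the port and volume flags against your environment.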

docs/how-to/use-with-aider.mdx

Lines changed: 73 additions & 0 deletions (new file)

---
title: Use CodeGate with Aider
description: Configure Aider to use CodeGate
sidebar_label: Use with Aider
sidebar_position: 90
---

import AiderProviders from '../partials/_aider-providers.mdx';

[Aider](https://aider.chat/) is an open source AI coding assistant that lets you
pair program with LLMs in your terminal.

CodeGate works with the following AI model providers through Aider:

- Local / self-managed:
  - [Ollama](https://ollama.com/)
- Hosted:
  - [OpenAI](https://openai.com/api/)
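Both providers are reached through the same local CodeGate port; only the path suffix differs. A tiny helper, purely illustrative (not part of CodeGate or Aider), shows the URL pattern used throughout this guide:

```shell
# Illustrative only: build the CodeGate base URL for a provider.
# CodeGate's API listens on port 8989 and routes by path (/openai, /ollama).
codegate_base_url() {
  local provider="$1"
  local host="${2:-localhost}"
  local port="${3:-8989}"
  printf 'http://%s:%s/%s\n' "$host" "$port" "$provider"
}

codegate_base_url openai   # → http://localhost:8989/openai
codegate_base_url ollama   # → http://localhost:8989/ollama
```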
:::note

This guide assumes you have already installed Aider using their
[installation instructions](https://aider.chat/docs/install.html).

:::

## Configure Aider to use CodeGate

To configure Aider to send requests through CodeGate:

<AiderProviders />

## Verify configuration

To verify that you've successfully connected Aider to CodeGate, type
`/ask codegate-version` into the Aider chat in your terminal. You should receive
a response like "CodeGate version 0.1.0".

## Next steps

Learn more about CodeGate's features:

- [Access the dashboard](./dashboard.md)
- [CodeGate features](../features/index.mdx)

## Remove CodeGate

If you decide to stop using CodeGate, follow these steps to remove it and revert
your environment.

1. Stop Aider and unset the environment variables you set during the
   configuration process:

   **OpenAI:** `unset OPENAI_API_BASE` (macOS/Linux) or
   `setx OPENAI_API_BASE ""` (Windows)

   **Ollama:** `unset OLLAMA_API_BASE` (macOS/Linux) or
   `setx OLLAMA_API_BASE ""` (Windows)
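   Before re-launching, you can confirm the overrides are really gone. This is a quick sketch for macOS/Linux shells, not an official CodeGate step:

   ```shell
   # Sketch: report whether any CodeGate base-URL override is still set.
   check_unset() {
     local leftover=""
     for var in OPENAI_API_BASE OLLAMA_API_BASE; do
       if [ -n "$(printenv "$var")" ]; then
         leftover="$leftover $var"
       fi
     done
     if [ -n "$leftover" ]; then
       echo "still set:$leftover"
     else
       echo "clean"
     fi
   }

   check_unset
   ```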
1. Re-launch Aider.

1. Stop and remove the CodeGate container:

   ```bash
   docker stop codegate && docker rm codegate
   ```

1. If you launched CodeGate with a persistent volume, delete it to remove the
   CodeGate database and other files:

   ```bash
   docker volume rm codegate_volume
   ```

docs/index.md

Lines changed: 12 additions & 10 deletions

@@ -37,25 +37,27 @@ CodeGate supports several development environments and AI providers.
 AI coding assistants / IDEs:
 
-- **[GitHub Copilot](https://github.com/features/copilot)** with Visual Studio
-  Code
+- **[GitHub Copilot](./how-to/use-with-copilot.mdx)** with Visual Studio Code
+  (JetBrains coming soon!)
 
-- **[Continue](https://www.continue.dev/)** with Visual Studio Code and
+- **[Continue](./how-to/use-with-continue.mdx)** with Visual Studio Code and
   JetBrains IDEs
 
   CodeGate supports the following AI model providers with Continue:
 
   - Local / self-managed:
-    - [Ollama](https://ollama.com/)
-    - [llama.cpp](https://github.com/ggerganov/llama.cpp)
-    - [vLLM](https://docs.vllm.ai/en/latest/serving/openai_compatible_server.html)
+    - Ollama
+    - llama.cpp
+    - vLLM
   - Hosted:
-    - [OpenRouter](https://openrouter.ai/)
-    - [Anthropic](https://www.anthropic.com/api)
-    - [OpenAI](https://openai.com/api/)
+    - OpenRouter
+    - Anthropic
+    - OpenAI
+
+- **[Aider](./how-to/use-with-aider.mdx)** with Ollama and OpenAI
 
 As the project evolves, we plan to add support for more IDE assistants and AI
-models.
+model providers.
 
 ## How to get involved

docs/partials/_aider-providers.mdx

Lines changed: 120 additions & 0 deletions (new file)

import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';

<Tabs groupId="aider-provider">
<TabItem value="openai" label="OpenAI" default>

You need an [OpenAI API](https://openai.com/api/) account to use this provider.

Before you run Aider, set environment variables for your API key and set the
API base URL to CodeGate's API port. Alternatively, use one of Aider's other
[supported configuration methods](https://aider.chat/docs/config/api-keys.html)
to set the corresponding values.

<Tabs groupId="os">
<TabItem value="macos" label="macOS / Linux" default>

```bash
export OPENAI_API_KEY=<YOUR_API_KEY>
export OPENAI_API_BASE=http://localhost:8989/openai
```

:::note

To persist these variables, add them to your shell profile (e.g., `~/.bashrc` or
`~/.zshrc`).

:::

</TabItem>
<TabItem value="windows" label="Windows">

```bash
setx OPENAI_API_KEY <YOUR_API_KEY>
setx OPENAI_API_BASE http://localhost:8989/openai
```

:::note

Restart your shell after running `setx`.

:::

</TabItem>
</Tabs>

Replace `<YOUR_API_KEY>` with your
[OpenAI API key](https://platform.openai.com/api-keys).

Then run `aider` as normal. For more information, see the
[Aider docs for connecting to OpenAI](https://aider.chat/docs/llms/openai.html).
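As one way to persist those settings on macOS/Linux, the exports can be appended to a profile exactly once. This is an illustrative sketch, not an official step: the `add_codegate_exports` helper, the `PROFILE` variable, and the `~/.bashrc` default are all assumptions; the `<YOUR_API_KEY>` placeholder still needs to be replaced by hand.

```shell
# Sketch: append the CodeGate exports to a shell profile only if they are
# not already present. PROFILE defaults to ~/.bashrc; adjust for your shell.
# <YOUR_API_KEY> is written literally and must be edited afterward.
add_codegate_exports() {
  local profile="${PROFILE:-$HOME/.bashrc}"
  if ! grep -qs 'OPENAI_API_BASE=http://localhost:8989/openai' "$profile"; then
    {
      echo 'export OPENAI_API_KEY=<YOUR_API_KEY>'
      echo 'export OPENAI_API_BASE=http://localhost:8989/openai'
    } >> "$profile"
  fi
}
```

The `grep` guard keeps the function idempotent, so re-running it never duplicates the exports.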
</TabItem>
<TabItem value="ollama" label="Ollama">

You need Ollama installed on your local system with the server running
(`ollama serve`) to use this provider.

CodeGate connects to `http://host.docker.internal:11434` by default. If you
changed the default Ollama server port or want to connect to a remote Ollama
instance, launch CodeGate with the `CODEGATE_OLLAMA_URL` environment variable
set to the correct URL. See [Configure CodeGate](/how-to/configure.md).

Before you run Aider, set the Ollama base URL to CodeGate's API port using an
environment variable. Alternatively, use one of Aider's other
[supported configuration methods](https://aider.chat/docs/config/api-keys.html)
to set the corresponding values.

<Tabs groupId="os">
<TabItem value="macos" label="macOS / Linux" default>

```bash
export OLLAMA_API_BASE=http://localhost:8989/ollama
```

:::note

To persist this setting, add it to your shell profile (e.g., `~/.bashrc` or
`~/.zshrc`) or use one of Aider's other
[supported configuration methods](https://aider.chat/docs/config/api-keys.html).

:::

</TabItem>
<TabItem value="windows" label="Windows">

```bash
setx OLLAMA_API_BASE http://localhost:8989/ollama
```

:::note

Restart your shell after running `setx`.

:::

</TabItem>
</Tabs>

Then run Aider:

```bash
aider --model ollama/<MODEL_NAME>
```

Replace `<MODEL_NAME>` with the name of a coding model you have installed
locally using `ollama pull`.

We recommend the [Qwen2.5-Coder](https://ollama.com/library/qwen2.5-coder)
series of models. Our minimum recommendation for quality results is the 7
billion parameter (7B) version, `qwen2.5-coder:7b`.

This model balances performance and quality for typical systems with at least 4
CPU cores and 16GB of RAM. If you have more compute resources available, our
experimentation shows that larger models do yield better results.
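Putting the Ollama steps together as one script (a sketch: the `launch` array and `RUN_AIDER` opt-in guard are illustrative assumptions; `qwen2.5-coder:7b` is the minimum recommendation above):

```shell
# Sketch: route Aider's Ollama traffic through CodeGate and launch it with
# the recommended model. The command is built first so it can be inspected;
# RUN_AIDER=1 is a hypothetical opt-in guard, not an Aider setting.
export OLLAMA_API_BASE=http://localhost:8989/ollama
model=qwen2.5-coder:7b
launch=(aider --model "ollama/${model}")
echo "${launch[@]}"   # → aider --model ollama/qwen2.5-coder:7b

if [ "${RUN_AIDER:-0}" = "1" ]; then
  ollama pull "$model"
  "${launch[@]}"
fi
```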
For more information, see the
[Aider docs for connecting to Ollama](https://aider.chat/docs/llms/ollama.html).

</TabItem>
</Tabs>
