Running Open-Source AI Coding Assistants Aider and Claude Dev (2024)

John Maeda
4 min read · Sep 2, 2024


I tried out Open Source coding assistants Aider and Claude Dev. They’re both pretty cool, and this is how I got them running.

The frontend is exciting me again. Thanks, LLM AI!

I absolutely love GitHub Copilot. And I’ve been trying out Cursor (paid) to feel that energy as well. The open-source options have been getting cooler. So I tried them out today to figure out how to get started. It’s really easy!

To get Claude Dev running

Get the VS Code extension: search for “Claude Dev” in the Extensions view and install it.

Use the Cmd + Shift + P shortcut to open the command palette and type Claude Dev: Open In New Tab to start a new task right in the editor.

You’ll need an Anthropic API key to run this.

It felt kind of amazing when I saw it telling me my $$$ run rate.

To build Claude Dev yourself

If you want to run it from source, you can just:

git clone https://github.com/saoudrizwan/claude-dev.git

Then go into the directory and run

npm run install:all
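Putting those steps together (the F5 bit assumes the repo ships the usual VS Code extension debug config, which I haven’t verified):

```shell
# Clone the repo and install dependencies for the extension
git clone https://github.com/saoudrizwan/claude-dev.git
cd claude-dev
npm run install:all

# Open the folder in VS Code; pressing F5 should launch an
# Extension Development Host with Claude Dev loaded
# (assumption: the repo includes a standard launch.json)
code .
```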

Claude Dev only runs with Claude models. If you’re an OpenAI fan like me, you’ll think to yourself 🤔 …. And then you will turn your attention to Aider.

To get Aider running

Aider is for CLI-y people. That’s you. I think. Otherwise you wouldn’t still be reading this Medium post.

Get ready to pip …

I had to run pip which always gives me the shivers in case I don’t have the right version of Python or what-not. Gah.

python -m pip install aider-chat
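If pip gives you the shivers too, a throwaway virtual environment keeps aider’s dependencies away from your system Python (the path here is just an example):

```shell
# Create and activate an isolated environment (macOS/Linux)
python3 -m venv ~/.venvs/aider
source ~/.venvs/aider/bin/activate

# Install aider into that environment only
python -m pip install aider-chat

# Sanity check that the right aider is on your PATH
aider --version
```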

OMG it worked. And then I sat in my repo and added my OAI key.

export OPENAI_API_KEY=sk-...
aider

Alternatively I can enter my Anthropic key.

export ANTHROPIC_API_KEY=sk-...
aider

To choose which model you want to run with, here are the options for OAI:

# Aider uses gpt-4o by default (or use --4o)
aider

# GPT-4 Turbo (1106)
aider --4-turbo

# GPT-3.5 Turbo
aider --35-turbo

# List models available from OpenAI
aider --models openai/

Listing them out feels like comfort food.

You can do the same for Anthropic models.

These little shortcut flags like --4o and --sonnet are really cute.

[--opus] [--sonnet] [--4] [--4o] [--mini] [--4-turbo] [--35turbo] [--deepseek]
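Mirroring the OpenAI list above, the Anthropic shortcuts from that flag list look like this (the `anthropic/` listing pattern is my assumption by analogy with the `openai/` example):

```shell
# Claude 3.5 Sonnet
aider --sonnet

# Claude 3 Opus
aider --opus

# List models available from Anthropic
aider --models anthropic/
```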

First run of Aider in my repo

From within my terminal in VS Code I’m warned that my output won’t be as cool.

Cost-wise, my GPT-4o run cost less than Anthropic’s to explain my stuff to me. I can reduce the cost of Anthropic’s runs by turning on Aider’s prompt cache with aider --cache-prompts --no-stream.

From the Aider docs on caching this is neat:

Preventing cache expiration: Aider can ping the provider to keep your prompt cache warm and prevent it from expiring. By default, Anthropic keeps your cache for 5 minutes. Use --cache-keepalive-pings N to tell aider to ping every 5 minutes to keep the cache warm. Aider will ping up to N times over a period of N*5 minutes after each message you send.
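Combining the flags from that docs snippet, a cached Anthropic run that stays warm for roughly an hour (12 pings × 5 minutes) might look like:

```shell
export ANTHROPIC_API_KEY=sk-...

# Cache prompts, disable streaming (required for cost reporting
# with caching), and keep the cache warm with keepalive pings
aider --sonnet --cache-prompts --no-stream --cache-keepalive-pings 12
```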

You can also run Aider in a browser by typing:

aider --browser

My optimism started to wane when I ran into the typical complaint about not having the right pip version, etc.

These moments always get me nervous.

The browser window popped up, and I asked the same question: “What does this code do?”

This was taking a little too long for me.

Okay, that’s just how this thing works. It’s always running. You’ll need to manually add files for the context.
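Adding files happens with in-chat slash commands from the aider prompt (src/app.py is just a placeholder filename):

```text
> /add src/app.py      # put a file into the chat context
> /ls                  # list the files aider knows about
> /drop src/app.py     # remove it from context when done
```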

Cool! But it doesn’t tell me how much I’m $$$ spending.

In settings you’re able to have it hot-reload the repo context.

Next steps

Okay! I think I’ve Aider’d and Claude Dev’d enough for today. I hope you feel a little smarter and open-source-y! I didn’t run either of these with a local model, but I know I can if I want to stay longer. But I need to run back to the AI revolution. Cheers —JM


Written by John Maeda: Technologist and product experience leader who bridges business, engineering, and design by working inclusively. Currently VP Eng, AI Platform @ MSFT.
