
Ollama local LLMs now work with Claude Code

Unproofread notes

Recently came across a post announcing that Ollama now has Anthropic API compatibility, which means you can now use Claude Code with local open-source LLMs. Here is an interesting reply to the same post.
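To make the compatibility part concrete, here is a minimal sketch of talking to a local Ollama server through the official Anthropic Python SDK. The port, the placeholder API key, and the model name (qwen3-coder) are my assumptions for a default local setup, not details taken from the announcement.

```python
import anthropic

# Sketch: point the Anthropic SDK at a local Ollama server.
# Assumes Ollama is running on its default port and exposes an
# Anthropic-compatible messages endpoint, per the announcement.
client = anthropic.Anthropic(
    base_url="http://localhost:11434",  # assumed default Ollama address
    api_key="ollama",                   # placeholder; a local server has no real key
)

message = client.messages.create(
    model="qwen3-coder",  # any model you have pulled locally (hypothetical name)
    max_tokens=256,
    messages=[{"role": "user", "content": "Write a haiku about local LLMs."}],
)

print(message.content[0].text)
```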

Apart from this, here is some more information about the Anthropic compatibility and Claude Code integration, and here is another blog post explaining how Claude Code works with the Anthropic-compatible API.
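On the Claude Code side, the integration seems to come down to pointing the CLI at the local server through its environment overrides. A rough sketch follows; the URL, token, and model name are placeholder assumptions for a default local setup, so check the linked posts for the exact values.

```python
import os
import subprocess

# Sketch: launch Claude Code against a local Ollama server.
# ANTHROPIC_BASE_URL, ANTHROPIC_AUTH_TOKEN, and ANTHROPIC_MODEL are
# Claude Code's environment overrides; the values below are assumptions.
env = os.environ.copy()
env["ANTHROPIC_BASE_URL"] = "http://localhost:11434"  # assumed local Ollama address
env["ANTHROPIC_AUTH_TOKEN"] = "ollama"                 # placeholder token
env["ANTHROPIC_MODEL"] = "qwen3-coder"                 # any locally pulled model (hypothetical)

subprocess.run(["claude"], env=env)
```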

Simon also tweeted about it, and I found this discussion on Reddit on the same topic.

I think it would be interesting to see how these open-source models perform with Claude Code. I am yet to try this myself.
