Multi-Provider Support

"Cloud or local. OpenAI or Ollama. Unlimited providers. Your keys, your choice."

Demo Coming Soon

Use Any AI Provider

Don't be locked into one provider. FrankenCoder works with any OpenAI-compatible API, plus native support for major providers:

Cloud Providers:

• OpenAI (GPT-4, GPT-4o, o1, etc.)

• Anthropic (Claude 3.5, Claude 3)

• Google (Gemini Pro, Gemini Ultra)

• xAI (Grok)

• Any OpenAI-compatible API

Local Providers:

• fc-inference (built-in)

• Ollama

• vLLM

• LM Studio

• Text Generation WebUI

• Any local OpenAI-compatible server
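
To make "OpenAI-compatible" concrete, here is a minimal sketch using the openai Python SDK rather than FrankenCoder itself: the same request shape works against OpenAI's cloud API or a local Ollama server, with only the base URL and key changing. The model names and placeholder key are illustrative.

```python
# Sketch using the openai Python SDK: one request shape, two very different backends.
from openai import OpenAI

cloud = OpenAI(api_key="sk-...")  # your own OpenAI key
local = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",                      # Ollama ignores the key, but the SDK requires one
)

for client, model in ((cloud, "gpt-4o"), (local, "llama3")):
    reply = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": "Say hello in one sentence."}],
    )
    print(reply.choices[0].message.content)
```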

Your API Keys

You bring your own API keys and use your own accounts with each provider. Your keys, your choice.

Per-Tab Model Selection

Different tasks call for different models: a heavyweight cloud model for a tricky refactor, a fast local model for a quick question. Select the model per conversation tab and mix and match freely.
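
A minimal sketch of the per-tab idea, not FrankenCoder's internals: each conversation tab carries its own provider and model, so switching models is a per-tab setting rather than a global one. The field names and tab titles below are made up.

```python
# Hypothetical per-tab settings: every tab knows which provider and model it uses.
from dataclasses import dataclass

@dataclass
class Tab:
    title: str
    provider: str  # which configured provider this tab talks to
    model: str     # which of that provider's models it uses

tabs = [
    Tab("Refactor backend", provider="openai",    model="gpt-4o"),
    Tab("Write tests",      provider="ollama",    model="llama3"),
    Tab("Explain code",     provider="anthropic", model="claude-3-5-sonnet"),
]
```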

FIFO Queue

Running multiple tabs with multiple models? The FIFO queue keeps things orderly: requests are dispatched in the order they arrive, so every tab gets its turn.
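
This is not FrankenCoder's implementation, just a sketch of the ordering guarantee: requests from any number of tabs go into one shared queue and come out in the order they were submitted.

```python
# Sketch of FIFO request scheduling: one shared queue, one worker,
# requests handled strictly in submission order.
import queue
import threading

requests = queue.Queue()  # first in, first out

def worker():
    while True:
        tab, model, prompt = requests.get()
        print(f"[{tab}] -> {model}: {prompt}")  # stand-in for the actual API call
        requests.task_done()

threading.Thread(target=worker, daemon=True).start()

# Three tabs, three different models; output order matches submission order.
requests.put(("tab-1", "gpt-4o", "Refactor this function"))
requests.put(("tab-2", "llama3", "Write unit tests for it"))
requests.put(("tab-3", "claude-3-5-sonnet", "Explain this stack trace"))
requests.join()  # wait until every queued request has been handled
```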

Unlimited Providers

Add as many providers as you want. There's no limit. Configure once, switch freely.
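
FrankenCoder's actual configuration format isn't shown here, so treat this as a hypothetical sketch of what "configure once" could look like: each provider is just a name, a base URL, and a key reference, and nothing caps how many entries you add.

```python
# Hypothetical provider registry -- names, endpoints, and env-var key references only.
providers = {
    "openai":    {"base_url": "https://api.openai.com/v1", "key_env": "OPENAI_API_KEY"},
    "anthropic": {"base_url": "https://api.anthropic.com", "key_env": "ANTHROPIC_API_KEY"},
    "ollama":    {"base_url": "http://localhost:11434/v1", "key_env": None},  # local, no key needed
    "vllm":      {"base_url": "http://localhost:8000/v1",  "key_env": None},  # local OpenAI-compatible server
}
# Switching providers is just picking a different entry; add as many as you like.
```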

"Your only rate limit is your imagination."

Ready for Provider Freedom?

Join the private beta and use any AI you want.

Join Private Beta
← Explore More Features