"Cloud or local. OpenAI or Ollama. Unlimited providers. Your keys, your choice."
Don't be locked into one provider. FrankenCoder works with any OpenAI-compatible API, plus native support for major providers (a quick client sketch follows the lists below):
Cloud Providers:
• OpenAI (GPT-4, GPT-4o, o1, etc.)
• Anthropic (Claude 3.5, Claude 3)
• Google (Gemini Pro, Gemini Ultra)
• xAI (Grok)
• Any OpenAI-compatible API
Local Providers:
• fc-inference (built-in)
• Ollama
• vLLM
• LM Studio
• Text Generation WebUI
• Any local OpenAI-compatible server
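Because cloud and local backends all speak the same OpenAI-compatible protocol, switching is mostly a matter of changing the base URL and API key. Here is a minimal sketch using the openai Python package; the model names and the local Ollama endpoint are illustrative examples, not FrankenCoder internals:

```python
from openai import OpenAI

# Cloud: the official OpenAI endpoint, with your own key.
cloud = OpenAI(api_key="sk-your-key")

# Local: Ollama serves an OpenAI-compatible API under /v1.
local = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

for client, model in [(cloud, "gpt-4o"), (local, "llama3.1")]:
    reply = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": "Write a haiku about version control."}],
    )
    print(reply.choices[0].message.content)
```

The calling code is identical either way; only the endpoint and the key change.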
You bring your own API keys: you choose the providers, and you control the spend.
Different tasks call for different models, so you select the model per conversation tab and mix and match freely.
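As a rough illustration of the idea (not FrankenCoder's actual code), per-tab routing boils down to each tab carrying its own client and model:

```python
from openai import OpenAI

# Hypothetical tab-to-model mapping: a heavyweight cloud model for one task,
# a fast local model for another.
TABS = {
    "refactor": (OpenAI(api_key="sk-your-key"), "gpt-4o"),
    "autocomplete": (OpenAI(base_url="http://localhost:11434/v1", api_key="ollama"), "llama3.1"),
}

def ask(tab: str, prompt: str) -> str:
    client, model = TABS[tab]
    reply = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return reply.choices[0].message.content
```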
Running multiple tabs with multiple models? The FIFO queue keeps things orderly: requests are processed in the order they arrive.
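To picture what that means (an illustrative sketch only, not FrankenCoder's implementation): every tab drops its request into one first-in, first-out queue, and a worker sends them to the providers in arrival order.

```python
import queue
import threading

requests = queue.Queue()  # FIFO: the first request in is the first request sent

def worker():
    while True:
        tab, model, prompt = requests.get()  # blocks until a request is queued
        print(f"[{tab}] -> {model}: {prompt}")
        # ...call the provider here, then hand the reply back to the tab...
        requests.task_done()

threading.Thread(target=worker, daemon=True).start()

# Several tabs, several models, one ordered queue.
requests.put(("tab-1", "gpt-4o", "Explain this stack trace"))
requests.put(("tab-2", "llama3.1", "Autocomplete this function"))
requests.put(("tab-3", "claude-3-5-sonnet", "Review my diff"))
requests.join()  # returns once every queued request has been handled
```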
Add as many providers as you want. There's no limit. Configure once, switch freely.
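In spirit, a provider list is just named endpoints plus your own keys, so adding a fifth or fiftieth entry costs nothing. A hypothetical sketch (FrankenCoder's real configuration format isn't shown here):

```python
import os
from openai import OpenAI

# Hypothetical registry: provider name -> (base URL, env var holding your key).
PROVIDERS = {
    "openai": ("https://api.openai.com/v1", "OPENAI_API_KEY"),
    "ollama": ("http://localhost:11434/v1", None),   # local, no key required
    "my-vllm": ("http://localhost:8000/v1", None),   # vLLM's OpenAI-compatible server
    # ...add as many entries as you like; nothing limits the count.
}

def client_for(name: str) -> OpenAI:
    base_url, key_var = PROVIDERS[name]
    api_key = os.environ.get(key_var, "") if key_var else "not-needed"
    return OpenAI(base_url=base_url, api_key=api_key)
```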
"Your only rate limit is your imagination."
Join the private beta and use any AI you want.
Join Private Beta
← Explore More Features