When we first launched Spectre, our goal was simple: make it easy for Rails developers to add AI to their apps without getting bogged down in APIs. With Spectre 2.0, we’re taking a big step forward.
Claude & Gemini Support
Spectre now works with Anthropic’s Claude and Google’s Gemini, alongside OpenAI and Ollama. You can mix and match models to find the best fit for each task, set up fallbacks, or run side-by-side comparisons.
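Here’s a minimal configuration sketch showing the idea. It assumes an initializer-style setup block with per-provider API keys; the option names below are illustrative, not the gem’s verbatim API, so check the README for the current interface.

```ruby
# config/initializers/spectre.rb
# Illustrative sketch only -- option names are assumptions, not the
# gem's exact API.
Spectre.setup do |config|
  config.default_llm_provider = :claude   # or :openai, :gemini, :ollama
  config.api_keys = {
    openai: ENV["OPENAI_API_KEY"],
    claude: ENV["ANTHROPIC_API_KEY"],
    gemini: ENV["GEMINI_API_KEY"]
  }
end
```

Swapping the default provider (or overriding it per call) is how you’d wire up fallbacks or compare models against each other.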
Smarter Prompt Infrastructure
Managing prompts can get messy as apps grow. Spectre introduces nested directories, modular templates, and combined system/user prompts, making complex setups easier to maintain.
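As a rough sketch of how a nested prompt layout might look and be rendered (the directory names and render arguments here are illustrative; see the docs for the exact conventions):

```ruby
# Assumed layout (illustrative):
#   app/spectre/prompts/
#     support/
#       classify/
#         system.yml.erb
#         user.yml.erb
#
# Render the system and user prompts for one use case.
system_prompt = Spectre::Prompt.render(template: "support/classify/system")
user_prompt   = Spectre::Prompt.render(
  template: "support/classify/user",
  locals:   { ticket_body: ticket.body }
)
```

Keeping each use case in its own nested directory means large apps can grow their prompt library without a single flat folder of templates.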
Consistent Chat & Completions
Different LLMs expect different message formats and return different output shapes. Spectre standardizes these interactions so you get consistent, reliable behavior across providers.
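For example, a completion call can look the same no matter which provider is configured. The call shape below is our best guess at the interface rather than a verbatim excerpt, so verify it against the README.

```ruby
# Same message format regardless of the configured provider.
messages = [
  { role: "system", content: "You are a helpful support assistant." },
  { role: "user",   content: "Summarize this ticket in one sentence." }
]

# Illustrative call shape -- check the gem's docs for the exact method.
response = Spectre.provider_module::Completions.create(messages: messages)
# Spectre normalizes the response across providers, so downstream code
# doesn't need provider-specific parsing.
```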
Performance & Reliability
We’ve improved error handling, patched edge cases, and polished the docs. It’s the behind-the-scenes work that makes Spectre production-ready.
Different LLMs shine in different areas. With multi-model support, you’re never locked into one “voice.” Choose the right model for each use case, whether you need answers that are concise, empathetic, or imaginative.
Spectre 2.0 empowers Rails developers to build smarter, more reliable AI apps faster. With Claude and Gemini support, improved prompts, and better consistency, Spectre is becoming a true AI platform layer for Rails.
👉 Explore the repo and start building today.
The future of Rails + AI is here — flexible, reliable, and exciting!