Record real OpenAI, Anthropic, Google, and Tavily responses once. Replay them instantly in every test run — no tokens, no latency, no flaky CI. Some call it AIFixture. Some call it ReplayAI. Developers call it their secret weapon.
```shell
# Replay a recorded OpenAI fixture — instant, zero cost
curl -X POST https://api.getbrains4ai.com/api/aimock/replay \
  -H "X-AIMock-Key: amk_your_key_here" \
  -H "Content-Type: application/json" \
  -d '{
    "provider": "openai",
    "request": {
      "model": "gpt-4o",
      "messages": [{"role": "user", "content": "Summarize this contract"}]
    }
  }'

# Response: exact fixture match, <5ms
{
  "ok": true,
  "fixture": {
    "response_payload": { ... },
    "latency_ms": 312
  },
  "_meta": { "calls_this_month": 1, "monthly_limit": 1000 }
}
```
Three steps. Your tests run in milliseconds instead of seconds, with zero API spend.
Make one real call to OpenAI, Anthropic, or another provider via the admin API. AIMock stores the full response as a fixture keyed to the request shape.
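AIMock's actual keying scheme isn't documented here; one plausible sketch is to canonicalize the request JSON and hash it, so identical request shapes always resolve to the same fixture. The `fixture_key` helper below is purely illustrative, not the real implementation:

```python
import hashlib
import json

def fixture_key(provider: str, request: dict) -> str:
    """Derive a deterministic key from the request shape.

    Illustrative only -- AIMock's real keying scheme may differ.
    """
    canonical = json.dumps(
        {"provider": provider, "request": request},
        sort_keys=True,
        separators=(",", ":"),
    )
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

key = fixture_key("openai", {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Summarize this contract"}],
})
print(key[:12])
```

Because the key is derived from the request shape, the same model and messages replay the same fixture on every run, while a changed prompt misses and can be flagged.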
In CI/CD or local dev, swap your real API calls for `POST /api/aimock/replay`. Get the exact same response in <5ms without any provider credentials.
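In a test suite, the swap can be as small as pointing your HTTP call at the replay endpoint instead of the provider. The helper below only assembles the request (URL, headers, JSON body) so it stays network-free; `amk_your_key_here` is a placeholder key, and the header and endpoint names come from the example above:

```python
import json

AIMOCK_BASE = "https://api.getbrains4ai.com/api/aimock"

def build_replay_request(api_key: str, provider: str, request: dict) -> dict:
    """Assemble the POST /api/aimock/replay call: URL, headers, and body."""
    return {
        "url": f"{AIMOCK_BASE}/replay",
        "headers": {
            "X-AIMock-Key": api_key,
            "Content-Type": "application/json",
        },
        "body": json.dumps({"provider": provider, "request": request}),
    }

req = build_replay_request("amk_your_key_here", "openai", {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Summarize this contract"}],
})
print(req["url"])
```

Hand the result to whatever HTTP client your tests already use; nothing else in the calling code has to change.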
When provider APIs change, AIMock's drift detector compares live vs. stored responses and alerts you before your production code breaks.
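Drift detection can be thought of as a structural diff: compare the key-and-type "shape" of the stored fixture against a fresh live response and flag any paths that appeared, disappeared, or changed type. The `shape_diff` helper below is a sketch of that idea, not AIMock's actual detector:

```python
def shape(value, prefix=""):
    """Flatten a JSON-like value into {path: type name} for comparison."""
    if isinstance(value, dict):
        out = {}
        for k, v in value.items():
            out.update(shape(v, f"{prefix}.{k}" if prefix else k))
        return out
    return {prefix: type(value).__name__}

def shape_diff(stored, live):
    """Report paths that were added, removed, or changed type."""
    a, b = shape(stored), shape(live)
    return {
        "added": sorted(set(b) - set(a)),
        "removed": sorted(set(a) - set(b)),
        "changed": sorted(p for p in set(a) & set(b) if a[p] != b[p]),
    }

stored = {"choices": [], "usage": {"total_tokens": 42}}
live = {"choices": [], "usage": {"total_tokens": "42"}, "system_fingerprint": "fp_1"}
print(shape_diff(stored, live))
```

A non-empty diff is the signal to re-record the fixture and review downstream parsing code before the change reaches production.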
Simulate timeouts, rate limits, auth failures, and malformed responses to verify your retry logic handles real-world edge cases correctly.
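Chaos fixtures are only useful if your retry logic actually survives them. Here is a minimal retry-with-exponential-backoff wrapper, exercised against a fake flaky call rather than the live API; the wrapper is a generic sketch, not an AIMock client:

```python
import time

def with_retries(call, max_attempts=3, base_delay=0.01):
    """Retry a callable on exception, doubling the delay between attempts."""
    for attempt in range(1, max_attempts + 1):
        try:
            return call()
        except Exception:
            if attempt == max_attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))

# Simulate a provider that rate-limits twice, then succeeds.
attempts = {"n": 0}

def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RuntimeError("429 rate limited")
    return {"ok": True}

print(with_retries(flaky))  # succeeds on the third attempt
```

Pointing the same wrapper at a simulated-timeout fixture lets you assert in CI that retries, backoff, and final failure paths all behave as intended.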
All four providers in Storm's active AI stack — fixtures work across all models within each provider.
Base URL: `https://api.getbrains4ai.com/api/aimock`
Free tier covers most solo developers and small teams. Contact us for a Pro key.