# Run OpenAI & Anthropic Apps on AWS Bedrock
Drop-in API gateway for AWS Bedrock and AI services. Your existing OpenAI and Anthropic applications work immediately—just change the base URL. Access 80+ models with enterprise privacy, compliance controls, and pay-per-use AWS pricing.
14-day free trial included — free for local development.
- **Change one line, access 80+ models**
  Drop-in replacement for the OpenAI and Anthropic SDKs. Works with LangChain, Continue.dev, Open WebUI, n8n, OpenClaw, Claude SDK, and 1000+ other tools—no code changes beyond the base URL.
- **Your data stays in your AWS account**
  All inference runs in your account. Data is never shared with model providers or used for training. Configure allowed regions for GDPR, HIPAA, and FedRAMP compliance.
- **Pay only for what you use**
  No subscriptions or monthly minimums. Pay AWS Bedrock rates directly, with no markup from stdapi.ai.
- **Purpose-built for AWS Bedrock**
  Deep integration with prompt caching, reasoning modes, guardrails, service tiers, inference profiles, and prompt routers. Not a generic proxy—built to leverage every Bedrock feature.
- **Claude, Kimi K2, MiniMax, and 80+ more**
  Claude 4.6+ (reasoning), Kimi K2, MiniMax M2.5, Qwen3, Llama 4, GLM 5, Nova 2, Stability AI, and more. Switch models instantly—no vendor lock-in.
- **Deploy in 5 minutes**
  A few lines of Terraform for production on AWS, or Docker locally for development. Production-ready infrastructure with HTTPS, WAF, auto-scaling, and monitoring included.
## Models and AI services at your fingertips
## How It Works
```mermaid
%%{init: {'flowchart': {'htmlLabels': true}} }%%
flowchart LR
    openai["<img src='styles/logo_openai.svg' style='height:64px;width:auto;vertical-align:middle;' /> OpenAI SDK Apps"] --> stdapi["<img src='styles/logo.svg' style='height:64px;width:auto;vertical-align:middle;' /> stdapi.ai"]
    anthropic["<img src='styles/logo_anthropic.svg' style='height:64px;width:auto;vertical-align:middle;' /> Anthropic SDK Apps"] --> stdapi
    stdapi --> bedrock["<img src='styles/logo_amazon_bedrock.svg' style='height:64px;width:auto;vertical-align:middle;' /> AWS Bedrock<br/>80+ models"]
    stdapi --> services["<img src='styles/logo_amazon.svg' style='height:64px;width:auto;vertical-align:middle;' /> AWS AI Services<br/>Polly · Transcribe · Translate"]
```
### 1. Deploy to AWS with Terraform
```hcl
module "stdapi_ai" {
  source  = "stdapi-ai/stdapi-ai/aws"
  version = "~> 1.0"
}
```
Prefer a hands-off setup? A managed deployment service can deploy stdapi.ai into your AWS account.
### 2. Point your application to stdapi.ai
```python
from openai import OpenAI

client = OpenAI(
    api_key="your-api-key",
    base_url="https://your-endpoint.com/v1",  # ← only change
)

response = client.chat.completions.create(
    model="anthropic.claude-opus-4-6-v1",
    messages=[{"role": "user", "content": "Hello!"}],
)
```
```python
from anthropic import Anthropic

client = Anthropic(
    api_key="your-api-key",
    base_url="https://your-endpoint.com/anthropic",  # ← only change
)

message = client.messages.create(
    model="anthropic.claude-opus-4-6-v1",
    max_tokens=1024,  # required by the Anthropic Messages API
    messages=[{"role": "user", "content": "Hello!"}],
)
```
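Under the hood, the OpenAI snippet above resolves to a plain HTTPS POST, so any HTTP client works as well. A minimal sketch of the request body it produces (endpoint path and model ID taken from the example; the wire format is the standard OpenAI chat-completions schema):

```python
import json

# Build the same request body the OpenAI SDK sends to
# POST {base_url}/chat/completions. Model ID copied from the example above.
payload = {
    "model": "anthropic.claude-opus-4-6-v1",
    "messages": [{"role": "user", "content": "Hello!"}],
}
body = json.dumps(payload)

# Send with any HTTP client:
#   POST https://your-endpoint.com/v1/chat/completions
#   Authorization: Bearer your-api-key
#   Content-Type: application/json
```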
### 3. Access any Bedrock model immediately

Use Claude, Kimi K2, MiniMax, or any Bedrock model. Switch between models, regions, and providers without changing application code.
**Zero lock-in:** Standard OpenAI and Anthropic APIs mean you can switch back, or to another provider, anytime.
## Built for AWS
- **Multiply your quota across regions**
  Each AWS region has its own independent quota. Configure 3 regions and get 3x the tokens per minute and 3x the daily limits. Automatic routing and failover—no client changes needed.
- **Advanced Bedrock capabilities**
  Reasoning modes (Claude 4.6+, Nova 2), prompt caching, guardrails, service tiers, application inference profiles, and prompt routers—all through standard OpenAI API parameters.
- **Complete multi-modal API**
  Chat completions, embeddings, image generation/editing/variations, and audio speech/transcription/translation. Every route maps OpenAI parameters to Bedrock equivalents.
- **Native AWS AI services**
  Amazon Polly (text-to-speech), Transcribe (speech-to-text with speaker diarization), and Translate—all unified under OpenAI-compatible endpoints.
- **Full observability**
  OpenTelemetry integration for traces and metrics, detailed request/response logging, and Swagger and ReDoc API documentation served by the application.
- **Automatic deprecated model fallback**
  When AWS retires a model, requests are transparently redirected to its replacement. Your applications survive model deprecations without code changes.
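Conceptually, the deprecated-model fallback behaves like an alias table that resolves retired model IDs to their successors. The sketch below illustrates the idea only; it is not stdapi.ai's actual implementation, and the model IDs in the mapping are hypothetical:

```python
# Conceptual sketch of deprecated-model redirection (hypothetical mapping,
# not stdapi.ai's real table): retired model IDs resolve to successors.
REPLACEMENTS = {
    "anthropic.claude-3-sonnet-v1": "anthropic.claude-opus-4-6-v1",
}

def resolve_model(model_id: str) -> str:
    """Follow replacement links until reaching a live model ID."""
    while model_id in REPLACEMENTS:
        model_id = REPLACEMENTS[model_id]
    return model_id
```

Because the resolution happens in the gateway, client applications keep sending the old model ID and continue to work unchanged.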
## Who Uses stdapi.ai
- **DevOps & Platform Teams**
  Deploy Open WebUI, LibreChat, or custom chat interfaces for your organization. Unified API gateway for all AI services—no per-application AWS integration needed.
  Open WebUI guide · All use cases
- **Developers & AI Engineers**
  Use Claude, Kimi K2 thinking, and Qwen Coder Next in VS Code (Continue.dev, Cline, Cursor), JetBrains IDEs, or any OpenAI-compatible tool. Test locally with Docker, deploy to production with Terraform.
  Coding assistants guide
- **Workflow Automation Teams**
  Connect n8n, Make, Zapier, or custom automation to AWS Bedrock. Access 400+ integrations with enterprise-grade AI—all through one API endpoint.
  n8n integration guide
- **Enterprises with Compliance Needs**
  Meet data sovereignty requirements with region controls. GDPR, HIPAA, and FedRAMP workloads are supported through AWS Bedrock's compliance certifications.
- **Cost-conscious Organizations**
  Switch from subscription-based AI services to pay-per-use AWS Bedrock pricing. No monthly commitments while accessing leading models (Claude, Kimi K2, MiniMax, Qwen3).
- **Teams Migrating from OpenAI or Anthropic**
  LangChain, LlamaIndex, Haystack, Claude SDK, or custom apps work immediately. Gradual migration is supported—run both APIs in parallel during the transition.
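Running both APIs in parallel requires no duplicate infrastructure: both SDKs target the same gateway host and differ only in the URL path, as in the quick-start examples. A minimal sketch (the host is a placeholder: substitute your own deployment's endpoint):

```python
# One gateway host serves both API surfaces; only the path differs.
# GATEWAY is a placeholder for your own deployment's endpoint.
GATEWAY = "https://your-endpoint.com"

OPENAI_BASE_URL = f"{GATEWAY}/v1"            # for openai.OpenAI(base_url=...)
ANTHROPIC_BASE_URL = f"{GATEWAY}/anthropic"  # for anthropic.Anthropic(base_url=...)
```

During a gradual migration, existing OpenAI-based services and new Anthropic-based ones can point at the same deployment side by side.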
## Choose Your Edition
- **Community Edition — Free & Open Source**
  Best for: open-source projects, local development, testing, and evaluation.
    - Full API compatibility and all features
    - Community Docker image
    - AGPL-3.0 license (source disclosure required for network use)
- **Commercial Edition — AWS Marketplace**
  Best for: internal tools, SaaS products, proprietary applications, and production.
    - 14-day free trial — test in your environment risk-free
    - Hardened container, security updates, commercial support
    - No AGPL restrictions — keep your code and modifications private
    - Terraform module for production-ready deployment in minutes
    - Streamlined AWS billing