OpenAI-Compatible API Gateway for AWS Bedrock and AI services

Deploy any OpenAI-compatible application on AWS Bedrock—no code changes required. Access 80+ models from Claude, Llama, Nova, and more with enterprise-grade privacy, compliance controls, and AWS pricing.

  • Production-ready OpenAI API compatibility
    Full support for chat, embeddings, images, audio (speech/transcription/translation), and more. A drop-in replacement for the OpenAI SDK that works with LangChain, Continue.dev, Open WebUI, n8n, and 1000+ tools.

  • Purpose-built for AWS Bedrock
    Advanced features like prompt caching, reasoning modes, guardrails, prompt routers, and application inference profiles. Automatic region optimization and S3 integration included.

  • Enterprise compliance & data sovereignty
    Configure allowed AWS regions to meet your compliance requirements. All inference stays in your AWS account; data is never shared with model providers or used for training.

  • AWS direct pricing, no markup
    Pay-per-use pricing with no subscriptions. Pay only AWS Bedrock rates for exactly what you use—no monthly minimums or capacity commitments.

  • Access to 80+ leading models
    Claude 4.6+ (reasoning), Nova 2, Llama 4, DeepSeek v3.2, Stable Diffusion, Mistral, Gemini, and more. Switch models instantly without code changes—no vendor lock-in.

  • Built-in observability & security
    OpenTelemetry integration, detailed request logging, API keys in AWS Systems Manager. CORS, proxy headers, SSRF protection, and hardened container images.

Models and AI services at your fingertips

How It Works

1. Deploy to AWS in minutes. Launch via the Terraform module on ECS, or run the Docker image locally for development. Production-ready infrastructure included.

2. Point your application to stdapi.ai. Change only the base_url in your OpenAI client; all existing code, prompts, and workflows continue working.

3. Access AWS Bedrock models immediately. Use Claude, Nova, Llama, or any Bedrock model, and switch between models, regions, and providers without changing application code.
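The base_url change in step 2 can be sketched without any SDK at all, since the gateway speaks the standard OpenAI wire protocol. The endpoint URL, API key, and model ID below are illustrative placeholders, not values from this page:

```python
import json
import urllib.request

# Placeholder deployment URL and credentials -- substitute your own.
BASE_URL = "https://your-gateway.example.com/v1"
API_KEY = "your-api-key"

def build_chat_request(model: str, user_message: str) -> urllib.request.Request:
    """Build a standard OpenAI-style POST to /chat/completions."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("anthropic.claude-sonnet", "Hello")
# urllib.request.urlopen(req) would send it; omitted here because the
# endpoint above is a placeholder.
```

With the official Python SDK the same switch is a single constructor argument: `OpenAI(base_url=BASE_URL, api_key=API_KEY)`; nothing else in the calling code changes.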

Zero lock-in: the standard OpenAI API means you can switch back to OpenAI, or to another provider, at any time.

Enterprise-Grade Features

  • Multi-region Bedrock access
    Automatic cross-region inference profile selection for optimal availability and pricing

  • Advanced model capabilities
    Reasoning modes (Claude 4.6+, Nova 2), prompt caching, guardrails, service tiers

  • Complete API coverage
    Chat, embeddings, image generation/editing, audio speech/transcription/translation

  • AWS AI services integration
    Amazon Polly (TTS), Transcribe (STT with diarization), Translate—unified under OpenAI API

  • Observability & debugging
    OpenTelemetry, request/response logging, Swagger/ReDoc interfaces

  • Secure by default
    API keys in Systems Manager, CORS controls, SSRF protection, hardened containers
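As a concrete view of the API coverage above: because the gateway exposes the standard OpenAI endpoint layout, each capability maps to a familiar route under one base URL. The base URL below is a placeholder; the paths are the standard OpenAI API routes:

```python
# Placeholder deployment URL -- substitute your own.
BASE_URL = "https://your-gateway.example.com/v1"

# Standard OpenAI API routes, one per capability listed above.
ENDPOINTS = {
    "chat": "/chat/completions",
    "embeddings": "/embeddings",
    "image generation": "/images/generations",
    "image editing": "/images/edits",
    "speech (TTS)": "/audio/speech",
    "transcription (STT)": "/audio/transcriptions",
    "translation": "/audio/translations",
}

def url_for(capability: str) -> str:
    """Full request URL for a given capability."""
    return BASE_URL + ENDPOINTS[capability]

for name in ENDPOINTS:
    print(f"{name}: {url_for(name)}")
```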

Who Uses stdapi.ai

  • DevOps & Platform Teams
    Deploy Open WebUI, LibreChat, or custom chat interfaces for your organization. Unified API gateway for all AI services—no per-application AWS integration needed.

  • Developers & AI Engineers
    Use Claude, Kimi K2 Thinking, and Qwen Coder Next in VS Code (Continue.dev, Cline, Cursor), JetBrains IDEs, or any OpenAI-compatible tool. Test locally with Docker; deploy to production with Terraform.

  • Workflow Automation Teams
    Connect n8n, Make, Zapier, or custom automation to AWS Bedrock. Access 400+ integrations with enterprise-grade AI—all through one API endpoint.

  • Enterprises with Compliance Needs
    Meet data sovereignty requirements with region controls. GDPR, HIPAA, FedRAMP workloads supported through AWS Bedrock's compliance certifications.

  • Cost-conscious Organizations
    Switch from subscription-based AI services to pay-per-use AWS Bedrock pricing. Pay only for actual usage with no monthly commitments while accessing leading models (Claude, Stable Diffusion, Llama).

  • Teams Migrating from OpenAI
    LangChain, LlamaIndex, Haystack, or custom apps work immediately. Gradual migration supported—run both APIs in parallel during transition.
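One minimal sketch of running both APIs in parallel during a migration: keep client code unchanged and choose the backend per process with a single environment variable. The variable name and gateway URL are assumptions for illustration:

```python
import os

OPENAI_URL = "https://api.openai.com/v1"
GATEWAY_URL = "https://your-gateway.example.com/v1"  # placeholder stdapi.ai deployment

def select_base_url() -> str:
    """Route to the Bedrock gateway when USE_BEDROCK_GATEWAY=1, else to OpenAI."""
    return GATEWAY_URL if os.environ.get("USE_BEDROCK_GATEWAY") == "1" else OPENAI_URL

# Flip one environment variable to move traffic; the calling code never changes.
os.environ["USE_BEDROCK_GATEWAY"] = "1"
print(select_base_url())
```

The same pattern works per service or per request, letting a fleet migrate gradually while both backends stay live.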

Ready to get started?

Community edition: Free Docker image for local development
Production: Terraform module with ECS + hardened container (via AWS Marketplace)