# API Overview
stdapi.ai provides the OpenAI API interface backed by AWS Bedrock foundation models and AWS AI services. Use your existing OpenAI SDKs by simply changing the base URL.
## Interactive Documentation
stdapi.ai provides multiple interfaces for exploring and testing the API; choose the one that fits your workflow:
### Documentation Resources
- Complete API Reference: in-depth guides for every endpoint with parameter details
- OpenAPI Specification: full machine-readable schema for integration and tooling
### Live API Playground
When running the server, access these interactive interfaces:
| Interface | URL | Best For |
|---|---|---|
| Swagger UI | http://localhost/docs | Testing endpoints directly in your browser with live request/response examples |
| ReDoc | http://localhost/redoc | Reading and searching through clean, organized documentation |
| OpenAPI Schema | http://localhost/openapi.json | Generating client code or importing into API tools like Postman |
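If you would rather drive code generation or other tooling straight from the schema, you can fetch it from a running server. A minimal Python sketch, assuming the server is reachable at the localhost URL listed above:

```python
import json
import urllib.request

# Fetch the machine-readable OpenAPI schema from a running stdapi.ai server
with urllib.request.urlopen("http://localhost/openapi.json") as resp:
    schema = json.load(resp)

# Print every documented path and its HTTP methods
for path, operations in schema["paths"].items():
    print(path, sorted(operations))
```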
## OpenAI SDK-Compatible API
Access AWS-powered AI services through the OpenAI API interface. Use official OpenAI SDKs by simply changing the base URL; no custom clients are needed.
Supported Endpoints:
| Category | Endpoint | Capability | Documentation |
|---|---|---|---|
| Chat | `POST /v1/chat/completions` | Multi-modal conversations with text, images, video, documents | Chat Completions → |
| Images | `POST /v1/images/generations` | Text-to-image generation | Images → |
| Audio | `POST /v1/audio/speech` | Text-to-speech synthesis | Text to Speech → |
| | `POST /v1/audio/transcriptions` | Speech-to-text transcription | Speech to Text → |
| | `POST /v1/audio/translations` | Speech-to-English translation | Speech to English → |
| Embeddings | `POST /v1/embeddings` | Vector embeddings for semantic search | Embeddings → |
| Models | `GET /v1/models` | List available models | Models → |
- Backend: AWS Bedrock, AWS Polly, AWS Transcribe, AWS Translate
- SDKs: Official OpenAI SDKs (Python, Node.js, Go, Ruby, etc.)
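Since `GET /v1/models` is part of the surface, you can discover which models your deployment exposes with the same SDK call you would use against OpenAI. A short sketch with the official Python SDK (the deployment URL and key are placeholders):

```python
from openai import OpenAI

# Point the official SDK at your stdapi.ai deployment (placeholder values)
client = OpenAI(
    base_url="https://your-deployment-url/v1",
    api_key="your-api-key",
)

# GET /v1/models lists the models exposed by the gateway
for model in client.models.list():
    print(model.id)
```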
## Quick Start Guide
This guide shows how to use stdapi.ai with the official OpenAI client libraries; no custom clients are needed:
### Python

```bash
pip install openai
```
```python
from openai import OpenAI

# Initialize the client with your stdapi.ai endpoint
client = OpenAI(
    base_url="https://your-deployment-url/v1",
    api_key="your-api-key"
)

# Use AWS Bedrock models through the familiar OpenAI interface
response = client.chat.completions.create(
    model="amazon.nova-micro-v1:0",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain quantum computing in simple terms."}
    ]
)
print(response.choices[0].message.content)

# Pass provider-specific parameters through `extra_body`:
response = client.chat.completions.create(
    model="anthropic.claude-sonnet-4-5-20250929-v1:0",
    messages=[{"role": "user", "content": "Solve this complex problem..."}],
    extra_body={
        "top_k": 50  # top_k limits sampling to the K most likely tokens, so it takes an integer
    }
)
```
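The other endpoints follow the same pattern. As a hedged sketch, an embeddings request; the model ID below is an assumption, so check `GET /v1/models` for the embedding models your deployment actually exposes:

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://your-deployment-url/v1",
    api_key="your-api-key"
)

# Embeddings via POST /v1/embeddings (model ID is an assumption)
embedding = client.embeddings.create(
    model="amazon.titan-embed-text-v2:0",
    input="Explain quantum computing in simple terms.",
)
print(len(embedding.data[0].embedding))  # dimensionality of the returned vector
```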
### Node.js

```bash
npm install openai
```
```javascript
import OpenAI from 'openai';

// Initialize the client with your stdapi.ai endpoint
const client = new OpenAI({
  baseURL: 'https://your-deployment-url/v1',
  apiKey: 'your-api-key'
});

// Use AWS Bedrock models through the familiar OpenAI interface
const response = await client.chat.completions.create({
  model: 'amazon.nova-micro-v1:0',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Explain quantum computing in simple terms.' }
  ]
});

console.log(response.choices[0].message.content);
```
### cURL
```bash
curl -X POST "https://your-deployment-url/v1/chat/completions" \
  -H "Authorization: Bearer your-api-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "amazon.nova-micro-v1:0",
    "messages": [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "Explain quantum computing in simple terms."}
    ]
  }'
```
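The non-chat endpoints listed earlier are reached the same way. A rough Python sketch of text-to-speech via `POST /v1/audio/speech`; the model ID and voice name here are placeholders, not values confirmed by this documentation:

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://your-deployment-url/v1",
    api_key="your-api-key"
)

# Text-to-speech via POST /v1/audio/speech; "speech-model" and "Joanna"
# are placeholders, substitute whatever your deployment exposes.
speech = client.audio.speech.create(
    model="speech-model",
    voice="Joanna",
    input="Hello from stdapi.ai!",
)
speech.write_to_file("hello.mp3")
```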
## Understanding Compatibility
### What "Compatible" Means
stdapi.ai implements the OpenAI API specification with AWS services as the backend. This means:
- ✅ Same Request Format: use identical request bodies and parameters
- ✅ Same Response Structure: receive responses in the expected format
- ✅ Same SDKs: use official client libraries
- ✅ Drop-in Replacement: change only the base URL and model ID
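In practice the switch is pure configuration. A minimal sketch, assuming you supply your deployment URL and key through the environment variables the official Python SDK already reads:

```python
import os

from openai import OpenAI

# The official SDK picks these up when no explicit arguments are passed,
# so existing OpenAI code can be repointed without source changes.
os.environ["OPENAI_BASE_URL"] = "https://your-deployment-url/v1"
os.environ["OPENAI_API_KEY"] = "your-api-key"

client = OpenAI()  # reads the environment variables above
response = client.chat.completions.create(
    model="amazon.nova-micro-v1:0",  # only the model ID changes
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```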
### Handling Parameter Differences
Different providers support different features. stdapi.ai handles this gracefully:
#### Silent Ignoring (Safe Parameters)
Parameters that don't affect output are accepted but ignored:
```python
# This works; unsupported parameters are silently ignored
response = client.chat.completions.create(
    model="amazon.nova-micro-v1:0",
    messages=[{"role": "user", "content": "Hello"}],
    user="user-123",  # Logged but doesn't affect AWS processing
)
```
#### Clear Errors (Behavior-Changing Parameters)
Parameters that would change the output are rejected with a clear error:
```python
# This returns HTTP 400 with a clear explanation
response = client.chat.completions.create(
    model="amazon.nova-micro-v1:0",
    messages=[{"role": "user", "content": "Hello"}],
    logit_bias={123: 100},  # Not supported; clear error returned
)
```
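To handle these rejections programmatically, note that the official Python SDK raises a typed exception for HTTP 400 responses; a small sketch:

```python
import openai
from openai import OpenAI

client = OpenAI(
    base_url="https://your-deployment-url/v1",
    api_key="your-api-key"
)

try:
    client.chat.completions.create(
        model="amazon.nova-micro-v1:0",
        messages=[{"role": "user", "content": "Hello"}],
        logit_bias={123: 100},  # behavior-changing parameter, rejected
    )
except openai.BadRequestError as err:
    # The error body explains which parameter was rejected and why
    print(err.status_code, err)
```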
## Understanding the Architecture
```
┌─────────────────┐
│    Your App     │
│  (OpenAI SDK)   │
└────────┬────────┘
         │ HTTPS /v1/*
         ▼
┌─────────────────┐
│    stdapi.ai    │
│   API Gateway   │
└────────┬────────┘
         │ Translates to AWS APIs
    ┌────┴─────┬──────────┬──────────┐
    ▼          ▼          ▼          ▼
┌────────┐ ┌────────┐ ┌────────┐ ┌────────┐
│AWS     │ │AWS     │ │AWS     │ │AWS     │
│Bedrock │ │Polly   │ │Transcr.│ │S3      │
└────────┘ └────────┘ └────────┘ └────────┘
```
Request Flow:
1. Your application uses the standard OpenAI SDK
2. Requests go to stdapi.ai's /v1/* endpoints instead of OpenAI
3. stdapi.ai translates OpenAI-format requests to AWS service APIs
4. AWS services process the requests
5. Responses are formatted as OpenAI-compatible responses
6. Your application receives familiar OpenAI response structures
Ready to build intelligent applications? Start with the Chat Completions API or explore available models!