Data Sovereignty & Compliance¶
stdapi.ai is deployed entirely within your AWS account. All AI model inference, data storage, and service calls are performed within the AWS regions you explicitly configure — no data is ever sent to third-party services or leaves your account without your control.
- **Region-Locked Processing** — Every AWS service call (Bedrock, S3, Polly, Transcribe, Comprehend, Translate) is restricted to your configured regions
- **No Third-Party Egress** — The application communicates exclusively with AWS services — no external APIs, telemetry endpoints, or third-party services
- **Data in Transit Encrypted** — All external communication uses TLS 1.2+; ALB supports TLS 1.3 and post-quantum hybrid key exchange
- **No Persistent State on Compute** — ECS containers hold no user data — all persistent storage lives in your S3 buckets, encrypted at rest
AWS Compliance Certifications
All AWS services used by stdapi.ai (Bedrock, Polly, Transcribe, Comprehend, Translate) are in scope for ISO 27001/27017/27018 and the full AWS ISO certification suite. Amazon Bedrock additionally covers SOC 1/2/3, HIPAA, GDPR, FedRAMP (Moderate and High), PCI-DSS, and CSA STAR Level 2. Amazon Comprehend and Polly are also HIPAA-eligible.
See AWS Compliance Programs for the full list, and AWS Services in Scope to verify current certifications per service.
Application¶
stdapi.ai communicates exclusively with the following AWS services, all within the regions you configure:
- Amazon Bedrock — model inference
- Amazon S3 — temporary file storage for multimodal inputs and outputs
- Amazon Polly — text-to-speech (when used)
- Amazon Transcribe — speech-to-text (when used)
- Amazon Comprehend — language detection (when used)
- Amazon Translate — text translation (when used)
- Amazon CloudWatch — logs and metrics
No other outbound network calls are made. The application does not contact any third-party API, telemetry service, or external endpoint.
Data in Transit¶
| Connection | Protocol | Notes |
|---|---|---|
| Client → ALB | HTTPS (TLS 1.2 / TLS 1.3) | ALB HTTPS listener; supports TLS 1.3 and post-quantum hybrid key exchange (ALB security policies) |
| ALB → ECS container | HTTP | Private VPC traffic, isolated within AWS network infrastructure |
| ECS → AWS services | HTTPS (TLS 1.2+) | AWS confirms: "Within AWS, all inter-network data in transit supports TLS 1.2 encryption" (source) |
ALB supports TLS 1.3 and post-quantum hybrid key exchange (ML-KEM / Kyber combined with a classical algorithm), so the session key is secure even against a future quantum adversary. See ALB security policies.
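The TLS policy is selected on the ALB HTTPS listener. As a sketch — the listener ARN below is a placeholder, and you should confirm current policy names (including the post-quantum hybrid ones) with `aws elbv2 describe-ssl-policies` — a TLS 1.3 policy can be attached like this:

```shell
# Placeholder ARN — substitute your own listener.
LISTENER_ARN="arn:aws:elasticloadbalancing:eu-west-1:123456789012:listener/app/my-alb/abc/def"

# List available security policies to pick a TLS 1.3 / post-quantum variant.
aws elbv2 describe-ssl-policies --query 'SslPolicies[].Name'

# Attach a TLS 1.3 security policy to the HTTPS listener.
aws elbv2 modify-listener \
  --listener-arn "$LISTENER_ARN" \
  --ssl-policy ELBSecurityPolicy-TLS13-1-2-2021-06
```
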
Stateless Design¶
ECS containers hold no user data between requests. All persistent data (multimodal inputs, async job outputs) is stored exclusively in your S3 buckets. When a container is stopped or replaced, no user data is lost.
AWS Bedrock¶
Data Privacy¶
AWS explicitly guarantees, for all Bedrock models without exception (AWS Bedrock data protection):
"Amazon Bedrock doesn't store or log your prompts and completions. Amazon Bedrock doesn't use your prompts and completions to train any AWS models and doesn't distribute them to third parties."
This guarantee is enforced by the Model Deployment Account architecture: for each model provider, AWS maintains isolated accounts where model inference runs. AWS confirms:
"Model providers don't have any access to those accounts. [...] Because the model providers don't have access to those accounts, they don't have access to Amazon Bedrock logs or to customer prompts and completions."
This means that regardless of the geographic origin of a model, inference runs on AWS-owned infrastructure and your prompts never reach the model provider.
Encryption at Rest and in Transit¶
From the AWS Bedrock FAQs:
"Your data in Amazon Bedrock is always encrypted in transit and at rest, and you can optionally encrypt the data using your own keys."
See KMS Encryption below for how stdapi.ai handles encryption at the infrastructure level.
Cross-Region Inference Profiles and Data Geography¶
When cross-region inference is enabled, Bedrock may route a request to another region within the inference profile's scope. AWS defines two types of profiles (source):
| Profile type | Example ID prefix | Geography |
|---|---|---|
| Geography-pinned (US, EU, APAC) | `us.`, `eu.`, `ap.` | Fixed destination list — never changes, guaranteed to stay within the named geography |
| Global | No prefix (direct model ID) | May route to any AWS commercial region worldwide |
AWS explicitly states:
"if an inference profile is tied to a geography (such as US, EU, or APAC), its destination Region list will never change."
Set AWS_BEDROCK_CROSS_REGION_INFERENCE_GLOBAL=false to prevent Bedrock from using global profiles. Geography-pinned profiles (us.*, eu.*, ap.*) remain available and provide resilience within their geography. See Region-Specific Configuration for examples.
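You can inspect which inference profiles exist in a given region with the Bedrock API; geography-pinned profiles are recognisable by their `us.`/`eu.`/`ap.` ID prefix. A minimal check (the exact output depends on your region and enabled models):

```shell
# List system-defined inference profiles available in eu-west-1.
# Geography-pinned profiles carry the eu. prefix in their IDs.
aws bedrock list-inference-profiles \
  --region eu-west-1 \
  --type-equals SYSTEM_DEFINED \
  --query 'inferenceProfileSummaries[].inferenceProfileId'
```
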
Model Providers and Data Access¶
All third-party models available through Bedrock are subject to the same Model Deployment Account architecture: the provider's software runs in AWS-owned, AWS-operated accounts that the provider cannot access. Your prompts and completions are never shared with any model provider, regardless of where that provider is headquartered.
This applies to providers from every geography, for example:
- Alibaba Cloud (China 🇨🇳) — Qwen models
- Amazon (United States 🇺🇸) — Nova and Titan models
- Anthropic (United States 🇺🇸) — Claude models
- Cohere (Canada 🇨🇦) — Command and Embed models
- DeepSeek (China 🇨🇳) — DeepSeek models
- Meta (United States 🇺🇸) — Llama models
- MiniMax (China 🇨🇳) — MiniMax models
- Mistral AI (France 🇫🇷) — Mistral models
- Moonshot AI (China 🇨🇳) — Kimi models
- Stability AI (United Kingdom 🇬🇧) — Stable Diffusion models
- Writer (United States 🇺🇸) — Palmyra models
AWS AI Services¶
Amazon Polly, Transcribe, Comprehend, and Translate each run in an independently configurable region. By default all four services use the first region in AWS_BEDROCK_REGIONS, so pointing AWS_BEDROCK_REGIONS to your target geography is usually sufficient.
AWS AI Service Improvement Opt-Out¶
Unlike with Amazon Bedrock, content processed by Polly, Transcribe, Comprehend, and Translate may by default be used by AWS to improve service quality. AWS confirms:
"AWS AI services may use and store customer content for service improvement, such as fixing operational issues, evaluating service performance, debugging, or model training."
You can prevent this by configuring an AI services opt-out policy at the AWS Organizations level. This is a one-time configuration that takes only a few minutes in the AWS Console, and applies to your entire AWS account or organisational unit immediately. When you opt out, AWS also deletes previously stored content:
"When you opt out of content use by an AWS AI service, that service deletes all of the associated historical content that was shared with AWS before you set the option."
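The same opt-out can be scripted with the AWS CLI instead of the console. A sketch, assuming the `AISERVICES_OPT_OUT_POLICY` policy type has already been enabled on your organization root (via `aws organizations enable-policy-type`) and attaching the policy at the root so it covers the whole organization:

```shell
# Opt-out policy covering all AI services by default.
cat > ai-optout.json <<'EOF'
{
  "services": {
    "default": {
      "opt_out_policy": {
        "@@assign": "optOut"
      }
    }
  }
}
EOF

# Create the policy and attach it to the organization root.
POLICY_ID=$(aws organizations create-policy \
  --name ai-services-opt-out \
  --type AISERVICES_OPT_OUT_POLICY \
  --content file://ai-optout.json \
  --query 'Policy.PolicySummary.Id' --output text)

ROOT_ID=$(aws organizations list-roots --query 'Roots[0].Id' --output text)
aws organizations attach-policy --policy-id "$POLICY_ID" --target-id "$ROOT_ID"
```

Attaching at the root applies the opt-out to every account; attach to an organizational unit instead to scope it more narrowly.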
S3 Data Storage¶
S3 is used as temporary storage for multimodal content (images, PDFs, audio files) passed to or returned from Bedrock and the AI services. Data is stored only in buckets you own and configure.
- Primary bucket (`AWS_S3_BUCKET`) — must reside in the same AWS region as the first entry in `AWS_BEDROCK_REGIONS`.
- Regional buckets (`AWS_S3_REGIONAL_BUCKETS`) — for multi-region deployments, configure one bucket per Bedrock region so each region reads and writes data locally.
- Lifecycle policies — the Terraform module applies a 1-day lifecycle policy to the temporary prefix as a failover safeguard. The application itself removes temporary files as soon as the operation completes, so files rarely remain beyond the duration of a single request.
S3 stores data within the AWS region where each bucket is created. Data does not leave that region unless you explicitly configure replication.
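For buckets created outside the Terraform module, an equivalent 1-day expiration rule can be applied manually. A sketch — the bucket name and `tmp/` prefix are illustrative, not the module's actual values:

```shell
# 1-day expiration on the temporary prefix, plus cleanup of
# abandoned multipart uploads (names here are examples).
aws s3api put-bucket-lifecycle-configuration \
  --bucket my-stdapi-eu-bucket \
  --lifecycle-configuration '{
    "Rules": [{
      "ID": "expire-temporary-files",
      "Status": "Enabled",
      "Filter": {"Prefix": "tmp/"},
      "Expiration": {"Days": 1},
      "AbortIncompleteMultipartUpload": {"DaysAfterInitiation": 1}
    }]
  }'
```
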
Logging¶
Application Logging¶
By default, stdapi.ai logs only request metadata — HTTP method, path, status code, execution time, and model identifier. Prompt and response content are never written to logs unless explicitly enabled.
Setting LOG_REQUEST_PARAMS=true enables full request/response payload logging. This is disabled by default and should remain disabled in production environments handling sensitive data. See Logging & Monitoring for details.
AWS Bedrock Invocation Logging¶
Bedrock optionally supports invocation logging — recording model inputs and outputs to S3 or CloudWatch Logs. This feature is disabled by default on AWS. When enabled:
- S3 destination: objects are encrypted using SSE-KMS (CMK supported via key policy).
- CloudWatch destination: log group can be encrypted with a KMS CMK.
- Logging scope is configurable: metadata only, or including full prompt and completion content.
See Amazon Bedrock invocation logging for configuration details.
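If you do enable invocation logging, the CloudWatch destination can be KMS-encrypted and the delivery scope limited to metadata. A sketch with placeholder names, ARNs, and account IDs:

```shell
# Encrypt the invocation-log group with a customer-managed key.
aws logs associate-kms-key \
  --log-group-name /bedrock/invocation-logs \
  --kms-key-id arn:aws:kms:eu-west-1:123456789012:key/your-key-id

# Enable invocation logging to CloudWatch with text payload delivery
# disabled, i.e. metadata only — no prompt or completion content.
aws bedrock put-model-invocation-logging-configuration \
  --logging-config '{
    "cloudWatchConfig": {
      "logGroupName": "/bedrock/invocation-logs",
      "roleArn": "arn:aws:iam::123456789012:role/BedrockLoggingRole"
    },
    "textDataDeliveryEnabled": false
  }'
```
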
KMS Encryption¶
All data stored by stdapi.ai is encrypted at rest. The Terraform module automatically creates a Customer Managed Key (CMK) with a restrictive key policy and automatic annual rotation enabled.
For higher compliance needs, AWS KMS supports additional controls: custom key policies, crypto-shredding, multi-region keys, and CloudHSM-backed custom key stores for FIPS 140-3 Level 3 hardware-validated key storage.
Bring your own CMK
To use an existing CMK, create the S3 bucket and CloudWatch log groups outside the Terraform module and pass them via aws_s3_bucket and related parameters. See Advanced Deployment.
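For a bring-your-own-key setup, the CMK can be created and rotation enabled with the CLI before wiring it into the bucket and log groups (the Terraform module does the equivalent automatically):

```shell
# Create a customer-managed key and capture its ID.
KEY_ID=$(aws kms create-key \
  --description "stdapi.ai data encryption" \
  --query 'KeyMetadata.KeyId' --output text)

# Enable automatic annual rotation.
aws kms enable-key-rotation --key-id "$KEY_ID"

# Verify rotation is on.
aws kms get-key-rotation-status --key-id "$KEY_ID"
```
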
Compliance Configuration Reference¶
| Variable | Purpose | Compliance relevance |
|---|---|---|
| `AWS_BEDROCK_REGIONS` | Ordered list of Bedrock regions | Restrict model inference to a specific geography |
| `AWS_BEDROCK_CROSS_REGION_INFERENCE` | Enable Bedrock cross-region routing within configured regions | Set `false` to restrict inference to a single region |
| `AWS_BEDROCK_CROSS_REGION_INFERENCE_GLOBAL` | Allow Bedrock to route globally outside configured regions | Set `false` to enforce geographic boundaries |
| `AWS_S3_BUCKET` | Primary S3 bucket | Must be in your target region |
| `AWS_S3_REGIONAL_BUCKETS` | Per-region S3 buckets for multi-region setups | Prevent cross-region data transfer |
| `AWS_POLLY_REGION` | Polly service region | Pin to your target geography |
| `AWS_TRANSCRIBE_REGION` | Transcribe service region | Pin to your target geography |
| `AWS_TRANSCRIBE_S3_BUCKET` | S3 bucket for Transcribe audio files | Must be in the same region as `AWS_TRANSCRIBE_REGION` |
| `AWS_COMPREHEND_REGION` | Comprehend service region | Pin to your target geography |
| `AWS_TRANSLATE_REGION` | Translate service region | Pin to your target geography |
Region-Specific Configuration¶
To ensure all data processing stays within the European Union:
```shell
# Restrict Bedrock to EU regions
export AWS_BEDROCK_REGIONS=eu-west-1,eu-west-3,eu-central-1,eu-north-1

# Prevent Bedrock from using global profiles that could route outside the EU.
# Geography-pinned eu.* profiles are still used for failover within the EU.
export AWS_BEDROCK_CROSS_REGION_INFERENCE_GLOBAL=false

# S3 bucket in an EU region (must match the first entry in AWS_BEDROCK_REGIONS)
export AWS_S3_BUCKET=my-stdapi-eu-bucket
```
Ready-to-use Terraform example
Multi-region GDPR (EU) — getting_started_production_gdpr
To restrict all data processing to US AWS regions:
```shell
# Restrict Bedrock to US regions
export AWS_BEDROCK_REGIONS=us-east-1,us-west-2,us-east-2

# Prevent Bedrock from using global profiles that could route outside the US.
# Geography-pinned us.* profiles are still used for failover within the US.
export AWS_BEDROCK_CROSS_REGION_INFERENCE_GLOBAL=false

# S3 bucket in a US region
export AWS_S3_BUCKET=my-stdapi-us-bucket
```
Ready-to-use Terraform example
Multi-region US — getting_started_production_us
To restrict all data processing to Asia Pacific AWS regions:
```shell
# Restrict Bedrock to APAC regions
export AWS_BEDROCK_REGIONS=ap-northeast-1,ap-southeast-1,ap-southeast-2

# Prevent Bedrock from using global profiles that could route outside APAC.
# Geography-pinned ap.* profiles are still used for failover within APAC.
export AWS_BEDROCK_CROSS_REGION_INFERENCE_GLOBAL=false

# S3 bucket in an APAC region
export AWS_S3_BUCKET=my-stdapi-apac-bucket
```
US Law and Cloud Provider Obligations¶
AWS is incorporated and headquartered in the United States. Regardless of where your data is stored, AWS as a legal entity is subject to US law — including statutes with explicit extraterritorial reach.
CLOUD Act¶
The Clarifying Lawful Overseas Use of Data (CLOUD) Act, 2018 requires US-based cloud providers to produce customer data in response to a valid US law enforcement warrant or court order, even when the data is stored outside the United States. Content data requires a court-issued warrant establishing probable cause — the highest legal standard under US criminal procedure.
Key limits and AWS commitments (AWS CLOUD Act page):
- Challenge mechanism: AWS can file a motion to quash or modify requests that would require violating another country's laws, and preserves this right contractually.
- Customer notification: AWS's policy is to notify customers of government data requests unless prohibited by a court order (gag order).
- Zero cross-border content disclosures: AWS publicly reports that it has not disclosed enterprise or government content stored outside the US to the US government since it began reporting this metric in 2020.
- Technical barrier: The AWS Nitro System restricts all administrative access — including by AWS employees — validated by independent security audit (NCC Group). When customer-managed encryption keys are in use, AWS may be technically unable to hand over intelligible data.
FISA Section 702¶
Section 702 of the Foreign Intelligence Surveillance Act authorises US intelligence agencies to compel US-based electronic communication service providers to deliver communications of non-US persons located outside the United States — without a per-target warrant. This authority was reauthorised in April 2024 and has no provider challenge mechanism equivalent to the CLOUD Act.
The primary technical countermeasure is customer-managed encryption (CMK): if AWS cannot decrypt the data, intelligible content cannot be produced. This is distinct from contractual protections, which cannot override applicable US law.
Transfer Frameworks¶
EU — Data Privacy Framework and SCCs. The European Commission adopted the EU-US Data Privacy Framework (DPF) in July 2023, requiring proportionality constraints on US intelligence collection and establishing a Data Protection Review Court (DPRC) as a redress mechanism for EU individuals. AWS is certified under the DPF (AWS DPF page). A legal challenge by noyb is pending before the CJEU but has not resulted in invalidation as of this writing.
AWS also incorporates Standard Contractual Clauses (SCCs) into its Data Processing Addendum by default — providing a contractual transfer mechanism independent of the DPF. SCCs cannot override applicable US law, but they contractually bind AWS to notify customers of legal demands and challenge requests where legally permissible. Maintaining SCCs alongside the DPF is the recommended posture.
APAC and other regions. There is no unified bilateral transfer framework equivalent to the EU-US DPF for APAC jurisdictions. Enterprises typically rely on AWS contractual commitments, customer-managed encryption, and jurisdiction-specific legal analysis. AWS maintains local certifications (IRAP for Australia, ISMAP for Japan, K-ISMS for South Korea, MTCS for Singapore) relevant to respective regulatory environments.
Strongest available posture
Deploy in AWS regions within your target geography + customer-managed KMS keys (so AWS cannot technically access plaintext) + SCCs as a transfer mechanism independent of DPF validity. This combination provides both contractual and technical protections against third-party access.
Best Practices for High-Compliance Deployments¶
- Restrict all region settings to compliant regions — `AWS_BEDROCK_REGIONS` is the primary control; all services default to it, and any optional per-service override must stay within your target geography.
- Set `AWS_BEDROCK_CROSS_REGION_INFERENCE_GLOBAL=false` — stdapi.ai will then automatically use geography-pinned inference profiles (`us.*`, `eu.*`, `ap.*`), ensuring cross-region failover never leaves the named geography.
- Opt out of AWS AI service improvement via an AWS Organizations policy — a one-time console action covering Polly, Transcribe, Comprehend, and Translate.
- Use a CMK with a restrictive key policy — the Terraform module creates one by default; this is also the primary technical countermeasure against CLOUD Act and FISA 702 demands (if AWS cannot decrypt the data, intelligible content cannot be produced). For stricter control: bring your own key, limit decrypt permissions to the ECS task role, enable automatic rotation, use crypto-shredding for right-to-erasure, or use a CloudHSM-backed store for FIPS 140-3 Level 3.
- Confirm `LOG_REQUEST_PARAMS` is disabled (the default) in production — prompt and response content will then never appear in application logs. If Bedrock invocation logging is enabled for audit purposes, configure a KMS CMK for the S3 or CloudWatch destination.
- Use AWS PrivateLink for Bedrock and S3 to keep service calls off the public internet, and enable TLS 1.3 on the ALB to activate post-quantum hybrid key exchange.
- Enable AWS CloudTrail in all configured regions to monitor API activity across Bedrock, S3, KMS, and AI services.
- EU users: maintain SCCs alongside the DPF — SCCs (included by default in the AWS Data Processing Addendum) provide a transfer mechanism independent of DPF validity and contractually bind AWS to challenge requests and notify you.
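The PrivateLink and CloudTrail recommendations above can be sketched with the CLI. All resource IDs and names below are placeholders; repeat the endpoint command for `com.amazonaws.<region>.s3` and other services as needed:

```shell
# Interface endpoint keeping Bedrock runtime traffic inside the VPC.
aws ec2 create-vpc-endpoint \
  --vpc-id vpc-0123456789abcdef0 \
  --vpc-endpoint-type Interface \
  --service-name com.amazonaws.eu-west-1.bedrock-runtime \
  --subnet-ids subnet-0123456789abcdef0 \
  --private-dns-enabled

# Multi-region CloudTrail covering Bedrock, S3, KMS, and AI-service API calls.
aws cloudtrail create-trail \
  --name stdapi-audit-trail \
  --s3-bucket-name my-cloudtrail-bucket \
  --is-multi-region-trail
aws cloudtrail start-logging --name stdapi-audit-trail
```
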
Next Steps¶
- Advanced Deployment — Multi-region Terraform examples
- Resilience & Failover — Multi-region routing and infrastructure resilience
- Configuration Reference — Complete list of environment variables