Deployment Models

RadMah AI offers three deployment models: shared SaaS (Free plan), priority SaaS (Sovereign plan), and the Enterprise SDK, which runs on customer infrastructure and supports air-gapped Virtual PLC Docker deployments.

Deployment Overview

| Model | Plans | Compute | Data Location |
|---|---|---|---|
| SaaS (Shared) | Free | Fargate Spot (shared) | RadMah AI AWS (us-east-1) |
| SaaS (Priority) | Sovereign | Standard Fargate (priority) | RadMah AI AWS (us-east-1) |
| Enterprise SDK | Enterprise | Customer infrastructure | Customer-controlled |

SaaS Deployment (Free and Sovereign)

SaaS is the standard deployment for the Free and Sovereign plans. RadMah AI manages all infrastructure in AWS us-east-1; no customer infrastructure is required.

Infrastructure Stack

  • API: FastAPI on ECS Fargate behind ALB
  • Workers: Celery on ECS Fargate (4 queue tiers)
  • Database: PostgreSQL (RDS)
  • Cache/Broker: Redis (ElastiCache)
  • Storage: S3 for artifacts and evidence bundles
  • GPU Jobs: AWS Batch (g4dn instances) for training
  • Secrets: AWS Secrets Manager
  • DNS: Route53
  • Frontend: AWS Amplify
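The worker tier fans jobs out across the four queue tiers mentioned above. The tier names and the routing rules below are hypothetical (the actual queue names are not documented here); the sketch only illustrates how plan and job type might map to a queue:

```python
# Hypothetical sketch of routing jobs to one of four queue tiers.
# Tier names and the plan/job-type mapping are illustrative only,
# not the actual RadMah AI worker configuration.
QUEUE_TIERS = ["critical", "high", "default", "low"]

def route_job(plan: str, job_type: str) -> str:
    """Pick a queue tier for a job based on plan and job type (illustrative)."""
    if job_type == "evidence":
        return "critical"        # evidence sealing jumps the line
    if plan == "sovereign":
        return "high"            # Sovereign jobs land on priority workers
    if job_type == "cleanup":
        return "low"
    return "default"             # Free-tier generation and training

print(route_job("sovereign", "generation"))  # → high
```

In a Celery deployment like the one described above, an equivalent mapping would typically live in the `task_routes` setting rather than application code.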

Sovereign Queue Priority

Sovereign plan jobs run on Standard Fargate with priority over Free-tier jobs, which run on Fargate Spot. Sovereign concurrent job limit: 10 (vs. 1 for Free).
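The concurrency caps above (1 for Free, 10 for Sovereign) can be expressed as a simple admission check. The function name and signature are illustrative, not the real API; only the limits come from this page:

```python
# Sketch of the per-plan concurrency check described above. The limits
# (Free: 1, Sovereign: 10) are from the docs; the function is illustrative.
PLAN_LIMITS = {"free": 1, "sovereign": 10}

def can_start_job(plan: str, running_jobs: int) -> bool:
    """Return True if a new job may start under the plan's concurrency cap."""
    return running_jobs < PLAN_LIMITS[plan]

print(can_start_job("free", 1))       # → False (Free allows 1 concurrent job)
print(can_start_job("sovereign", 4))  # → True (Sovereign allows up to 10)
```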

Enterprise SDK Deployment

Enterprise customers deploy the RadMah AI SDK and Virtual PLC Docker containers on their own infrastructure. Data never leaves the customer environment.

Complete Data Sovereignty

All generation, training, evidence production, and verification happen entirely within the customer environment. No telemetry or data is sent to RadMah AI.

Components Deployed On-Premise

  • RadMah AI SDK: Python SDK package (same as SaaS)
  • Virtual PLC Docker image: Container-isolated Virtual PLC simulation runtime
  • Cryptographic core module: Signed native module that performs evidence sealing and verification locally
  • Engine core: Full generation pipeline runtime for all supported engines

Enterprise Virtual PLC Runtime Requirements

  • Real-time-capable Linux host with Docker installed
  • Minimum 4 CPU cores, 8 GB RAM per PLC instance
  • Max 50 concurrent PLC instances per deployment
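The sizing rules above (4 cores and 8 GB RAM per instance, 50 instances max) translate into a straightforward capacity calculation. This helper is a sketch for planning purposes, not part of the SDK:

```python
# Sketch of sizing a host for Virtual PLC instances using the documented
# minimums: 4 CPU cores and 8 GB RAM per instance, capped at 50 per deployment.
MAX_INSTANCES = 50
CORES_PER_PLC = 4
RAM_GB_PER_PLC = 8

def max_plc_instances(host_cores: int, host_ram_gb: int) -> int:
    """How many Virtual PLC instances a single deployment can support."""
    return min(host_cores // CORES_PER_PLC,
               host_ram_gb // RAM_GB_PER_PLC,
               MAX_INSTANCES)

print(max_plc_instances(64, 256))  # → 16 (CPU-bound: 64 // 4)
```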

Air-Gapped Deployments

Fully Offline Operation

Enterprise SDK supports fully air-gapped operation. Evidence bundle verification is self-contained — all hashing and parsing runs locally inside the cryptographic core module, with no network calls required. See Hybrid SDK for air-gapped configuration.
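The self-contained verification described above amounts to recomputing hashes locally with no network access. The sketch below assumes a bundle manifest shaped as a JSON mapping of relative path to SHA-256 hex digest; the cryptographic core module's actual bundle layout is not documented here:

```python
# Minimal sketch of offline evidence-bundle verification: recompute file
# hashes against a manifest with no network calls. The manifest format
# (JSON mapping of path -> SHA-256 hex digest) is an assumption.
import hashlib
import json
from pathlib import Path

def verify_bundle(bundle_dir: Path) -> bool:
    """Return True if every file listed in manifest.json matches its hash."""
    manifest = json.loads((bundle_dir / "manifest.json").read_text())
    for rel_path, expected in manifest.items():
        actual = hashlib.sha256((bundle_dir / rel_path).read_bytes()).hexdigest()
        if actual != expected:
            return False
    return True
```

Because everything here is local file I/O and hashing, the check works identically with all network interfaces disabled.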

Infrastructure as Code

SaaS infrastructure is fully managed via Terraform. The IaC covers:

  • VPC, subnets, NAT, ALB
  • ECS cluster, task definitions, services
  • RDS PostgreSQL with automated backups
  • ElastiCache Redis cluster
  • S3 buckets (artifacts, Terraform state)
  • AWS Batch compute environments and job queues
  • IAM roles with least-privilege policies
  • CloudWatch monitoring and alerting
  • Route53 DNS records
  • AWS Amplify frontend deployments

Region Policy

All resources are deployed exclusively in us-east-1 with consistent naming (RadMah AI prefix), tagging, and description policies.
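A policy like this is typically enforced in code or CI rather than by convention alone. The helper below is illustrative: the exact prefix format (`radmah-ai-`) is an assumption, and only the us-east-1 restriction and the prefix requirement come from the text above:

```python
# Illustrative helper enforcing the region and naming policy described above.
# The "radmah-ai-" prefix format is an assumption, not a documented value.
ALLOWED_REGION = "us-east-1"
NAME_PREFIX = "radmah-ai-"

def resource_name(component: str, region: str = ALLOWED_REGION) -> str:
    """Build a prefixed resource name, rejecting any non-us-east-1 region."""
    if region != ALLOWED_REGION:
        raise ValueError(f"resources must be deployed in {ALLOWED_REGION}")
    return f"{NAME_PREFIX}{component}"

print(resource_name("api-cluster"))  # → radmah-ai-api-cluster
```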

SLA

| Plan | SLA |
|---|---|
| Free | None |
| Sovereign | 99.9% uptime |
| Enterprise | Custom (negotiated) |