A Terraform repository that deploys URL shorteners, AI agents, personal APIs, and email services — all from the same set of reusable building blocks.
Each module supports two modes: a simple single-resource interface for quick deploys, and a multi-resource mode with shared defaults for managing fleets. Same API shape, same patterns — just scale up when you need to.
Unified function deployment — zip archives, Docker containers, or pre-built images. Handles ECR pushes, layer builds, function URLs, and IAM automatically.
HTTP API v2 with declarative route maps. Point routes at Lambda functions by registry key, attach JWT authorizers, custom domains via Cloudflare, and throttling.
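A route map of that shape might look like the following sketch. The module path, attribute names, and domain values are assumptions for illustration, not the module's confirmed interface:

```hcl
# Illustrative only — module path and attribute names are assumed.
module "api" {
  source = "../modules/api-gateway"

  routes = {
    "POST /links"        = "create"   # registry key of a Lambda function
    "GET /links/{id}"    = "get"
    "DELETE /links/{id}" = "delete"
  }

  jwt_authorizer = {
    issuer   = "https://example.auth0.com/"  # hypothetical issuer
    audience = ["my-api"]
  }

  domain_name    = "api.example.com"  # DNS via Cloudflare
  throttle_limit = 5                  # requests per second
}
```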
Tables with string shorthand for keys — write partition_key = "id" instead of a full attribute block. GSIs, streams with Lambda triggers, TTL, encryption.
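The string shorthand could be used like this. Attribute names beyond partition_key (ttl_attribute, stream_trigger) are assumptions about the module's interface:

```hcl
# Sketch, assuming the module accepts a tables map with these attributes.
module "tables" {
  source = "../modules/dynamodb"

  tables = {
    links = {
      partition_key  = "id"          # shorthand for { name = "id", type = "S" }
      ttl_attribute  = "expires_at"  # assumed TTL attribute name
      stream_trigger = "processor"   # assumed: Lambda registry key for stream events
    }
  }
}
```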
Buckets with lifecycle rules, CORS, static website hosting, and directory uploads. Versioning with optional MFA delete, AES256 or KMS encryption.
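A bucket combining several of these features might be declared as below. All attribute names here are illustrative assumptions about the module's variables:

```hcl
# Sketch only — attribute names are assumed, not the module's confirmed API.
module "site" {
  source = "../modules/s3"

  bucket_name = "example-static-site"
  website     = { index_document = "index.html" }
  upload_dir  = "./public"   # directory upload
  versioning  = true
  encryption  = "AES256"     # or a KMS key ARN

  lifecycle_rules = [{
    id         = "expire-old"
    expiration = { days = 90 }
  }]
}
```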
ACM certificates that validate themselves. Create a cert, point it at Cloudflare or Route53, and the module waits until DNS propagation confirms ownership.
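A self-validating certificate could be requested like this; the module path and variable names are assumptions:

```hcl
# Hypothetical interface: the module creates the validation CNAMEs in the
# chosen DNS provider and blocks until ACM reports the cert as ISSUED.
module "cert" {
  source = "../modules/acm"

  domain_name  = "example.com"
  sans         = ["*.example.com"]
  dns_provider = "cloudflare"   # or "route53"
}
```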
Model Context Protocol gateway for AI tool orchestration. Registers Lambda functions as MCP tools with schemas, supports semantic search and JWT auth.
Amazon Bedrock AgentCore deployment. Build Docker agents, push to ECR, configure memory and endpoints — all from a single module block.
Scheduled and event-driven Lambda execution. Cron expressions, rate schedules, event patterns — connected to the Lambda registry for clean references.
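A rules map using both schedule styles might look like this. The module path, rule names, and the target attribute are illustrative assumptions; the cron and rate expressions use EventBridge's real syntax:

```hcl
# Sketch, assuming targets are resolved through the Lambda registry by key.
module "schedules" {
  source = "../modules/eventbridge"

  rules = {
    nightly-cleanup = {
      schedule = "cron(0 3 * * ? *)"   # 03:00 UTC daily
      target   = "processor"           # Lambda registry key
    }
    heartbeat = {
      schedule = "rate(5 minutes)"
      target   = "notifier"
    }
  }
}
```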
Need one Lambda? Pass function_name and handler. Need twelve? Use the lambdas map instead. The module detects which mode you're using and adjusts internally — no flags, no mode switches.
This works across all major modules: Lambda, DynamoDB, S3. The interface is always the same shape — singular noun for one, plural map for many.
```hcl
module "my-function" {
  source        = "../modules/lambda"
  function_name = "hello-world"
  handler       = "main.handler"
  runtime       = "python3.13"
  source_dir    = "./src"
}
```

```hcl
module "functions" {
  source = "../modules/lambda"

  defaults = {
    runtime   = "python3.13"
    base_path = "../../lambdas/"
  }

  lambdas = {
    api-handler = { source_dir = "api" }
    processor   = { memory_size = 512 }
    notifier    = { timeout = 30 }
  }
}
```
When the Lambda module creates functions, it outputs a functions map — keys are the names you gave them, values contain the ARN, invoke ARN, and function name.
Downstream modules like API Gateway and EventBridge consume this registry. Routes reference functions by key ("create"), not by pasting ARNs around. Add a new function to the map, add a route pointing at its key — done.
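The handoff between modules might look like the sketch below. The functions input on the API module and the exact route keys are assumptions; the registry shape (key to ARN/invoke ARN/name) follows the description above:

```hcl
module "links" {
  source = "../modules/lambda"

  lambdas = {
    create = { source_dir = "create" }
    get    = { source_dir = "get" }
  }
}

module "api" {
  source = "../modules/api-gateway"

  # Assumed input: the registry map, key => { arn, invoke_arn, function_name }
  functions = module.links.functions

  routes = {
    "POST /links"     = "create"   # reference by key, not by ARN
    "GET /links/{id}" = "get"
  }
}
```

Adding a function means adding one entry to each map; no ARNs are copied between files.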
Every multi-mode module accepts a defaults block. These values apply to all resources unless overridden per-resource; the exact merge behavior depends on the value's type.
This means you set runtime = "python3.13" once in defaults, and only the one Node.js function needs to specify its own runtime.
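In practice that override might look like this (the function names are hypothetical; the shape follows the earlier example):

```hcl
defaults = {
  runtime = "python3.13"
}

lambdas = {
  create   = {}                          # inherits python3.13 from defaults
  list     = {}                          # inherits python3.13 from defaults
  streamer = { runtime = "nodejs20.x" }  # per-resource value wins on conflict
}
```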
The workspace wires modules together. Lambda functions feed into API Gateway via the registry, DynamoDB provides persistence, S3 handles storage, and the MCP Gateway + Bedrock Runtime power the AI layer.
Five Lambda functions handle create, get, list, delete, and availability checks. DynamoDB stores the mappings, API Gateway routes traffic with rate limiting at 5 req/s.
A grab-bag API: serve a CV PDF from S3, return the caller's IP and geolocation, fetch local weather, and generate ideas with Bedrock — including a streaming variant for long responses.
An MCP Gateway registers Canvas LMS tools for student queries. Five Docker-based agents run on Bedrock AgentCore — a personal chatbot, idea generator, study tutor, quiz and flashcard creators.
Handles contact form submissions for two domains. A Lambda processes the form, SES sends the email, and Bedrock analyzes the content — all behind API Gateway with CORS for specific origins.
Each agent is a Docker container built locally, pushed to ECR, and deployed as a Bedrock AgentCore runtime. They share the same Terraform module — the only differences are the Dockerfile and the system prompt.
Personal chatbot — knows context about the developer, answers questions conversationally
Multi-agent pipeline: Innovator generates ideas, Critic challenges them, Refiner polishes the survivors
Socratic learning — answers questions with questions, guides understanding instead of giving answers
Creates quizzes from course material, with varied question types and difficulty levels
Generates spaced-repetition flashcards from study content for efficient memorization
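Because the agents share one module, deploying the fleet could reduce to a single map. The module path, agents input, and directory layout below are assumptions for illustration:

```hcl
# Sketch: one module block, one entry per agent; each entry points at a
# directory containing that agent's Dockerfile and system prompt.
module "agents" {
  source = "../modules/bedrock-runtime"

  agents = {
    chatbot    = { source_dir = "agents/chatbot" }
    ideas      = { source_dir = "agents/ideas" }
    tutor      = { source_dir = "agents/tutor" }
    quizzes    = { source_dir = "agents/quizzes" }
    flashcards = { source_dir = "agents/flashcards" }
  }
}
```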
The entire infrastructure is declarative HCL. Version 1.5+ unlocked optional() in variable types, which made the defaults pattern possible without janky workarounds.
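The defaults pattern leans on optional() with a fallback value, which is standard Terraform 1.3+ syntax (the attribute names and fallback values below are illustrative):

```hcl
variable "defaults" {
  type = object({
    runtime     = optional(string, "python3.13")
    memory_size = optional(number, 128)
    timeout     = optional(number, 10)
  })
  default = {}   # callers may omit the block entirely
}
```

Without optional(), every caller would have to pass a complete object, or the module would need null-checking workarounds for each attribute.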
Provider versions are pinned to the 6.x major to avoid breaking changes on terraform init. The ~> (pessimistic) constraint automatically allows newer releases within that major.
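The constraint uses Terraform's standard version syntax; provider name and exact versions here are illustrative of the pattern described:

```hcl
terraform {
  required_version = ">= 1.5"

  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 6.0"   # any 6.x release, never an automatic jump to 7.0
    }
  }
}
```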
Most Lambda functions run Python. Different versions coexist because some depend on libraries that haven't caught up yet — the Canvas MCP server needs 3.12 for canvasapi compatibility.
The streaming API uses Node.js for its RESPONSE_STREAM invoke mode, since Python's Lambda streaming support is less mature; Node 20.x is the runtime for those functions.
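On the Terraform side, streaming is enabled on the function URL via invoke_mode, which is a real attribute of the AWS provider's aws_lambda_function_url resource (the resource names below are illustrative):

```hcl
resource "aws_lambda_function_url" "stream" {
  function_name      = aws_lambda_function.stream_api.function_name
  authorization_type = "AWS_IAM"
  invoke_mode        = "RESPONSE_STREAM"   # default is BUFFERED
}
```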
Lambda and Runtime modules build Docker images locally and push to ECR. The Docker provider handles image hashing so Terraform knows when to rebuild.
DNS validation for ACM certificates goes through Cloudflare. The certificate module creates CNAME records automatically and waits for validation.
Every feature goes through a specification phase before any Terraform gets written. The speckit workflow produces user stories, acceptance criteria, data models, and implementation tasks — all tracked in specs/<feature>/.
User stories and acceptance criteria. What does "done" look like?
Technical research — API limitations, provider quirks, alternative approaches.
Implementation strategy. Which files change, in what order, with what dependencies.
Numbered tasks with IDs, e.g. [X] T001 [P] [US1] Add variable in variables.tf
Tasks get executed, validated with terraform validate, and checked off.