Infrastructure-as-Code

Eight modules.
Two workspaces.
One composable system.

A Terraform repository that deploys URL shorteners, AI agents, personal APIs, and email services — all from the same set of reusable building blocks.

modules/       (8 modules)
  lambda/
  api-gateway/
  dynamodb/
  s3/
  gateway/
  runtime/
workspace/     (2 envs)
lambdas/       (4 projects)
agents/        (5 agents)
specs/         (8 specs)

Every module speaks the same language

Each module supports two modes: a simple single-resource interface for quick deploys, and a multi-resource mode with shared defaults for managing fleets. Same API shape, same patterns — just scale up when you need to.

Lambda

Unified function deployment — zip archives, Docker containers, or pre-built images. Handles ECR pushes, layer builds, function URLs, and IAM automatically.

zip · docker · container · layers · vpc

API Gateway

HTTP API v2 with declarative route maps. Point routes at Lambda functions by registry key; attach JWT authorizers, Cloudflare-managed custom domains, and throttling limits.

routes · jwt · cors · domains
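As a sketch of the route-map idea, a declarative map keyed by registry name might read like this (the attribute names are illustrative, not the module's confirmed interface; the route keys match the URL shortener described later):

```hcl
module "api" {
  source = "../modules/api-gateway"

  # Routes point at Lambda registry keys, never raw ARNs.
  routes = {
    "POST /urls"   = { lambda = "create" }
    "GET /{id}"    = { lambda = "get" }
    "DELETE /{id}" = { lambda = "delete", jwt_authorizer = true }
  }

  # Registry output from the Lambda module
  lambda_functions = module.functions.functions
}
```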

DynamoDB

Tables with string shorthand for keys — write partition_key = "id" instead of a full attribute block. GSIs, streams with Lambda triggers, TTL, encryption.

gsi · streams · ttl · pitr
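A hypothetical table in this shorthand style (table_name, ttl_attribute, and the stream flag are illustrative names, not the module's confirmed interface):

```hcl
module "urls" {
  source = "../modules/dynamodb"

  table_name    = "url-mappings"
  partition_key = "id"          # string shorthand for the full attribute block

  ttl_attribute  = "expires_at" # expire short links automatically
  stream_enabled = true         # fan out changes to a Lambda trigger
}
```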

S3

Buckets with lifecycle rules, CORS, static website hosting, and directory uploads. Versioning with optional MFA delete, AES256 or KMS encryption.

versioning · lifecycle · website · cors
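In the same spirit, a bucket sketch (attribute names here are assumptions about the module's interface):

```hcl
module "assets" {
  source = "../modules/s3"

  bucket_name = "example-assets"
  versioning  = true            # optionally with MFA delete

  lifecycle_rules = [{
    id              = "expire-old-versions"
    noncurrent_days = 30        # drop noncurrent versions after a month
  }]
}
```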

API Gateway Certificate

ACM certificates that validate themselves. Create a cert, point it at Cloudflare or Route53, and the module waits until DNS propagation confirms ownership.

acm · cloudflare · route53

MCP Gateway

Model Context Protocol gateway for AI tool orchestration. Registers Lambda functions as MCP tools with schemas, supports semantic search and JWT auth.

mcp · ai-tools · schemas

Bedrock Runtime

Amazon Bedrock AgentCore deployment. Build Docker agents, push to ECR, configure memory and endpoints — all from a single module block.

bedrock · docker · ecr · agents

EventBridge

Scheduled and event-driven Lambda execution. Cron expressions, rate schedules, event patterns — connected to the Lambda registry for clean references.

cron · events · schedules
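A schedule sketch in that style (argument names are illustrative; the cron expression uses AWS's six-field form):

```hcl
module "schedules" {
  source = "../modules/eventbridge"

  schedules = {
    nightly-cleanup = {
      schedule_expression = "cron(0 3 * * ? *)" # 03:00 UTC daily
      lambda              = "processor"         # registry key, not an ARN
    }
  }

  lambda_functions = module.functions.functions
}
```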

Three patterns that keep 2,000+ lines of HCL manageable

Single vs Multi Mode

Need one Lambda? Pass function_name and handler. Need twelve? Use the lambdas map instead. The module detects which mode you're using and adjusts internally — no flags, no mode switches.

This works across all major modules: Lambda, DynamoDB, S3. The interface is always the same shape — singular noun for one, plural map for many.

module "my-function" {
  source        = "../modules/lambda"
  function_name = "hello-world"
  handler       = "main.handler"
  runtime       = "python3.13"
  source_dir    = "./src"
}

module "functions" {
  source = "../modules/lambda"

  defaults = {
    runtime   = "python3.13"
    base_path = "../../lambdas/"
  }

  lambdas = {
    api-handler = { source_dir = "api" }
    processor   = { memory_size = 512 }
    notifier    = { timeout = 30 }
  }
}
Lambda Module Output (functions registry)
  create → arn:aws:lambda:...
  get    → arn:aws:lambda:...
  delete → arn:aws:lambda:...

API Gateway Routes
  POST   /urls → create
  GET    /{id} → get
  DELETE /{id} → delete

The Registry Pattern

When the Lambda module creates functions, it outputs a functions map — keys are the names you gave them, values contain the ARN, invoke ARN, and function name.

Downstream modules like API Gateway and EventBridge consume this registry. Routes reference functions by key ("create"), not by pasting ARNs around. Add a new function to the map, add a route pointing at its key — done.
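In outline, the registry contract could be produced like this (resource and output names are assumptions; the arn, invoke_arn, and function_name fields follow the description above):

```hcl
# Hypothetical sketch of modules/lambda/outputs.tf
output "functions" {
  value = {
    for name, fn in aws_lambda_function.this : name => {
      arn           = fn.arn
      invoke_arn    = fn.invoke_arn
      function_name = fn.function_name
    }
  }
}

# A downstream module then resolves a key instead of an ARN:
#   module.functions.functions["create"].invoke_arn
```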

The Defaults Pattern

Every multi-mode module accepts a defaults block. These values apply to all resources unless overridden per-resource. The merge logic depends on type:

  • Maps — deep merged, per-resource keys win
  • Lists — concatenated (defaults first, then resource-specific)
  • Scalars — per-resource overrides completely

This means you set runtime = "python3.13" once in defaults, and only the one Node.js function needs to specify its own runtime.

defaults:      runtime = "python3.13" · memory_size = 256 · timeout = 10 · tags = { env = "prod" }
per-resource:  memory_size = 512 · tags = { team = "api" }
result:        runtime = "python3.13" · memory_size = 512 · timeout = 10 · tags = { env = "prod", team = "api" }
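The merge could be sketched in HCL along these lines (a simplification: only tags is shown deep-merged, and list concatenation via concat() is omitted):

```hcl
locals {
  resolved = {
    for name, cfg in var.lambdas : name => merge(
      var.defaults, # scalars: per-resource keys in cfg win the merge
      cfg,
      {
        # maps such as tags are deep-merged, per-resource keys winning
        tags = merge(try(var.defaults.tags, {}), try(cfg.tags, {}))
      }
    )
  }
}
```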

Module composition in the main workspace

The workspace wires modules together. Lambda functions feed into API Gateway via the registry, DynamoDB provides persistence, S3 handles storage, and the MCP Gateway + Bedrock Runtime power the AI layer.

The main/ and einar/ workspaces compose the Lambda, API GW, DynamoDB, S3, MCP GW, and Runtime modules, which in turn provision Functions, HTTP APIs, Tables, Buckets, Gateways, and Agents.

Two workspaces, four distinct services

t0mas.io (main workspace)

URL Shortener

Five Lambda functions handle create, get, list, delete, and availability checks. DynamoDB stores the mappings, API Gateway routes traffic with rate limiting at 5 req/s.

5 Lambdas · DynamoDB · API Gateway
api.tomas.im (main workspace)

Personal Gateway

A grab-bag API: serve a CV PDF from S3, return the caller's IP and geolocation, fetch local weather, and generate ideas with Bedrock — including a streaming variant for long responses.

5 Lambdas · S3 · Bedrock
Bedrock AgentCore (main workspace)

AI Infrastructure

An MCP Gateway registers Canvas LMS tools for student queries. Five Docker-based agents run on Bedrock AgentCore — a personal chatbot, idea generator, study tutor, quiz and flashcard creators.

MCP Gateway · 5 Agents · Canvas LMS
api.eignatjon.is (einar workspace)

Email Sender API

Handles contact form submissions for two domains. A Lambda processes the form, SES sends the email, and Bedrock analyzes the content — all behind API Gateway with CORS for specific origins.

Lambda · SES · Bedrock

Five agents, all deployed from the same module

Each agent is a Docker container built locally, pushed to ECR, and deployed as a Bedrock AgentCore runtime. They share the same Terraform module — the only differences are the Dockerfile and the system prompt.

Me

Personal chatbot — knows context about the developer, answers questions conversationally

Idea Agent

Multi-agent pipeline: Innovator generates ideas, Critic challenges them, Refiner polishes the survivors

Study Tutor

Socratic learning — answers questions with questions, guides understanding instead of giving answers

Quiz Generator

Creates quizzes from course material, with varied question types and difficulty levels

Flashcards

Generates spaced-repetition flashcards from study content for efficient memorization

Idea Agent Pipeline: Innovator (generates raw ideas) → Critic (challenges & probes) → Refiner (polishes survivors)

The technology choices and why they exist

Terraform >= 1.5.0

The entire infrastructure is declarative HCL. Pinning 1.5+ guarantees optional() with defaults in variable types (generally available since 1.3), which made the defaults pattern possible without janky workarounds.
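For illustration, optional() lets a multi-mode variable leave per-resource fields unset so defaults can fill them in (field names are illustrative):

```hcl
variable "lambdas" {
  type = map(object({
    source_dir  = string
    runtime     = optional(string)      # falls back to defaults.runtime
    memory_size = optional(number, 256) # optional() with an inline default
    timeout     = optional(number, 10)
  }))
  default = {}
}
```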

AWS Provider ~> 6.27

Pinned to the 6.x major to avoid surprise breaking changes on terraform init. The pessimistic constraint (~> 6.27) accepts any newer 6.x release automatically while blocking an upgrade to 7.0.

Python 3.10 / 3.12 / 3.13

Most Lambda functions run Python. Different versions coexist because some depend on libraries that haven't caught up yet — the Canvas MCP server needs 3.12 for canvasapi compatibility.

Node.js 18.x / 20.x / 22.x

The streaming API uses Node.js for its RESPONSE_STREAM invoke mode — Python's Lambda streaming support is less mature. Node 20.x is the primary runtime.

Docker Provider >= 3.6.0

Lambda and Runtime modules build Docker images locally and push to ECR. The Docker provider handles image hashing so Terraform knows when to rebuild.

Cloudflare Provider ~> 5.15

DNS validation for ACM certificates goes through Cloudflare. The certificate module creates CNAME records automatically and waits for validation.
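Taken together, a required_providers block consistent with these pins would read roughly:

```hcl
terraform {
  required_version = ">= 1.5.0"

  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 6.27"
    }
    docker = {
      source  = "kreuzwerker/docker"
      version = ">= 3.6.0"
    }
    cloudflare = {
      source  = "cloudflare/cloudflare"
      version = "~> 5.15"
    }
  }
}
```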

Features start as specs, not code

Every feature goes through a specification phase before any Terraform gets written. The speckit workflow produces user stories, acceptance criteria, data models, and implementation tasks — all tracked in specs/<feature>/.

1. spec.md: User stories and acceptance criteria. What does "done" look like?

2. research.md: Technical research, covering API limitations, provider quirks, and alternative approaches.

3. plan.md: Implementation strategy. Which files change, in what order, with what dependencies.

4. tasks.md: Numbered tasks with IDs, e.g. [X] T001 [P] [US1] Add variable in variables.tf

5. Implementation: Tasks get executed, validated with terraform validate, and checked off.