Database Migrations for Serverless
Schema migrations, zero-downtime deployments, and rollback strategies for D1 and serverless databases.
Database migrations in a serverless environment are terrifying. No persistent connections. No transactions spanning multiple Workers. One bad migration can take down your entire system with no easy rollback.
Here's how we handle migrations across D1 databases serving production traffic.
Zero-Downtime Migration Flow
1. Expand: Add new column. Add a nullable column; don't touch existing code.
2. Deploy: Write to both. New code writes to the old AND new columns (see the sketch after this list).
3. Backfill: Migrate data. Copy existing data to the new column.
4. Switch: Read from new. Update reads to use the new column.
5. Contract: Drop old column. Remove the old column after verification.
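To make steps 2 and 4 concrete, here is a minimal sketch of the dual-write and read-switch deployments, using the phone → phone_number rename that Pattern 3 walks through below. The createLead and getLeadPhone helpers (and the file name) are hypothetical; the point is which columns each deployment touches.
dual-write.ts
// Step 2 deployment: every write goes to BOTH the old and new column.
async function createLead(db: D1Database, phone: string): Promise<void> {
  await db
    .prepare('INSERT INTO leads (phone, phone_number) VALUES (?, ?)')
    .bind(phone, phone)
    .run();
}

// Step 4 deployment (after the backfill): reads come only from the new column.
async function getLeadPhone(db: D1Database, id: number): Promise<string | null> {
  return db
    .prepare('SELECT phone_number FROM leads WHERE id = ?')
    .bind(id)
    .first<string>('phone_number');
}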
Pattern 1: Migration File Structure
migrations/
migrations/
├── 0001_initial_schema.sql
├── 0002_add_leads_table.sql
├── 0003_add_property_status.sql
├── 0004_add_user_preferences.sql
└── 0005_add_analytics_events.sql
-- Each migration has a clear name
-- Sequential numbering ensures order
-- Never modify existing migrations
0003_add_property_status.sql
-- Migration: Add status column to properties
-- Author: Justin
-- Date: 2026-01-15
-- Reversible: Yes
-- UP
ALTER TABLE properties
ADD COLUMN status TEXT DEFAULT 'active';
CREATE INDEX idx_properties_status
ON properties(status);
-- DOWN (stored separately)
-- DROP INDEX idx_properties_status;
-- ALTER TABLE properties DROP COLUMN status;
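Workers have no filesystem access at runtime, so if migrations are applied from inside a Worker (as the runner in Pattern 2 does), the SQL has to be bundled into the deploy. Here is a minimal sketch of the MIGRATIONS array that runner consumes, with the SQL inlined as strings and using the Migration interface defined in Pattern 2; the alternative is to skip the in-Worker runner and apply the files with Wrangler's D1 migration tooling.
bundled-migrations.ts
// Bundled migrations, in the same order as the files in migrations/.
// Sketch only: a build step could also import the .sql files as text.
const MIGRATIONS: Migration[] = [
  {
    id: 3,
    name: '0003_add_property_status',
    // One statement per line: D1's exec() treats newlines as statement separators.
    up: [
      "ALTER TABLE properties ADD COLUMN status TEXT DEFAULT 'active';",
      'CREATE INDEX idx_properties_status ON properties(status);',
    ].join('\n'),
    down: [
      'DROP INDEX idx_properties_status;',
      'ALTER TABLE properties DROP COLUMN status;',
    ].join('\n'),
  },
  // ...one entry per migration file
];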
Pattern 2: Migration Runner
migration-runner.ts
interface Migration {
id: number;
name: string;
up: string;
down: string;
}
async function runMigrations(db: D1Database): Promise<void> {
// Ensure migrations table exists
await db.exec(`
CREATE TABLE IF NOT EXISTS _migrations (
id INTEGER PRIMARY KEY,
name TEXT NOT NULL,
applied_at TEXT DEFAULT CURRENT_TIMESTAMP
)
`);
// Get applied migrations
const applied = await db
.prepare('SELECT id FROM _migrations ORDER BY id')
.all();
const appliedIds = new Set(applied.results.map(r => r.id));
// Get pending migrations
const pending = MIGRATIONS.filter(m => !appliedIds.has(m.id));
for (const migration of pending) {
console.log(`Applying migration: ${migration.name}`);
try {
// Run migration
await db.exec(migration.up);
// Record success
await db
.prepare('INSERT INTO _migrations (id, name) VALUES (?, ?)')
.bind(migration.id, migration.name)
.run();
console.log(`✓ Applied: ${migration.name}`);
} catch (error) {
console.error(`✗ Failed: ${migration.name}`, error);
throw error;
}
}
}
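One way to invoke the runner, sketched below: a deliberately simple admin route on a Worker with a D1 binding named DB. The route path and binding name are assumptions, and in production this endpoint needs real authentication.
run-migrations-route.ts
interface Env {
  DB: D1Database; // D1 binding name assumed; configure it in wrangler.toml
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const url = new URL(request.url);
    // Apply pending migrations on demand instead of on every request.
    if (request.method === 'POST' && url.pathname === '/admin/migrate') {
      await runMigrations(env.DB);
      return new Response('Migrations applied');
    }
    return new Response('Not found', { status: 404 });
  },
};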
Pattern 3: Safe Column Rename
Renaming a column requires the expand-contract pattern:
safe-rename.sql
-- WRONG: Direct rename breaks existing code
ALTER TABLE leads RENAME COLUMN phone TO phone_number;
-- RIGHT: Expand-contract pattern
-- Step 1: Add new column (Migration 0010)
ALTER TABLE leads ADD COLUMN phone_number TEXT;
-- Step 2: Deploy code that writes to BOTH columns
-- INSERT INTO leads (phone, phone_number) VALUES (?, ?)
-- Step 3: Backfill existing data (Migration 0011)
UPDATE leads SET phone_number = phone WHERE phone_number IS NULL;
-- Step 4: Deploy code that reads from new column
-- SELECT phone_number FROM leads
-- Step 5: Drop old column (Migration 0012, weeks later)
ALTER TABLE leads DROP COLUMN phone;
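Before shipping the Step 5 contract migration, it's worth confirming the backfill really covered every row. A minimal check for this example:
verify-backfill.ts
// Counts rows the backfill missed. The contract migration (dropping `phone`)
// should only ship when this returns zero.
async function countUnmigratedLeads(db: D1Database): Promise<number> {
  const missing = await db
    .prepare(
      'SELECT COUNT(*) AS n FROM leads WHERE phone_number IS NULL AND phone IS NOT NULL'
    )
    .first<number>('n');
  return missing ?? 0;
}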
Critical Rule
Never drop columns or tables in the same deployment as adding new ones. Wait at least one full deployment cycle to ensure no code references the old schema.
Pattern 4: Backfill with Batching
backfill.ts
// Simple pacing helper used between batches.
const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

async function backfillPhoneNumbers(db: D1Database): Promise<void> {
const BATCH_SIZE = 100;
let processed = 0;
let lastId = 0;
while (true) {
// Get batch of records to update
const batch = await db
.prepare(`
SELECT id, phone FROM leads
WHERE phone_number IS NULL AND id > ?
ORDER BY id LIMIT ?
`)
.bind(lastId, BATCH_SIZE)
.all<{ id: number; phone: string }>();
if (batch.results.length === 0) break;
// Update batch
const updates = batch.results.map(row =>
db.prepare('UPDATE leads SET phone_number = ? WHERE id = ?')
.bind(row.phone, row.id)
);
await db.batch(updates);
processed += batch.results.length;
lastId = batch.results[batch.results.length - 1].id;
console.log(`Backfilled ${processed} records`);
// Small delay to avoid overwhelming the database
await sleep(100);
}
console.log(`Backfill complete: ${processed} total records`);
}
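A backfill like this can outlive a single request, so it helps to run it outside the request path. One option (a sketch, assuming a cron trigger is configured in wrangler.toml) is a scheduled handler that simply re-runs the function; the WHERE phone_number IS NULL filter makes repeated runs idempotent.
backfill-cron.ts
export default {
  // The cron trigger re-runs the backfill until no NULL rows remain.
  async scheduled(
    _controller: unknown,
    env: { DB: D1Database },
    ctx: ExecutionContext
  ): Promise<void> {
    ctx.waitUntil(backfillPhoneNumbers(env.DB));
  },
};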
Migration Checklist
- Test migrations in staging with production-like data
- Always write both UP and DOWN migrations (a rollback sketch follows this checklist)
- Never modify existing migrations; create new ones
- Use expand-contract for column renames/type changes
- Batch large data backfills to avoid timeouts
- Take backups before destructive migrations
- Monitor database performance during migrations
- Keep migration files in version control
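The DOWN scripts earn their keep when something goes wrong. A minimal rollback sketch, assuming the same _migrations table and MIGRATIONS array as Pattern 2:
rollback.ts
async function rollbackLastMigration(db: D1Database): Promise<void> {
  // Find the most recently applied migration.
  const last = await db
    .prepare('SELECT id, name FROM _migrations ORDER BY id DESC LIMIT 1')
    .first<{ id: number; name: string }>();
  if (!last) {
    console.log('Nothing to roll back');
    return;
  }

  const migration = MIGRATIONS.find(m => m.id === last.id);
  if (!migration) {
    throw new Error(`No local definition for migration ${last.id}`);
  }

  // Run the DOWN script, then remove the bookkeeping row.
  await db.exec(migration.down);
  await db.prepare('DELETE FROM _migrations WHERE id = ?').bind(last.id).run();
  console.log(`Rolled back: ${migration.name}`);
}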
Migrations should be boring. Expand slowly, contract carefully, and always have a rollback plan. The goal is zero user-facing errors during schema changes.