Event-Driven Architecture Without the PhD — A Practical Guide
Why Events Matter
A client's monolithic e-commerce platform had a checkout flow that took 8 seconds. Not because the payment was slow — because checkout triggered 14 synchronous operations: charge the card, send the confirmation email, update inventory, notify the warehouse, update analytics, sync to the CRM, trigger the loyalty program, update the recommendation engine, send to the affiliate network, log the audit trail, update the dashboard cache, notify the sales team, trigger the post-purchase flow, and generate the invoice.
Fourteen things that all had to finish before the customer saw "Order Confirmed." Most of them didn't need to happen synchronously.
Synchronous vs. Event-Driven
Synchronous Checkout (8 seconds):

```text
Charge card (2s) → Send email (1s) → Update inventory (0.5s)
  → Notify warehouse (1s) → Update analytics (0.3s) → Sync CRM (1.5s)
  → ... 8 more operations → Show confirmation
```
Event-Driven Checkout (0.8 seconds):

```text
Charge card (0.5s) → Update inventory (0.2s) → Show confirmation (0.1s)
  → Emit "order.placed" event

Then, asynchronously:
  → Email service handles "order.placed" → sends email
  → Warehouse service handles "order.placed" → creates shipment
  → Analytics service handles "order.placed" → tracks conversion
  → CRM service handles "order.placed" → syncs customer
  → ... all happen in parallel, in the background
```
The customer waits for the three things that must be synchronous: payment, inventory, and confirmation. Everything else happens after.
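In code, the request path awaits only those three steps and then fires the event without waiting on it. Here's a minimal, self-contained sketch of that shape — `paymentService`, `inventoryService`, and the inline event bus are illustrative stand-ins, not the article's actual services:

```typescript
type Order = { id: string; items: string[]; total: number };

// Hypothetical stand-ins so the sketch runs on its own; in a real app
// these are your payment/inventory services and a shared event bus.
const paymentService = { charge: async (_o: Order) => {} };
const inventoryService = { reserve: async (_o: Order) => {} };

const handlers = new Map<string, Array<(o: Order) => Promise<void>>>();
const eventBus = {
  on(event: string, handler: (o: Order) => Promise<void>) {
    handlers.set(event, [...(handlers.get(event) || []), handler]);
  },
  emit(event: string, order: Order) {
    // Fire-and-forget: the request path does not wait for handlers
    void Promise.allSettled((handlers.get(event) || []).map((h) => h(order)));
  },
};

async function checkout(order: Order): Promise<{ status: string }> {
  await paymentService.charge(order); // must finish before responding
  await inventoryService.reserve(order); // must finish before responding
  eventBus.emit("order.placed", order); // everything else runs in background
  return { status: "confirmed" }; // customer sees this immediately
}
```

The key line is the un-awaited `emit`: the response goes out as soon as payment and inventory succeed, regardless of how long the handlers take.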
The Simple Implementation (No Kafka Required)
You don't need Kafka, RabbitMQ, or any message broker to start with events. Start with an in-process event emitter:
```ts
// events/emitter.ts — Simple in-process event system
type EventHandler<T = any> = (data: T) => Promise<void>;

class EventBus {
  private handlers: Map<string, EventHandler[]> = new Map();

  on(event: string, handler: EventHandler) {
    const existing = this.handlers.get(event) || [];
    this.handlers.set(event, [...existing, handler]);
  }

  async emit(event: string, data: any) {
    const handlers = this.handlers.get(event) || [];
    // Fire all handlers in parallel — don't await in the request path
    void Promise.allSettled(
      handlers.map((handler) =>
        handler(data).catch((err) => {
          console.error(`Handler failed for ${event}:`, err);
          // Log to error tracking, don't crash the request
        })
      )
    );
  }
}

export const eventBus = new EventBus();
```

Register handlers at startup:
```ts
// events/handlers/order.ts
import { eventBus } from "../emitter";
// emailService, warehouseService, and analyticsService are your
// application's own service modules, imported from wherever they live.

eventBus.on("order.placed", async (order) => {
  await emailService.sendOrderConfirmation(order);
});

eventBus.on("order.placed", async (order) => {
  await warehouseService.createShipment(order);
});

eventBus.on("order.placed", async (order) => {
  await analyticsService.trackPurchase(order);
});
```

When to Add a Message Broker
The in-process event bus has real limitations: events live only in memory, so a crash or deploy loses anything in flight, and handlers can only run in the same process. Add a real message broker when you hit these:
| Problem | Solution | Tool |
|---|---|---|
| Handler failures lose events | Persistent message queue | SQS, Redis Streams |
| Need to process across services | Distributed messaging | RabbitMQ, SQS |
| Need event replay/audit trail | Event log with replay | Kafka, EventStore |
| High throughput (>10K events/sec) | Partitioned streaming | Kafka, Kinesis |
| Need exactly-once processing | Transactional outbox | PostgreSQL + worker |
For most applications, SQS or Redis Streams are the right starting point. Kafka is for when you have genuine streaming requirements at scale.
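One way to keep that migration cheap is to hide the transport behind a small interface, so swapping the in-process bus for SQS or Redis Streams later changes one module instead of every handler. This is an illustrative sketch, not code from the article's project:

```typescript
// Illustrative transport abstraction: handlers never know which transport
// delivers events, so upgrading to a broker is a one-file change.
interface EventTransport {
  publish(event: string, payload: unknown): Promise<void>;
  subscribe(event: string, handler: (payload: unknown) => Promise<void>): void;
}

// In-process implementation — same semantics as the simple EventBus above
class InProcessTransport implements EventTransport {
  private handlers = new Map<string, Array<(p: unknown) => Promise<void>>>();

  subscribe(event: string, handler: (payload: unknown) => Promise<void>) {
    const existing = this.handlers.get(event) || [];
    this.handlers.set(event, [...existing, handler]);
  }

  async publish(event: string, payload: unknown) {
    const handlers = this.handlers.get(event) || [];
    await Promise.allSettled(handlers.map((h) => h(payload)));
  }
}

// A broker-backed class (SqsTransport, RedisStreamsTransport, ...) would
// implement the same interface; application code stays untouched.
```

Callers that shouldn't block can still fire-and-forget with `void transport.publish(...)`; the interface just makes the delivery mechanism swappable.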
The Transactional Outbox Pattern
The biggest risk with events: what if the database transaction succeeds but the event fails to publish? Or vice versa? The transactional outbox pattern solves this:
```ts
// Write the event to the database in the SAME transaction as the business data
async function placeOrder(orderData: CreateOrderInput) {
  return await db.transaction(async (tx) => {
    // 1. Create the order
    const order = await tx.orders.create(orderData);

    // 2. Write the event to an outbox table (same transaction!)
    await tx.outboxEvents.create({
      eventType: "order.placed",
      payload: JSON.stringify(order),
      createdAt: new Date(),
      processed: false,
    });

    return order;
  });
  // Transaction commits → both order AND event are saved atomically
}

// Separate worker polls the outbox and publishes events
async function processOutbox() {
  const events = await db.outboxEvents.findMany({
    where: { processed: false },
    orderBy: { createdAt: "asc" },
    take: 100,
  });

  for (const event of events) {
    try {
      // Publish first, then mark processed. If the mark fails, the event
      // is published again on the next poll — which is why handlers must
      // be idempotent (at-least-once delivery).
      await messageQueue.publish(event.eventType, JSON.parse(event.payload));
      await db.outboxEvents.update({
        where: { id: event.id },
        data: { processed: true, processedAt: new Date() },
      });
    } catch (err) {
      console.error(`Failed to publish outbox event ${event.id}:`, err);
      // Leave the event unprocessed; it will be retried on the next poll
    }
  }
}
```

Idempotency: The Non-Negotiable Rule
Events can be delivered more than once (network retries, worker restarts). Every handler must be idempotent:
```ts
// BAD: Not idempotent — sends duplicate emails
eventBus.on("order.placed", async (order) => {
  await emailService.send(order.email, "confirmation", order);
});

// GOOD: Idempotent — checks before processing
eventBus.on("order.placed", async (order) => {
  const alreadySent = await db.emailLog.exists({
    orderId: order.id,
    template: "confirmation",
  });
  if (alreadySent) return; // Skip duplicate

  await emailService.send(order.email, "confirmation", order);
  await db.emailLog.create({
    orderId: order.id,
    template: "confirmation",
    sentAt: new Date(),
  });
  // For strict guarantees under concurrent workers, back this check with a
  // unique constraint on (orderId, template) rather than check-then-write
});
```

The Results
That 8-second checkout? After implementing event-driven architecture:
| Metric | Before | After |
|---|---|---|
| Checkout response time | 8.2s | 0.8s |
| Cart abandonment rate | 34% | 22% |
| Failed background tasks | Invisible (no retry) | 0.02% (with retry) |
| Time to add new post-checkout integration | 2-3 days | 2 hours |
The biggest win wasn't the speed — it was the decoupling. Adding a new post-checkout action (like a referral trigger) went from a multi-day task touching the checkout code to a 2-hour task writing a new event handler. No checkout code modified.
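As a sketch, that referral trigger is nothing more than one new handler. In the real app you'd import the shared `eventBus` from `events/emitter`; here a minimal inline stub and a hypothetical `referralService` stand in so the example runs on its own:

```typescript
type Order = { id: string; referrerId?: string };

// Hypothetical referral service — in a real app, your own module
const credited: string[] = [];
const referralService = {
  creditReferrer: async (order: Order) => {
    if (order.referrerId) credited.push(order.referrerId);
  },
};

// Minimal stand-in for the shared eventBus from events/emitter
const handlers: Array<(o: Order) => Promise<void>> = [];
const eventBus = {
  on: (_event: string, h: (o: Order) => Promise<void>) => handlers.push(h),
  emit: async (_event: string, o: Order) =>
    Promise.allSettled(handlers.map((h) => h(o))),
};

// The entire "integration" — no checkout code touched
eventBus.on("order.placed", async (order) => {
  await referralService.creditReferrer(order);
});
```

The checkout flow keeps emitting the same `order.placed` event it always did; the new behavior is purely additive.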
Start with the simple in-process event bus. Migrate to a message broker when you outgrow it. The architecture pattern stays the same — only the transport changes.