
Distributed Task Queue

A lightweight distributed task queue built on Redis and TypeScript, designed for reliable background job processing in Node.js microservices.

typescript · redis · node.js · microservices · docker


A production-grade background job processor built around Redis Streams and TypeScript. Designed to replace heavyweight solutions like Bull/BullMQ for teams that want full control over their queue implementation.

Problem

Most teams reach for an off-the-shelf queue library without understanding its failure modes. When jobs silently drop or the queue backs up, debugging is painful because the library’s internals are opaque.

This project was built with the opposite philosophy: every component is explicit, inspectable, and replaceable.

Architecture

Producer ──► Redis Stream ──► Consumer Group ──► Worker Pool
                                                      │
                                                      └──► Dead Letter Queue (failed jobs)

Key design decisions:

  • Redis Streams over Pub/Sub — persistence, consumer groups, and at-least-once delivery built in
  • Consumer groups for horizontal scaling without coordination overhead
  • XCLAIM-based stuck-job recovery — any worker can claim a pending job whose idle time exceeds a configurable threshold
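
The at-least-once semantics these decisions buy can be shown with a small in-memory model (purely illustrative — `MiniStream` is not part of this library; the real implementation talks to Redis): a read moves an entry into a pending set, only an explicit ack removes it, and anything idle too long can be claimed by another worker.

```typescript
type Entry = { id: number; data: string };

// In-memory sketch of Redis Stream + consumer-group semantics.
class MiniStream {
  private entries: Entry[] = [];
  private nextId = 1;
  // Pending Entries List: delivered but not yet acknowledged.
  private pending = new Map<number, { entry: Entry; deliveredAt: number }>();

  add(data: string): number {
    const id = this.nextId++;
    this.entries.push({ id, data });
    return id;
  }

  // Like XREADGROUP: deliver the next new entry and track it as pending.
  read(): Entry | undefined {
    const entry = this.entries.shift();
    if (entry) this.pending.set(entry.id, { entry, deliveredAt: Date.now() });
    return entry;
  }

  // Like XACK: only an explicit ack removes the entry from pending.
  ack(id: number): boolean {
    return this.pending.delete(id);
  }

  // Like XCLAIM: re-deliver entries that have been pending too long.
  claimStuck(minIdleMs: number, now = Date.now()): Entry[] {
    const stuck: Entry[] = [];
    for (const p of this.pending.values()) {
      if (now - p.deliveredAt >= minIdleMs) {
        stuck.push(p.entry);
        p.deliveredAt = now; // reset the idle clock for the new owner
      }
    }
    return stuck;
  }
}
```

Because an entry stays pending until acked, a crashed worker's jobs are eventually re-delivered — the source of both the at-least-once guarantee and the need for idempotent handlers.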

Features

  • At-least-once delivery with configurable retry count and backoff
  • Dead letter queue for failed jobs with full error context
  • Job priority levels (HIGH, NORMAL, LOW queues)
  • Dashboard UI for queue monitoring (built in Astro)
  • Graceful shutdown with in-flight job draining
  • OpenTelemetry spans on every job execution
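
The configurable retry backoff above is the standard exponential-with-jitter pattern. A sketch of that calculation — the function name and defaults here are illustrative, not this library's actual API:

```typescript
// Exponential backoff capped at a maximum, with "full jitter"
// (a random delay in [0, cappedDelay]) so that many jobs failing
// at once don't all retry at the same instant.
function retryDelayMs(
  attempt: number,            // 1 for the first retry
  baseMs = 100,
  maxMs = 30_000,
  rand: () => number = Math.random,
): number {
  const exponential = baseMs * 2 ** (attempt - 1);
  const capped = Math.min(exponential, maxMs);
  return Math.floor(rand() * capped);
}
```

Injecting `rand` keeps the function deterministic under test while using `Math.random` in production.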

Usage

import { Queue, Worker } from '@jensdn/task-queue';

const queue = new Queue({ redis: process.env.REDIS_URL });

// Enqueue a job
await queue.add('send-email', {
  to: 'user@example.com',
  template: 'welcome',
});

// Process jobs
const worker = new Worker('send-email', async (job) => {
  await sendEmail(job.data);
});

worker.on('failed', (job, err) => {
  logger.error('Job failed', { jobId: job.id, error: err.message });
});
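
The graceful shutdown listed under Features reduces to: stop accepting work, then wait for everything in flight to settle. A self-contained sketch of that drain pattern (independent of this library's internals — `InFlightTracker` is illustrative only):

```typescript
// Track every in-flight job promise; refuse new work once draining
// starts; resolve drain() only when all tracked promises have settled.
class InFlightTracker {
  private inFlight = new Set<Promise<unknown>>();
  private draining = false;

  run<T>(job: () => Promise<T>): Promise<T> {
    if (this.draining) {
      return Promise.reject(new Error("shutting down; job rejected"));
    }
    const p = job();
    this.inFlight.add(p);
    // Untrack on settle, whether the job succeeded or failed.
    p.finally(() => this.inFlight.delete(p)).catch(() => {});
    return p;
  }

  // Called on SIGTERM: stop accepting jobs, wait for the rest to finish.
  async drain(): Promise<void> {
    this.draining = true;
    await Promise.allSettled([...this.inFlight]);
  }
}
```

Wired to a signal handler, e.g. `process.on('SIGTERM', () => tracker.drain().then(() => process.exit(0)))`, this is what lets a container be recycled without dropping jobs mid-execution.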

Performance

Benchmarked at 12,000 jobs/second on a single Redis node with 8 worker threads; p99 job-pickup latency under 5 ms.

Deployment

docker compose up -d redis
npm run build
npm start