bunqueue — High-Performance Job Queue for Bun with SQLite & MCP

bunqueue is a high-performance job queue written in TypeScript, designed specifically for the Bun runtime. Built for AI agents and agentic workflows with a native MCP server.

  • Native Bun - Built from the ground up for Bun, leveraging bun:sqlite for maximum performance
  • Zero Redis - No external dependencies. SQLite provides persistence with WAL mode for concurrent access
  • BullMQ-Compatible API - Familiar patterns if you’re migrating from BullMQ
  • Production Ready - Stall detection, DLQ, rate limiting, webhooks, and S3 backups
  • MCP Server for AI Agents - 73 MCP tools included. AI agents can schedule tasks, manage pipelines, and monitor queues via natural language
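The "retries with backoff" part of production readiness works the way BullMQ users expect: a failed job is re-queued with a growing delay. As a rough illustration of the idea (not bunqueue's actual internals), capped exponential backoff looks like this:

```typescript
// Illustrative only: capped exponential backoff, the usual semantics
// behind "retries with backoff". The delay doubles on each attempt
// and is clamped so late retries never wait longer than capMs.
function backoffDelay(attempt: number, baseMs = 1000, capMs = 60_000): number {
  return Math.min(baseMs * 2 ** attempt, capMs);
}

// Logs the delays 1000, 2000, 4000 and 60000 ms.
console.log([0, 1, 2, 6].map((a) => backoffDelay(a)));
```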
```
┌─────────────────────────────────────────────────────────────┐
│                       bunqueue server                       │
├─────────────────────────────────────────────────────────────┤
│     HTTP API (Bun.serve)     │  TCP Protocol (Bun.listen)   │
├─────────────────────────────────────────────────────────────┤
│                         Core Engine                         │
│  ┌─────────┐  ┌─────────┐  ┌──────────┐  ┌─────────┐        │
│  │ Queues  │  │ Workers │  │ Scheduler│  │   DLQ   │        │
│  └─────────┘  └─────────┘  └──────────┘  └─────────┘        │
├─────────────────────────────────────────────────────────────┤
│    bun:sqlite (WAL mode)     │     S3 Backup (optional)     │
└─────────────────────────────────────────────────────────────┘
```
|             | Embedded            | TCP Server                    |
|-------------|---------------------|-------------------------------|
| Use case    | Single-process apps | Multi-process / microservices |
| Setup       | Zero config         | Run `bunqueue start`          |
| Option      | `embedded: true`    | Default (no option)           |
| Persistence | `DATA_PATH` env var | `--data-path` flag            |

Use bunqueue as a library directly in your application:

```typescript
import { Queue, Worker } from 'bunqueue/client';

// ⚠️ BOTH must have embedded: true
const queue = new Queue('tasks', { embedded: true });

const worker = new Worker('tasks', async (job) => {
  // Process job
}, { embedded: true });
```

Best for:

  • Single-process applications
  • Serverless functions
  • Simple use cases

Run bunqueue as a standalone server:

```sh
# Start the server
bunqueue start --data-path ./data/queue.db
```

Then connect from your application:

```typescript
import { Queue, Worker } from 'bunqueue/client';

// No embedded option = connects to localhost:6789
const queue = new Queue('tasks');

const worker = new Worker('tasks', async (job) => {
  // Process job
});
```

Best for:

  • Multi-process workers
  • Microservices architecture
  • Language-agnostic clients (HTTP API)
| Feature                 | bunqueue         | BullMQ         |
|-------------------------|------------------|----------------|
| Runtime                 | Bun              | Node.js        |
| Storage                 | SQLite           | Redis          |
| External deps           | None             | Redis server   |
| Priority queues         | ✅               | ✅             |
| Delayed jobs            | ✅               | ✅             |
| Retries with backoff    | ✅               | ✅             |
| Cron/repeatable jobs    | ✅               | ✅             |
| Rate limiting           | ✅               | ✅             |
| Stall detection         | ✅               | ✅             |
| Parent-child flows      | ✅               | ✅             |
| DLQ                     | Advanced         | Basic          |
| S3 backups              | ✅               | ❌             |
| Sandboxed workers       | ✅               | ✅             |
| Durable writes          | ✅ (SQLite WAL)  | ✅ (Redis AOF) |
| MCP server (AI agents)  | ✅ (73 tools)    | ❌             |
| Workflow engine         | ✅ (saga, branching, parallel, retry, signals, nested, loops, forEach, map, schema validation, subscribe) | ❌ |
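As a concrete reading of the priority-queue row: in BullMQ, a lower priority number runs first, with FIFO order among jobs of equal priority, and bunqueue presumably matches that for compatibility. A self-contained sketch of the selection rule (illustrative only; a real scheduler would do this with an indexed SQLite query rather than a sort):

```typescript
interface PendingJob { id: number; priority: number; enqueuedAt: number; }

// Pick the next job to run: lowest priority number wins,
// ties break in FIFO (earliest enqueue time) order.
function nextJob(jobs: PendingJob[]): PendingJob | undefined {
  return [...jobs].sort(
    (a, b) => a.priority - b.priority || a.enqueuedAt - b.enqueuedAt,
  )[0];
}

const pending: PendingJob[] = [
  { id: 1, priority: 5, enqueuedAt: 100 },
  { id: 2, priority: 1, enqueuedAt: 200 },
  { id: 3, priority: 1, enqueuedAt: 150 },
];
console.log(nextJob(pending)?.id); // → 3 (priority 1, enqueued earliest)
```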

bunqueue includes a built-in workflow engine for multi-step orchestration. Define workflows with a fluent TypeScript DSL — saga compensation, conditional branching, parallel steps, step retry with backoff, nested sub-workflows, signal timeouts, loops (doUntil/doWhile), forEach iteration, map transforms, schema validation (Zod-compatible), per-execution subscribe, typed observability events, and cleanup/archival. No Temporal, no Inngest, no cloud service required.

```typescript
import { Workflow, Engine } from 'bunqueue/workflow';

const flow = new Workflow('order')
  .step('validate', async (ctx) => { /* ... */ })
  .step('charge', async (ctx) => { /* ... */ }, {
    compensate: async () => { /* auto-rollback on failure */ },
    retry: 3,
  })
  .parallel((w) => w
    .step('notify-warehouse', async () => { /* ... */ })
    .step('send-email', async () => { /* ... */ })
  )
  .forEach(
    (ctx) => (ctx.input as any).items,
    'process-item', async (ctx) => { /* ctx.steps.__item */ },
  )
  .map('summary', (ctx) => ({ total: /* aggregate */ 0 }))
  .waitFor('approval', { timeout: 86400000 })
  .subWorkflow('payment', (ctx) => ({ amount: 99 }))
  .step('ship', async (ctx) => { /* ... */ });

const engine = new Engine({ embedded: true });
engine.on('step:retry', (e) => console.warn(e));
engine.register(flow);
await engine.start('order', { orderId: 'ORD-1' });
```
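The `compensate` option is the saga pattern: when a later step fails, the compensation handlers of the steps that already completed run in reverse order. A minimal self-contained sketch of those semantics (not the engine's actual code):

```typescript
type Step = {
  name: string;
  run: () => Promise<void>;
  compensate?: () => Promise<void>;
};

// Run steps in order; if one throws, run the completed steps'
// compensate handlers in reverse (saga compensation), then rethrow.
async function runSaga(steps: Step[]): Promise<void> {
  const done: Step[] = [];
  for (const step of steps) {
    try {
      await step.run();
      done.push(step);
    } catch (err) {
      for (const s of done.reverse()) await s.compensate?.();
      throw err;
    }
  }
}

const log: string[] = [];
await runSaga([
  { name: 'validate', run: async () => { log.push('validate'); } },
  {
    name: 'charge',
    run: async () => { log.push('charge'); },
    compensate: async () => { log.push('refund'); },
  },
  { name: 'ship', run: async () => { throw new Error('out of stock'); } },
]).catch(() => log.push('failed'));

console.log(log); // charge is rolled back: validate, charge, refund, failed
```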

See the Workflow Engine guide for full documentation.