Framework Integrations

glide-mq ships official integration packages for popular Node.js frameworks. Each provides REST API endpoints, real-time SSE event streaming, and framework-native patterns.

Available Integrations

| Package | Framework | Pattern | Install |
| --- | --- | --- | --- |
| @glidemq/hono | Hono | Middleware + typed router | `npm i @glidemq/hono` |
| @glidemq/fastify | Fastify v5 | Two-plugin registration | `npm i @glidemq/fastify` |
| @glidemq/nestjs | NestJS 10+ | Module + decorators + DI | `npm i @glidemq/nestjs` |
| @glidemq/hapi | Hapi.js | Plugin + Joi validation | `npm i @glidemq/hapi` |
| @glidemq/dashboard | Express | Drop-in web UI | `npm i @glidemq/dashboard` |

Common Features

All framework integrations share these capabilities:

  • 24 REST endpoints for job management, queue control, and scheduler CRUD
  • Server-Sent Events (SSE) for real-time job lifecycle streaming
  • Lightweight producers for serverless environments (no Worker overhead)
  • Testing mode with in-memory queues (no Valkey required)
  • Graceful shutdown with connection cleanup
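The lifecycle stream can be consumed with the standard EventSource API. A minimal sketch: the `/mq/events` path and the `jobId`/`state` fields of the event payload are assumptions for illustration - check each integration's page for the actual SSE route and event shape it exposes.

```typescript
// Sketch: subscribe to job lifecycle events over SSE.
// The '/mq/events' path and event payload shape are assumptions;
// see the integration's docs for the real route and schema.
const events = new EventSource('http://localhost:3000/mq/events');

events.onmessage = (event) => {
  const update = JSON.parse(event.data);
  console.log(`job ${update.jobId}: ${update.state}`);
};

// Call events.close() when done to release the connection.
```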

Choosing an Integration

| If you use... | Install |
| --- | --- |
| Hono (edge, Bun, Cloudflare Workers) | @glidemq/hono - type-safe RPC, edge-native |
| Fastify (high-performance Node.js) | @glidemq/fastify - encapsulation-aware, Zod validation |
| NestJS (enterprise, decorators, DI) | @glidemq/nestjs - @Processor, @InjectQueue, full lifecycle |
| Hapi (enterprise, Joi validation) | @glidemq/hapi - Joi schemas, access control, SSE |
| Express (dashboard UI) | @glidemq/dashboard - drop-in web dashboard, no build step |
| Express/Koa/other (API only) | Use the core HTTP Proxy - no integration package needed |

Quick Start

Every integration follows the same pattern - declare queues, register the plugin, get endpoints:

```typescript
// Example with Hono
import { Hono } from 'hono';
import { glideMQ, glideMQApi } from '@glidemq/hono';

const app = new Hono();
const connection = { addresses: [{ host: 'localhost', port: 6379 }] };

app.use('/mq/*', glideMQ({
  connection,
  queues: { emails: { name: 'emails' } },
}));

app.route('/mq', glideMQApi());

export default app;
```
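Jobs can also be enqueued outside of HTTP handlers - from a script, cron task, or serverless function - using the core library directly. A sketch, assuming glide-mq exports a `Queue` class with a BullMQ-style `add(name, data)` method (the Worker counterpart appears later on this page); the `emails` queue name matches the example above:

```typescript
import { Queue } from 'glide-mq';

const connection = { addresses: [{ host: 'localhost', port: 6379 }] };

// Sketch: enqueue onto the same 'emails' queue the middleware declares.
// The add(name, data) signature is an assumption - verify against the core docs.
const emails = new Queue('emails', { connection });

await emails.add('welcome', { to: 'user@example.com' });
```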

See each integration's page for framework-specific setup, configuration options, and examples.

AI Framework Integrations

glide-mq works with popular AI SDKs via its AI-native job primitives. No additional packages are required - use the standard Queue and Worker classes.

| Framework | Pattern | Example |
| --- | --- | --- |
| Vercel AI SDK | `generateText`/`streamText` inside a Worker, `job.stream()` for token output, `job.reportUsage()` for metrics | with-vercel-ai-sdk |
| LangChain | LangChain chains inside a Worker, `job.reportUsage()` from response metadata | with-langchain |

Vercel AI SDK

```typescript
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { Worker } from 'glide-mq';

const connection = { addresses: [{ host: 'localhost', port: 6379 }] };

const worker = new Worker('inference', async (job) => {
  const result = await generateText({
    model: openai('gpt-5.4'),
    prompt: job.data.prompt,
  });

  await job.reportUsage({
    model: 'gpt-5.4',
    tokens: {
      input: result.usage.inputTokens,
      output: result.usage.outputTokens,
    },
  });

  return { content: result.text };
}, { connection });
```
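For streaming responses, the table above pairs `streamText` with `job.stream()` for token output. A sketch: the exact `job.stream()` signature is an assumption here (shown as pushing one text chunk per call) - confirm it against the glide-mq API reference.

```typescript
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { Worker } from 'glide-mq';

const connection = { addresses: [{ host: 'localhost', port: 6379 }] };

const worker = new Worker('inference-stream', async (job) => {
  const result = streamText({
    model: openai('gpt-5.4'),
    prompt: job.data.prompt,
  });

  // Assumption: job.stream() forwards each token chunk to subscribers.
  for await (const chunk of result.textStream) {
    await job.stream(chunk);
  }

  return { content: await result.text };
}, { connection });
```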

LangChain

```typescript
import { ChatOpenAI } from '@langchain/openai';
import { Worker } from 'glide-mq';

const connection = { addresses: [{ host: 'localhost', port: 6379 }] };
const llm = new ChatOpenAI({ model: 'gpt-5.4' });

const worker = new Worker('langchain', async (job) => {
  const response = await llm.invoke(job.data.messages);
  const usage = response.response_metadata?.tokenUsage;

  await job.reportUsage({
    model: 'gpt-5.4',
    tokens: {
      input: usage?.promptTokens ?? 0,
      output: usage?.completionTokens ?? 0,
    },
  });

  return { output: String(response.content) };
}, { connection });
```

See the AI Pipeline Examples for complete, runnable examples with both frameworks.

Released under the Apache-2.0 License.