Development Guides





Queues are an extremely powerful tool when you have thousands of tasks that would otherwise block the main thread, since they can simply be processed in the background.

The default driver that ships with Core provides an in-memory queue capable of processing thousands of jobs, and using your own queue driver is just as easy.
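To make the idea concrete, here is a minimal, dependency-free sketch of what a job queue does (this is not the Core API, just an illustration with invented names): work is collected instead of executed inline, then drained later in FIFO order.

```typescript
type Job = () => void;

class SimpleQueue {
    private jobs: Job[] = [];

    // Enqueue a job instead of executing it immediately.
    public push(job: Job): void {
        this.jobs.push(job);
    }

    // Number of jobs still waiting.
    public size(): number {
        return this.jobs.length;
    }

    // Drain the queue, running each job in FIFO order.
    public drain(): void {
        while (this.jobs.length > 0) {
            const job = this.jobs.shift()!;
            job();
        }
    }
}

const log: string[] = [];
const queue = new SimpleQueue();

queue.push(() => log.push("job 1"));
queue.push(() => log.push("job 2"));
// Nothing has run yet — the caller stays responsive.
queue.drain(); // now both jobs run, in the order they were pushed
```

The real drivers add asynchrony, concurrency control, and lifecycle methods on top of this basic collect-then-process shape.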


Before we start, we need to establish what a few recurring variables and imports in this document refer to when they are used.

import { app, Container, Services } from "@arkecosystem/core-kernel";
  • The app import refers to the application instance which grants access to the container, configurations, system information and more.
  • The Container import refers to a namespace that contains all of the container-specific entities like binding symbols and interfaces.
  • The Services import refers to a namespace that contains all of the core services. This generally will only be needed for type hints as Core is responsible for service creation and maintenance.

Queue Usage

Create an instance

const queue: Queue = app.get<QueueFactory>(Container.Identifiers.QueueFactory)();

Start the queue

await queue.start();

Stop the queue

await queue.stop();

Pause the queue

await queue.pause();

Resume the queue

await queue.resume();

Clear the queue

await queue.clear();

Push a new job onto the default queue

queue.push(() => console.log("Hello World"));

Push a new job onto the default queue after a delay

queue.later(60, () => console.log("Hello World"));
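
Note that `later` defers the *enqueueing* of the job, not its execution relative to other jobs. The in-memory driver shown further down in this guide passes the delay straight to setTimeout, so the unit is milliseconds. A standalone sketch of that behavior (the `pending` array and `later` helper are invented names for illustration):

```typescript
const pending: Array<() => void> = [];

function later(delay: number, fn: () => void): void {
    // Defer pushing the job onto the queue by `delay` milliseconds.
    setTimeout(() => pending.push(fn), delay);
}

later(5, () => console.log("Hello World"));
// Synchronously after the call, the job has not been queued yet;
// it will be pushed once the timer fires.
```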

Push an array of jobs onto the default queue

queue.bulk([
    () => console.log("Hello World"),
    () => console.log("Hello World")
]);

Get the size of the given queue

queue.size();

As explained in a previous article, Core services can be extended because they follow the Manager pattern. Let's go over a quick example of how you could implement your own queue driver.

Implementing the Driver

Implementing a new driver is as simple as importing the queue contract that needs to be satisfied and implementing the methods it specifies.

In this example we will use p-queue, a promise-based queue with concurrency control.

import { Contracts } from "@arkecosystem/core-kernel";
import PQueue from "p-queue";

export class MemoryQueue implements Contracts.Queue.Queue {
    private readonly queue: PQueue = new PQueue({ autoStart: false });

    public async start(): Promise<void> {
        this.queue.start();
    }

    public async stop(): Promise<void> {
        // p-queue has no stop method; pausing and discarding pending jobs
        // is one reasonable interpretation of "stop".
        this.queue.pause();
        this.queue.clear();
    }

    public async pause(): Promise<void> {
        this.queue.pause();
    }

    public async resume(): Promise<void> {
        // p-queue resumes a paused queue via start().
        this.queue.start();
    }

    public async clear(): Promise<void> {
        this.queue.clear();
    }

    public async push<T = any>(fn: () => PromiseLike<T>): Promise<void> {
        this.queue.add(fn);
    }

    public async later<T = any>(delay: number, fn: () => PromiseLike<T>): Promise<void> {
        // The delay is in milliseconds, as it is passed straight to setTimeout.
        setTimeout(() => this.push(fn), delay);
    }

    public async bulk<T = any>(functions: (() => PromiseLike<T>)[]): Promise<void> {
        for (const fn of functions) {
            this.queue.add(fn);
        }
    }

    public size(): number {
        return this.queue.size;
    }
}

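If you want to see what the p-queue dependency is doing for us, here is a dependency-free sketch of the same behavior the driver relies on: jobs accumulate while the queue is paused, and once started they run with a concurrency limit. This is an illustration with invented names (`TinyPromiseQueue`), not the actual p-queue implementation.

```typescript
type Task<T> = () => PromiseLike<T>;

class TinyPromiseQueue {
    private tasks: Array<Task<unknown>> = [];
    private running = 0;
    private paused = true; // mirrors `new PQueue({ autoStart: false })`

    public constructor(private readonly concurrency: number = 1) {}

    public start(): void {
        this.paused = false;
        this.tick();
    }

    public pause(): void {
        this.paused = true;
    }

    public add<T>(task: Task<T>): void {
        this.tasks.push(task);
        this.tick();
    }

    public get size(): number {
        return this.tasks.length; // waiting tasks, not running ones
    }

    // Start waiting tasks while we are below the concurrency limit.
    private tick(): void {
        while (!this.paused && this.running < this.concurrency && this.tasks.length > 0) {
            const task = this.tasks.shift()!;
            this.running++;
            Promise.resolve(task()).then(
                () => { this.running--; this.tick(); },
                () => { this.running--; this.tick(); },
            );
        }
    }
}

const q = new TinyPromiseQueue(2); // run at most two jobs at once
const started: number[] = [];
const blocker = new Promise<void>(() => {}); // never settles: keeps jobs "running"

q.add(() => { started.push(1); return blocker; });
q.add(() => { started.push(2); return blocker; });
q.add(() => { started.push(3); return blocker; });
// Nothing runs until start(), just like the autoStart: false driver above.
q.start(); // two jobs begin immediately; the third waits for a free slot
```

The concurrency limit is what distinguishes a promise queue from simply firing off every promise at once: work is admitted only as slots free up.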
Implementing the Service Provider

Now that we have implemented our memory driver for the queue service, we can create a service provider to register it.

import { Container, Providers, Services } from "@arkecosystem/core-kernel";

export class ServiceProvider extends Providers.ServiceProvider {
    public async register(): Promise<void> {
        const queueManager: Services.Queue.QueueManager = this.app.get<Services.Queue.QueueManager>(
            Container.Identifiers.QueueManager,
        );

        await queueManager.extend("memory", MemoryQueue);
    }
}

  1. We retrieve an instance of the queue manager that is responsible for managing queue drivers.
  2. We call the extend method with a name and the driver implementation, which the manager uses to create queue instances.