Implementing a Work Queue for Background Processing with Amazon SQS

Web application backends often need to perform long-running, compute-intensive tasks. In these cases, you can improve response times and scalability by moving the heavy tasks from the web server to a scalable pool of workers, using a message queue to communicate between the two.

Sam Magura

Web application backends perform most tasks within the context of an HTTP request. For example, the user issues a POST request to change their display name, which causes the backend to update the user record in the database and return a 200 response. While this approach is adequate for most CRUD operations, it is not ideal when:

  1. Performing the operation requires heavy computation,
  2. The operation could take over 20 seconds, so there is a risk that the HTTP request will time out, or
  3. The operation must be automatically retried in the case of failure.

All three of these difficulties arise frequently in application development. For example, long-running and CPU-intensive processes are often required when preparing a report based on a large dataset, or when processing large files like videos, CSV uploads, and lengthy documents. The third constraint (the operation must be retried if it fails) is also common — if you need to send email, SMS, or push notifications, the system should retry sending the notification if a failure occurs.

The solution to all of these problems is to perform the work in the background, using the work queue (or worker queue) pattern. In this pattern, when your application needs to perform a computationally intensive task, it adds the task to a queue instead of starting the work immediately. One or more worker processes subscribe to the queue, listening for messages. When a message is received, the worker dequeues it and performs the task. In addition to solving the difficulties described above, the work queue pattern improves the scalability of your system, since it is easy to increase the number of workers.

In this article, we'll walk through implementing the work queue pattern using a Next.js web app, an Amazon SQS message queue, and workers that run on AWS Lambda. The Next.js app will be deployed to Vercel and communicate with the AWS resources using the AWS TypeScript SDK. We'll store the AWS credentials in the Zero secrets manager and fetch them at runtime using the Zero TypeScript SDK.

🔗 The full code for this example is available in the zerosecrets/examples GitHub repository.

Demo Project Overview

For our demonstration project, we'll build a webpage that allows the user to submit an order for a mock product. The user will enter their name, their address, and the quantity to purchase. When the form is submitted, the data is sent to the backend API.

Our hypothetical system might need to perform several tasks upon receiving the order, like selecting the optimal shipping method and sending a confirmation email, so our backend will send the order data to the SQS queue instead of acting on it straightaway. From there, SQS will trigger our AWS Lambda worker function, which will simply log the order data.


Creating the Next.js App

Let's start by bootstrapping a new Next.js project:

shell
npx create-next-app@latest --experimental-app
   ✔ What is your project named? … web-app
   ✔ Would you like to use TypeScript with this project? … Yes
   ✔ Would you like to use ESLint with this project? … Yes

In app/page.tsx, we'll write an HTML form that requests the user's name, address, and the quantity to purchase. Storing each input's value in a useState hook is recommended, since this gives us easy access to the form data when the form is submitted.
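
Here is a minimal sketch of what that might look like, omitting the submitted/error-message state for brevity (the body of onSubmit is filled in by the next snippet):

typescript
'use client'

import {FormEventHandler, useState} from 'react'

export default function Home() {
  const [name, setName] = useState('')
  const [address, setAddress] = useState('')
  const [quantity, setQuantity] = useState(1)

  const onSubmit: FormEventHandler<HTMLFormElement> = async (e) => {
    e.preventDefault()
    // ...the submit logic shown in the next snippet goes here
  }

  return (
    <form onSubmit={onSubmit}>
      <input value={name} onChange={(e) => setName(e.target.value)} placeholder="Name" />
      <input value={address} onChange={(e) => setAddress(e.target.value)} placeholder="Address" />
      <input type="number" value={quantity} onChange={(e) => setQuantity(Number(e.target.value))} />
      <button type="submit">Place order</button>
    </form>
  )
}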

When the form is complete, it should look something like this:

A form for purchasing a mock product

When the form is submitted, we send the user input to an /api/placeOrder API method and update the UI to show a success message, or an error message if something went wrong.

typescript
const onSubmit: React.FormEventHandler<HTMLFormElement> = async (e) => {
  e.preventDefault()
  setSubmitted(false)

  if (!(name && address && quantity > 0)) {
    setErrorMessage('A required field is missing.')
    return
  }

  try {
    const response = await fetch('/api/placeOrder', {
      method: 'POST',
      body: JSON.stringify({
        productId: PRODUCT_ID,
        name,
        address,
        quantity,
      }),
    })

    if (!response.ok) {
      throw new Error(`Placing the order failed with status code ${response.status}.`)
    }

    setSubmitted(true)
    setErrorMessage(undefined)
  } catch (e) {
    setErrorMessage((e as any).message)
    console.error(e)
  }
}

The backend API method can be implemented as a Next.js API route by creating a new file at pages/api/placeOrder.ts. For now, our API method will just log the request body:

typescript
import {NextApiRequest, NextApiResponse} from 'next'

export default async function handler(req: NextApiRequest, res: NextApiResponse): Promise<void> {
  console.log('The API received an order:')
  console.log(req.body)
  console.log()

  // TODO Send the data to the SQS queue

  res.status(200).send('')
}

At this point, you should be able to submit the form and see the API handler log the name, address, and quantity to the console.
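
If the dev server isn't already running, start it from the web-app directory with the standard Next.js dev script:

shell
npm run dev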

Setting up an SQS Queue using the CDK

The AWS Cloud Development Kit (CDK) is a modern Infrastructure as Code (IaC) framework for AWS. In this section, we'll initialize a new CDK project and use it to create an Amazon SQS queue for the order data from the web app.

Both Amazon SQS and AWS Lambda have generous free tiers, so deploying the demo application to AWS will be absolutely free.

To get started, you'll need to install the AWS CLI if you don't have it already. Once the CLI is installed, run aws configure to connect the CLI to your AWS account.

To bootstrap the CDK project, navigate to the directory that contains web-app and run mkdir sqs-lambda. Then cd into the new directory and run

shell
npx aws-cdk@latest init --language typescript

The AWS resources will be defined in the lib/sqs-lambda-stack.ts file. To define an SQS queue, simply instantiate the sqs.Queue construct:

typescript
import * as cdk from 'aws-cdk-lib'
import * as sqs from 'aws-cdk-lib/aws-sqs'
import {Construct} from 'constructs'

export class SqsLambdaStack extends cdk.Stack {
  constructor(scope: Construct, id: string, props?: cdk.StackProps) {
    super(scope, id, props)

    const queue = new sqs.Queue(this, 'OrderQueue', {
      visibilityTimeout: cdk.Duration.seconds(300),
    })
  }
}

Now we're ready to deploy to the cloud. To do this, execute:

shell
# Only needs to be run the first time
npm run cdk bootstrap

npm run cdk deploy

Once the deployment completes, you'll be able to see the SQS queue in the AWS Console. Make note of the queue's URL and ARN as we will need these to send messages to the queue.

Viewing the SQS queue in the AWS Console
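
If you'd rather not dig through the console for these values, one option is to have the CDK print them after every deployment by adding stack outputs. A small sketch (the output names are arbitrary):

typescript
// Add below the queue definition in lib/sqs-lambda-stack.ts
new cdk.CfnOutput(this, 'OrderQueueUrl', {value: queue.queueUrl})
new cdk.CfnOutput(this, 'OrderQueueArn', {value: queue.queueArn})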

Writing a Lambda Function

The next step is to add an AWS Lambda function to serve as the worker in our work queue implementation. Lambda is a great platform for this since it will automatically scale the number of instances of our function based on the number of messages flowing through the queue. Plus, you only pay for what you use, so you won't be paying for an expensive VM during periods of low usage.

The Lambda function can be defined directly inside our CDK project. Create a new directory called lambda-functions and place the following code in that directory in a file named handle-order.ts.

typescript
import {SQSEvent} from 'aws-lambda'

export function handler(event: SQSEvent): void {
  for (const record of event.Records) {
    const payload = JSON.parse(record.body)

    console.log(payload)
  }
}

The SQSEvent interface comes from the @types/aws-lambda package, which you can install via

shell
npm install --save-dev @types/aws-lambda

Since this is just a proof of concept, the Lambda function isn't particularly interesting. The one noteworthy thing is that we loop over event.Records. This is necessary because a single SQSEvent may contain multiple messages.

Now let's add the Lambda function to the CDK stack:

typescript
import * as lambda from 'aws-cdk-lib/aws-lambda'
import {SqsEventSource} from 'aws-cdk-lib/aws-lambda-event-sources'
import {NodejsFunction} from 'aws-cdk-lib/aws-lambda-nodejs'

// Add the following code directly below where you defined the SQS queue:

const handleOrderFunction = new NodejsFunction(this, 'HandleOrderFunction', {
  entry: 'lambda-functions/handle-order.ts',
  runtime: lambda.Runtime.NODEJS_18_X,
})

// Bind the Lambda function to the SQS queue
queue.grantConsumeMessages(handleOrderFunction)
handleOrderFunction.addEventSource(new SqsEventSource(queue, {}))

Using the NodejsFunction construct instead of lambda.Function is necessary so that the CDK compiles our TypeScript code to plain JavaScript as part of the deployment.
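
The empty options object passed to SqsEventSource is where event-source settings live. For example, if you wanted each invocation of the worker to receive at most five messages, you could set batchSize (a hedged example; the default batch size is 10):

typescript
handleOrderFunction.addEventSource(
  new SqsEventSource(queue, {
    batchSize: 5, // Deliver at most 5 messages per invocation
  }),
)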

When you're ready, run npm run cdk deploy again to deploy the worker function to AWS.

Authorizing the Next.js App to Send Messages

With the queue and Lambda function created, it's time to return to the Next.js app. When the user submits their product order via the form, our Next.js API route handler will run. The API handler should send a message to the SQS queue, which will trigger the worker function.

Sending a message to SQS can be accomplished using the AWS JavaScript SDK, specifically the @aws-sdk/client-sqs package. But before we can send the message, we need to authorize the AWS SDK so that it is allowed to access the SQS queue.
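
If the SQS client package isn't already part of the Next.js project, it can be installed with npm:

shell
npm install @aws-sdk/client-sqs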

We'll authenticate the SDK with AWS by creating an IAM user and an access key for that user. You can read the details about this authentication strategy in the AWS docs. See Appendix A for a step-by-step guide on how to create the IAM user, policy, and access key.

After creating the access key, you'll have an Access Key ID and a Secret Access Key. Both of these values should be stored in Zero so the Next.js app can access them at runtime.

To store these values in Zero, you should:

  1. Log in or create an account at https://tryzero.com/.
  2. Create a new project called work-queue. Copy the project's Zero token to a safe location.
  3. Add a new AWS secret to the project:
    Adding an AWS secret in Zero

Enqueueing a Message from the Next.js App

Now that the AWS credentials are stored securely in Zero, let's add a function to the Next.js app that fetches the credentials at runtime using the Zero TypeScript SDK. The SDK can be installed with

shell
npm install @zerosecrets/zero

Then we can define the credential-fetching function in a new file called src/util/getAwsCredentials.ts:

typescript
import {zero} from '@zerosecrets/zero'

interface AwsCredentials {
  accessKeyId: string
  secretAccessKey: string
}

let credentials: AwsCredentials | undefined

export async function getAwsCredentials(): Promise<AwsCredentials> {
  // Reuse the same credentials if they have already been retrieved, so that we
  // don't call Zero on every request
  if (credentials) {
    return credentials
  }

  if (!process.env.ZERO_TOKEN) {
    throw new Error('Did you forget to set the ZERO_TOKEN environment variable?')
  }

  const secrets = await zero({
    token: process.env.ZERO_TOKEN,
    pick: ['aws'],
  }).fetch()

  if (!secrets.aws) {
    throw new Error('Did not receive an AWS secret.')
  }

  credentials = {
    accessKeyId: secrets.aws.aws_access_key_id,
    secretAccessKey: secrets.aws.aws_secret_access_key,
  }

  return credentials
}

We can create a similar function in src/util/getSqsClient.ts that uses the AWS credentials to create an SQS client:

typescript
import {SQSClient} from '@aws-sdk/client-sqs'
import {getAwsCredentials} from './getAwsCredentials'

let sqsClient: SQSClient | undefined

export async function getSqsClient(): Promise<SQSClient> {
  if (sqsClient) {
    return sqsClient
  }

  sqsClient = new SQSClient({
    region: process.env.MY_AWS_REGION,
    credentials: await getAwsCredentials(),
  })

  return sqsClient
}

MY_AWS_REGION is an environment variable which should be set to the AWS region your queue is in, e.g. us-east-2. This environment variable should be declared in a .env file so that we don't have to specify it each time we run the website. Note that we named the variable MY_AWS_REGION instead of AWS_REGION because AWS_REGION is a reserved environment variable in Vercel.
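
For example, a web-app/.env file along these lines would do (the region shown is only a placeholder):

shell
MY_AWS_REGION=us-east-2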

With the boilerplate out of the way, we can now update the pages/api/placeOrder.ts API route handler to push the order data to the SQS queue. The snippet below replaces the TODO comment inside the handler; it relies on importing SendMessageCommand from @aws-sdk/client-sqs and our getSqsClient helper:

typescript
const sqsClient = await getSqsClient()

// In a real app you should validate the payload before sending it to the
// message queue
const command = new SendMessageCommand({
  QueueUrl: process.env.ORDER_QUEUE_URL,
  MessageBody: JSON.stringify(req.body),
})

await sqsClient.send(command)

console.log('Sent message to SQS.')
console.log()

ORDER_QUEUE_URL is an environment variable that should be set to the URL of your queue in the .env file, for example:

shell
ORDER_QUEUE_URL=https://sqs.us-east-2.amazonaws.com/288519792623/SqsLambdaStack-OrderQueue39B99167-84SxahV3mxBb

Let's test the changes out locally. Copy your Zero token and pass it to the Next.js app by running

shell
ZERO_TOKEN='...' npm run dev

Open the site in your web browser and submit the form. Then navigate to CloudWatch in the AWS Console, click "Log groups", and select the log group for the HandleOrderFunction Lambda function. If everything worked, you should see that the Lambda function logged the order data. 🥳

Viewing the Lambda function logs in CloudWatch

Deploying to Vercel

You may now deploy the Next.js site to Vercel if you wish. Deploying to Vercel is free for hobby projects and extremely easy. After logging into Vercel with your GitHub account, simply click the "Add New..." button and select "Project". Select the correct GitHub repository when prompted and Vercel will start deploying your site.

Deploying a Next.js web app to Vercel

At this point, you should be able to view your site at https://your-project-name.vercel.app. However, submitting an order won't work just yet because we haven't told Vercel about the environment variables needed to run the web app.

To fix this, navigate to the settings page for the Vercel project and select "Environment Variables". Then add each of the following variables so that the production web app can use them:

  • ZERO_TOKEN
  • MY_AWS_REGION
  • ORDER_QUEUE_URL

Vercel does not read .env files by default, which is why we have to manually enter the MY_AWS_REGION and ORDER_QUEUE_URL variables.

Now, if you test the Vercel-hosted version of the site, the form data should flow through the SQS queue to the Lambda function as before.

Cleaning Up

When you're done, remember to run

shell
npm run cdk destroy

in the sqs-lambda directory to delete the AWS resources you created.

Wrapping Up

The work queue pattern is an extremely useful technique for offloading tasks from your main web app onto a collection of workers that run in the background. You should consider using this pattern when you need to perform compute-intensive tasks or you need to retry tasks until they succeed.

This article demonstrated how to implement the work queue pattern using a Next.js web app, an Amazon SQS message queue, and an AWS Lambda worker function. Even if your real app's stack differs from what we used in this article, you can still apply the same general ideas to add background processing to your app.

If implementing the work queue pattern in a production app, it's important to consider what will happen if a worker fails while processing a message. It's usually best to retry processing the message, though you should take care to avoid looping infinitely if there is a message that consistently causes the worker to fail.
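
One common safeguard is to attach a dead-letter queue, so that a message that repeatedly fails is parked for inspection instead of being retried forever. As a sketch of how that could look in our CDK stack (the maxReceiveCount of 3 is arbitrary):

typescript
const deadLetterQueue = new sqs.Queue(this, 'OrderDeadLetterQueue')

const queue = new sqs.Queue(this, 'OrderQueue', {
  visibilityTimeout: cdk.Duration.seconds(300),
  deadLetterQueue: {
    queue: deadLetterQueue,
    // After three failed processing attempts, SQS moves the message to the dead-letter queue
    maxReceiveCount: 3,
  },
})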

Appendix A: Creating an IAM User and Policy in AWS

  1. Navigate to IAM in the AWS Console.
  2. Click "Users" and then "Add users".
  3. Enter order-queue-writer for the user name and check the "Access key" box.
    Adding an IAM user
  4. On the next screen, select "Attach existing policies directly".
  5. Click "Create policy" and create a new policy that only allows the SendMessage operation on the order queue. You'll need the order queue's ARN for this step.
    Creating an IAM policy
  6. Back on the page where you selected "Attach existing policies directly", search for the policy you just created and select it.
    Attaching the IAM policy to the new user
  7. Proceed through the wizard and create the user.

Finally, open the new user in the IAM console, switch to the "Security credentials" tab, and click the "Create access key" button.

