How to implement serverless event-driven architectures in AWS

Learn the key AWS services to implement a performant and cost-effective serverless event-driven architecture.


Nowadays, one of the most popular ways to build modern applications is to go serverless. In a serverless computing model, developers don't have to worry about infrastructure management, which allows them to focus more on designing and building applications. Organizations can also save significant costs, since serverless resources are charged based on usage. To automate a serverless application, you can make it respond to specific input events. This pattern is known as an event-driven architecture (EDA), and an application built this way is a serverless EDA application.

EDA offers a few key advantages. First, you're able to handle incoming events asynchronously, which allows you to more easily scale your application. By making each component of your application a separate, event-driven microservice, you can also effectively decouple the application. This helps reduce the number of dependencies your components have with each other, and it allows you to build and deploy each component independently.

One of the most popular and basic serverless event-driven workflows involves using an AWS Lambda function to listen for incoming events, then processing those events in your application code. For example, you might have an automated system that handles a shopping order before storing the information about the order in a backend database. This tutorial will explain the basic architecture involved in building such a workflow. You'll also learn about the key AWS services that you need to build scalable, event-driven architectures to handle a variety of real-world applications.

Key AWS Serverless Concepts and Services

To understand the basics of building an AWS serverless event-driven application, you need to learn a few key concepts and services.


The following concepts are referenced frequently throughout this article:

  • Event: In AWS, an event is a JSON object that contains data for your serverless application to process. Your Lambda function will receive this event as an input.
  • Event source: An event source is a producer of an event.
  • Event bus: An event bus is a pipeline that receives events from event sources, and then forwards them on to targets for processing based on rules that you define.  
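When EventBridge delivers an event to a target, it wraps your payload in a standard envelope, with your data under the <span class="code-exp">detail</span> field. A test event like the one sent later in this tutorial would arrive at a target looking roughly like this (the ID, account, and timestamp values are illustrative):

```json
{
  "version": "0",
  "id": "a1b2c3d4-5678-90ab-cdef-example11111",
  "detail-type": "microservice_x_test_event",
  "source": "microservice_x",
  "account": "123456789012",
  "time": "2023-01-01T00:00:00Z",
  "region": "us-east-1",
  "resources": [],
  "detail": {
    "body": {
      "OrderId": "123"
    }
  }
}
```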


In the example application in this tutorial, you'll use the following services:

  • AWS Lambda: AWS Lambda is a serverless, event-driven compute service. In Lambda, you can define functions that run arbitrary code without having to provision any servers.
  • Amazon EventBridge: Amazon EventBridge is an event bus service. As your event bus receives events, it can deliver them to specified targets in real time.
  • Amazon DynamoDB: Amazon DynamoDB is a NoSQL database service. Your Lambda function will record details about each order in a DynamoDB table.

Implementing an Event-Driven Application in AWS

You're going to create a sample event-driven AWS microservice that stores information about incoming online orders. The architecture for this application might look something like this:

Event-driven microservice application that involves an EventBridge event bus, Lambda function, and DynamoDB table
Event-driven microservice application that involves an EventBridge event bus, Lambda function, and DynamoDB table

As shown in the above diagram, there are a few major components in the main microservice:

  • The EventBridge event bus will accept PutEvents API calls from Microservice X. This assumes that Microservice X is the front-end service responsible for accepting new orders and calling PutEvents to send events to your event bus.
  • The Lambda function is responsible for processing the order. The event bus rule routes each matching event to this function, and Lambda invokes it as soon as an event arrives. In this application, the main purpose of this function is simply to store details about the order in a DynamoDB table. However, the function could also invoke another microservice (Microservice Y) that processes the order.
  • The DynamoDB table is your data store for this application. It stores information about past orders that have already been processed.
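To make the first component concrete, here is a sketch of how Microservice X might publish an order with the PutEvents API. The source and detail-type names are assumptions chosen to match the test event used later in this tutorial, and the <span class="code-exp">build_order_entry</span> helper is hypothetical:

```python
import json

def build_order_entry(order, bus_name="default"):
    """Build a PutEvents entry in the shape EventBridge expects."""
    return {
        "Source": "microservice_x",                  # assumed source name
        "DetailType": "microservice_x_test_event",   # assumed detail type
        "Detail": json.dumps({"body": order}),       # Detail must be a JSON string
        "EventBusName": bus_name,
    }

def send_order(order):
    # Requires AWS credentials; boto3 is imported here so the entry
    # builder above stays testable offline.
    import boto3
    client = boto3.client("events")
    response = client.put_events(Entries=[build_order_entry(order)])
    return response["FailedEntryCount"]
```

Keeping the entry construction in a pure function makes it easy to unit test the event shape without touching AWS.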

You can implement this microservice entirely from the AWS console.

Creating Your Execution Role

First, create an IAM role that will serve as your Lambda function's execution role. To do this, go to the Roles page in the IAM console, select Create role, and use the following settings:

  • Trusted entity type: AWS service
  • Use case: Lambda
  • Permissions policies: Add the <span class="code-exp">AmazonDynamoDBFullAccess</span> policy. This policy gives your Lambda function permission to put items into a DynamoDB table (it grants full DynamoDB access, so consider scoping it down to just your table in production). Optionally, you can also add the <span class="code-exp">AWSLambdaBasicExecutionRole</span> policy to grant your function permission to push logs to CloudWatch, which is useful for debugging.
  • Role name: <span class="code-exp">microservice-role</span>

Finally, choose Create role.
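Behind the scenes, choosing AWS service and Lambda as the trusted entity generates a trust policy that allows the Lambda service to assume this role:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "lambda.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
```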

Creating Your DynamoDB Table 

Next, navigate to the Tables page in the DynamoDB console. From there, select Create table and use the following settings:

  • Table name: <span class="code-exp">sample-table</span>
  • Partition key: <span class="code-exp">OrderId</span> (type: String). This represents the ID of an incoming order.

Choose Create table.

Creating Your Lambda Function

Now, you can create your Lambda function. Here is some basic handler code that implements the main functionality you want for your microservice:  
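A minimal sketch of such a handler, written for the Python 3.9 runtime used below, might look like this. The path to the order details (<span class="code-exp">detail.body</span>) is an assumption about how your producers format events; adjust it to match what Microservice X actually sends:

```python
import json

TABLE_NAME = "sample-table"  # the table created earlier in this tutorial

def extract_order(event):
    """Pull the order details out of an EventBridge event envelope."""
    body = event.get("detail", {}).get("body", {})
    if isinstance(body, str):  # some producers send body as a JSON-encoded string
        body = json.loads(body)
    return body

def lambda_handler(event, context):
    # boto3 is always available in the Lambda runtime; importing it here
    # keeps the parsing logic above testable outside AWS.
    import boto3
    order = extract_order(event)
    if "OrderId" not in order:  # OrderId is the table's partition key
        raise ValueError("Event body is missing OrderId")
    table = boto3.resource("dynamodb").Table(TABLE_NAME)
    table.put_item(Item=order)
    return {"statusCode": 200}
```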

This function makes a few assumptions about the structure of the incoming event. Specifically, it assumes that the important details of the event are contained in the <span class="code-exp">body</span> field.

To deploy this function, create an Author from scratch Lambda function from the Lambda console. Name the function <span class="code-exp">sample-function</span>, and make sure you choose runtime Python 3.9. Under Change default execution role, choose Use existing role, and then choose the <span class="code-exp">microservice-role</span> that you created earlier.

After function creation is complete, you can copy and paste the handler code above into the editor. Make sure you select Deploy to deploy these code changes.

Creating an Event Bus Rule

Next, you'll create a new event bus rule. Note that in this section, you're simply going to create a rule that allows any possible event message from Microservice X to arrive at the <span class="code-exp">default</span> event bus. Typically, you'll want to be more restrictive about which events a rule matches.

In the Amazon EventBridge console, select the Create rule button. Name the rule <span class="code-exp">sample-rule</span>, and apply it to the <span class="code-exp">default</span> event bus. For Rule type, choose Rule with an event pattern. Choose Next.

For the Event source, choose All events. Review the warning that the console presents when you select this option. Choose Next.
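If you later tighten this rule, a custom event pattern scoped to just the events used in this tutorial might look like this (the values match the source and detail type of the test event sent below):

```json
{
  "source": ["microservice_x"],
  "detail-type": ["microservice_x_test_event"]
}
```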

On the next page, you're asked to choose the targets for this rule. Choose AWS service, and then look for Lambda function in the dropdown. For Function, select the <span class="code-exp">sample-function</span> you created earlier. This automatically sets up the current event bus as a trigger for your Lambda function, along with all required permissions. Choose Next.

You can skip past the Tags page. After reviewing the details of your rule, choose Create rule.

Testing Your Event-Driven Setup

Now you're ready to test the setup. You can do this directly from the EventBridge console. Navigate to the event bus page of the EventBridge console and choose the <span class="code-exp">default</span> event bus. At the top right corner, choose Send events. Fill out the form with these details:

  • Event bus: <span class="code-exp">default</span>
  • Event source: <span class="code-exp">microservice_x</span>
  • Detail type: <span class="code-exp">microservice_x_test_event</span>

For Event detail, paste the following JSON:
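A payload along these lines works (the field values are illustrative; what matters is that the order details sit under the <span class="code-exp">body</span> field the handler expects):

```json
{
  "body": {
    "OrderId": "123",
    "Item": "A1B2",
    "Amount": 50,
    "IsMember": true,
    "ShippingMethod": "Standard"
  }
}
```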

This is what a typical event message might look like coming from Microservice X. The most important component of this event JSON is the <span class="code-exp">body</span> parameter, which contains all the relevant fields you want to store in your DynamoDB table. It contains the <span class="code-exp">OrderId</span>, which you set as the partition key of your DynamoDB table. It also contains additional fields, such as the ID of the <span class="code-exp">Item</span> the customer ordered, the <span class="code-exp">Amount</span> of the transaction, a boolean <span class="code-exp">IsMember</span> denoting whether or not the customer is a member, and the <span class="code-exp">ShippingMethod</span>.

Choose Send. EventBridge will send this event to your event bus, which then invokes your Lambda function. To verify that this worked, navigate to the DynamoDB console and choose the <span class="code-exp">sample-table</span> you created earlier. You should see that an entry has been added to your table with the same details as the JSON event.

As an important aside, note that you manually sent this event from EventBridge for testing purposes only. In a real-world scenario, Microservice X would produce this event and send it to your event bus, which would then invoke the Lambda function.


In this article, you learned how to configure multiple AWS services to create a microservice that handles incoming order requests. Specifically, you created a Lambda function that listens to an EventBridge event bus, then records information about an order into a DynamoDB table. This is a prime example of an event-driven architecture that works in real time to process incoming event streams.

