Thinking asynchronously: Building a Serverless Event-Driven Architecture on AWS.

In my previous blog post, we explored the concept of event-driven architecture. Now, we will dive into its implementation. We will utilize AWS serverless technologies and examine why serverless and event-driven architecture complement each other well.

Let's start with the basic building block of our architecture: an event.

An event is a message or payload generated by a service, signifying information about an occurrence within the system. For instance, a "UserCreated" event indicates that a user has been created. Each event has an associated payload, which, in this case, could include data such as "userId" and "email." A service's sole responsibility is to generate an event, without concern for who may be interested in that event.
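As a sketch, such a "UserCreated" event could look like the following; the field names and values here are illustrative, not a required schema:

```typescript
// A hypothetical "UserCreated" event; field names and values
// are illustrative, not a fixed schema.
const userCreatedEvent = {
  eventName: 'UserCreated',
  payload: {
    userId: 'u-123',            // example id
    email: 'jane@example.com',  // example email
  },
};
```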

Now, we require a technology that will deliver these events to interested subscribers. This technology will maintain a record of all events, their producers, and the subscribers interested in them. This technology is known as a Router.

In AWS, there are two technologies for this purpose: SNS and EventBridge. We will be using EventBridge in our implementation, as it was specifically designed for EDAs.

So now our architecture flow for a user sign-up looks like this:

  • The producer service (running on AWS Lambda) does some business logic and saves the user info in DynamoDB.

  • In the same process, the producer service generates an EventBridge event using the putEvents SDK command.

  • EventBridge listens for the generated event, matches it against all the rules to see who is interested in this event, and pushes it to the consumers.

  • Consumers listen to this event and carry out their business logic, e.g. sending an email.

import { EventBridgeClient, PutEventsCommand, PutEventsCommandInput } from '@aws-sdk/client-eventbridge';

const eventBridgeClient = new EventBridgeClient({});

async createUser(userData): Promise<void> {
    // ... some validation and business logic
    await userRepository.create(userItem); // insert in db
    // create a custom event for EventBridge -> customEvent
    const params: PutEventsCommandInput = {
        Entries: [
            {
                Source: 'userService',
                EventBusName: 'default',
                DetailType: 'User.Created',
                Detail: JSON.stringify(customEvent),
            },
        ],
    };
    const putEventsCommand = new PutEventsCommand(params);
    await eventBridgeClient.send(putEventsCommand); // send event
}

It seems quite simple, doesn't it? However, it's not quite that straightforward, as there are two conditions we need to handle.

  • Phantom event

  • Lost event

Phantom Event: Imagine we are registering a user, inserting their data into the database, and emitting a "UserCreated" event. However, due to reasons such as network failure, the commit in the database fails. Despite this, we still have a "UserCreated" event for a user that does not exist in the database. This type of event is referred to as a Phantom Event.

Lost Event: Using the same example, we insert the user data into the database, but something goes wrong while emitting the event, and we don't have an event for the corresponding user. This can be extremely critical for business operations.

We need a mechanism to do the following things in a single transaction:

  • Insertion into DynamoDB.

  • Emission of events via EventBridge.

just like how we can insert multiple items into DynamoDB in a transaction. There is no straightforward way to do this via the SDK, as DynamoDB and EventBridge are completely independent systems, so we will develop our own mechanism/pattern. In DynamoDB, we can insert multiple items, whether in the same table or in different tables, in a single transaction. We will leverage this and insert an additional "event" item into a separate table, which we will call an event store. The event item definition looks like this:

    {
        pk: eventId,        // partition key
        sk: timestamp,      // sort key
        event: CustomEvent
    }

where CustomEvent is our definition of an event. This custom event's definition can vary according to your own needs. Here is one custom event's definition.

    {
        headers: {
            eventId: string,
            correlationId: string,
            eventName: string,
            createdBy: string,   // user who created the event
            source: string,      // service which generated the event
            region: string,
            timestamp: number
        },
        payload: Record<string, unknown>
    }
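To make the atomic write concrete, here is a minimal sketch of building the params for DynamoDB's TransactWriteItemsCommand so the user item and the outbox event item succeed or fail together. The table names ('users', 'eventStore') and attribute shapes are assumptions of this example, not prescribed names:

```typescript
// Sketch: build TransactWriteItemsCommand params so the user item and
// the outbox event item are written atomically. Table names and
// attribute shapes are assumptions for illustration.
function buildUserCreatedTransaction(
  userId: string,
  email: string,
  customEvent: Record<string, unknown>
) {
  return {
    TransactItems: [
      {
        Put: {
          TableName: 'users',
          Item: { pk: { S: userId }, email: { S: email } },
        },
      },
      {
        Put: {
          TableName: 'eventStore',
          Item: {
            pk: { S: `evt-${userId}` },                // eventId
            sk: { N: Date.now().toString() },          // timestamp
            event: { S: JSON.stringify(customEvent) }, // serialized CustomEvent
          },
        },
      },
    ],
  };
}

// Usage (hypothetical):
// await dynamoDbClient.send(new TransactWriteItemsCommand(buildUserCreatedTransaction(...)));
```

Because both Put operations sit in one transaction, a database failure can no longer leave behind a phantom event, and a successfully committed user always has its event item.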

Now we need a way to detect each new item in our event store and raise an EventBridge event. This is where DynamoDB Streams come into the picture.

DynamoDB streams provide an ordered flow of information on changes to items in a table. When enabled, it captures data insertions/modifications and writes a stream record with the key attributes of the affected items.

Our architecture so far looks like this:

The DynamoDB stream supports a Lambda listener, which will be responsible for:

  • unmarshalling the stream record to get the CustomEvent

  • calling the EventBridge putEvents command with the above event.
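The mapping step of that listener can be sketched as follows, assuming the outbox item stores the serialized CustomEvent in a string attribute named 'event' (as in the transaction above):

```typescript
// Sketch of the stream listener's mapping step: turn DynamoDB stream
// records into EventBridge PutEvents entries. Assumes the outbox item
// stores the serialized CustomEvent in a string attribute named 'event'.
interface StreamRecord {
  eventName: string; // 'INSERT' | 'MODIFY' | 'REMOVE'
  dynamodb: { NewImage?: { event?: { S: string } } };
}

function toPutEventsEntries(records: StreamRecord[]) {
  return records
    .filter((r) => r.eventName === 'INSERT' && r.dynamodb.NewImage?.event?.S)
    .map((r) => {
      // unmarshall the stored CustomEvent
      const customEvent = JSON.parse(r.dynamodb.NewImage!.event!.S);
      return {
        Source: customEvent.headers.source,
        EventBusName: 'default',
        DetailType: customEvent.headers.eventName,
        Detail: JSON.stringify(customEvent),
      };
    });
}

// The Lambda handler would then call (hypothetical client):
// await eventBridgeClient.send(new PutEventsCommand({ Entries: toPutEventsEntries(event.Records) }));
```

Filtering on 'INSERT' matters: the stream also delivers modifications and deletions, and only brand-new outbox items should become events.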

Once the event is raised from EventBridge, we can set some rules for consumers who are interested in this event. Our final architecture looks like this:

You might be wondering why we are sending the EventBridge event to an SQS queue when we could send it directly to the Lambda. When EventBridge invokes a Lambda, it doesn't track whether the function succeeds or fails. So, to do proper error handling, we send the event to an SQS queue, on which we can configure our error-handling mechanism (retries, a dead-letter queue, and so on). The consumer messaging service fetches events from the queue and performs different business logic, e.g. sending an email.
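One way to sketch such a consumer is with SQS's partial batch responses, so only the messages that failed are redelivered; processMessage here is a placeholder for the real business logic (e.g. sending an email):

```typescript
// Sketch of a consumer Lambda reading a batch from the SQS queue and
// reporting per-message failures via the partial-batch-response shape,
// so SQS redelivers only the failed messages.
interface SqsMessage {
  messageId: string;
  body: string;
}

async function handleBatch(
  messages: SqsMessage[],
  processMessage: (body: string) => Promise<void> // placeholder business logic
) {
  const batchItemFailures: { itemIdentifier: string }[] = [];
  for (const msg of messages) {
    try {
      await processMessage(msg.body);
    } catch {
      // report the failure; SQS will redeliver only this message
      batchItemFailures.push({ itemIdentifier: msg.messageId });
    }
  }
  return { batchItemFailures };
}
```

Note that partial batch responses must be enabled on the event source mapping (ReportBatchItemFailures) for Lambda to honor the returned failure list.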

This pattern of using an event store and listening to each new item in it through a stream, then acting on it, is called the "Transactional Outbox Pattern".

In conclusion, the Transactional Outbox pattern lets us persist state and emit events atomically, eliminating both phantom and lost events. It also effectively decouples the user and messaging services, allowing them to communicate with each other using events.