Building Serverless Apps using the Serverless Stack Framework
17 June 2021
Updated: 03 September 2023
Prior to doing any of the below, you will need your ~/.aws/credentials file to be configured with the credentials for your AWS account
Serverless Stack Framework
The Serverless Stack (SST) Framework is built on top of the AWS CDK for working with Lambdas and other CDK constructs
It provides an easy CDK setup and a streamlined debug and deploy process, and even integrates with the VSCode debugger so you can debug stacks running on AWS
Init Project
To init a new project use the following command:
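As a sketch (the starter name and flags may differ between SST versions, and my-sst-app is just an example app name):

```sh
npx create-serverless-stack@latest my-sst-app --language typescript
```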
This will create a Serverless Stack application using TypeScript
Run the App
You can run the created project using the config defined in the sst.json file
The following command will build and then deploy a dev stack, allowing you to interact with it via AWS/browser/Postman/etc.:
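```sh
# from inside the app directory created above (my-sst-app is just the example name)
cd my-sst-app
npm start
```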
Additionally, the above command starts the application with hot reloading enabled, so when you save files the corresponding AWS resources are redeployed and you can continue testing
The Files
The application is structured like a relatively normal Lambda/CDK app, with a lib directory which contains the following CDK code:
Stack
lib/index.ts
lib/MyStack.ts
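At the time of writing, the generated lib/index.ts is simply the entry point that instantiates the stack - roughly this (a sketch, with "my-stack" as the generated stack name):

```ts
import * as sst from "@serverless-stack/resources";
import MyStack from "./MyStack";

// app entry point: create the stack(s) that make up the application
export default function main(app: sst.App): void {
  new MyStack(app, "my-stack");
}
```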
And a src directory which contains the Lambda code:
src/lambda.ts
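The generated handler is a simple hello-world function, roughly:

```ts
import { APIGatewayProxyHandlerV2 } from "aws-lambda";

// default handler wired up to GET / in MyStack.ts
export const handler: APIGatewayProxyHandlerV2 = async (event) => {
  return {
    statusCode: 200,
    headers: { "Content-Type": "text/plain" },
    body: `Hello, World! Your request was received at ${event.requestContext.time}.`,
  };
};
```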
Add a new Endpoint
Using the defined constructs it’s really easy for us to add an additional endpoint:
src/hello.ts
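A sketch of the new handler - the response text is arbitrary, and I'm assuming it will be wired up to a GET /hello route below:

```ts
import { APIGatewayProxyHandlerV2 } from "aws-lambda";

// handler for the additional endpoint
export const handler: APIGatewayProxyHandlerV2 = async () => {
  return {
    statusCode: 200,
    headers: { "Content-Type": "text/plain" },
    body: "Hello from the new endpoint",
  };
};
```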
And then in the stack we just update the routes:
lib/MyStack.ts
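Something along these lines, assuming the new endpoint is served on GET /hello:

```ts
routes: {
  "GET /": "src/lambda.handler",
  "GET /hello": "src/hello.handler",
},
```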
So that the full stack looks like this:
lib/MyStack.ts
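As a sketch (SST 0.x constructs, including the output block the starter template provides):

```ts
import * as sst from "@serverless-stack/resources";

export default class MyStack extends sst.Stack {
  constructor(scope: sst.App, id: string, props?: sst.StackProps) {
    super(scope, id, props);

    // HTTP API with the default route plus the new /hello endpoint
    const api = new sst.Api(this, "Api", {
      routes: {
        "GET /": "src/lambda.handler",
        "GET /hello": "src/hello.handler",
      },
    });

    // print the API endpoint after deploy
    this.addOutputs({
      ApiEndpoint: api.url,
    });
  }
}
```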
VSCode Debugging
SST supports VSCode debugging; all that’s required is for you to create a .vscode/launch.json file with the following content:
.vscode/launch.json
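A minimal sketch along the lines of what the SST docs suggest - the exact fields may vary with your VSCode and SST versions:

```json
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Debug SST Start",
      "type": "node",
      "request": "launch",
      "runtimeExecutable": "npm",
      "runtimeArgs": ["start"],
      "console": "integratedTerminal",
      "skipFiles": ["<node_internals>/**"]
    }
  ]
}
```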
This will then allow you to run Debug SST Start, which will configure the AWS resources using the npm start command and connect the debugger so you can debug your functions locally while still making use of the automated function deployment
Add a Database Table
We can define our table using the sst.Table class:
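A sketch of the table, keyed on userId and noteId (SST 0.x API):

```ts
// DynamoDB table for notes, keyed on userId (partition) + noteId (sort)
const table = new sst.Table(this, "Notes", {
  fields: {
    userId: sst.TableFieldType.STRING,
    noteId: sst.TableFieldType.STRING,
  },
  primaryIndex: { partitionKey: "userId", sortKey: "noteId" },
});
```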
Next, we can add some endpoint definitions for the functions we’ll create as well as access to the table name via the environment:
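Roughly like this - the route shapes and handler paths are assumptions for illustration; they just need to match the handler files we create later:

```ts
const api = new sst.Api(this, "Api", {
  // every route's function gets the table name via the environment
  defaultFunctionProps: {
    environment: {
      tableName: table.dynamodbTable.tableName,
    },
  },
  routes: {
    "POST /notes": "src/notes/create.handler",
    "GET /notes/{userId}": "src/notes/getAll.handler",
    "GET /notes/{userId}/{noteId}": "src/notes/get.handler",
  },
});
```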
And lastly, we can grant our API permission to access the table:
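```ts
// allow the API's functions to read and write the table
api.attachPermissions([table]);
```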
Adding the above to the MyStack.ts file results in the following:
Before we go any further, we need to install some dependencies in our app, particularly uuid for generating unique IDs for notes. We can install it with:
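```sh
npm install uuid
# the type definitions are useful too since we're using TypeScript
npm install --save-dev @types/uuid
```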
Define Common Structures
We’ll also create some general helper functions for returning responses of different types. You can view their files below; these just wrap the response in a status code and headers, as well as stringify the body
src/responses/successResponse.ts
src/responses/badRequestResponse.ts
src/responses/internalErrorResponse.ts
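As a sketch, successResponse looks something like the following; badRequestResponse and internalErrorResponse follow the same pattern with 400 and 500 status codes respectively:

```ts
// src/responses/successResponse.ts
export const successResponse = (body: unknown) => ({
  statusCode: 200,
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(body),
});
```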
And we’ve also got a Note type which will be the data that gets stored/retrieved:
src/notes/Note.ts
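Something like this - the userId/noteId keys match the table definition; the remaining fields are just example data for a note:

```ts
// src/notes/Note.ts
export interface Note {
  userId: string;
  noteId: string;
  content: string;
  createdAt: number;
}
```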
Access DB
Once we’ve got a DB table defined as above, we can then access the table to execute different queries
We would create a DB object instance using:
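```ts
import { DynamoDB } from "aws-sdk";

// DocumentClient lets us work with plain JS objects instead of raw DynamoDB attribute maps
const db = new DynamoDB.DocumentClient();
```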
Create
Create is the simplest of the database functions for us to implement; it uses the db.put function with the Item to save, which is of type Note:
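A sketch, with tableName coming from the environment we configured in the stack:

```ts
const create = async (note: Note): Promise<Note> => {
  await db
    .put({
      TableName: process.env.tableName as string,
      Item: note,
    })
    .promise();

  return note;
};
```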
Get
We can implement a getOne function by using db.get and providing the full Key, consisting of the userId and noteId:
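A sketch:

```ts
const getOne = async (userId: string, noteId: string): Promise<Note | undefined> => {
  const result = await db
    .get({
      TableName: process.env.tableName as string,
      Key: { userId, noteId },
    })
    .promise();

  return result.Item as Note | undefined;
};
```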
GetAll
We can implement a getByUserId function which will make use of db.query and use the ExpressionAttributeValues to populate the KeyConditionExpression as seen below:
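A sketch:

```ts
const getByUserId = async (userId: string): Promise<Note[]> => {
  const result = await db
    .query({
      TableName: process.env.tableName as string,
      KeyConditionExpression: "userId = :userId",
      ExpressionAttributeValues: { ":userId": userId },
    })
    .promise();

  return (result.Items ?? []) as Note[];
};
```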
Define Lambdas
Now that we know how to write data to Dynamo, we can implement the following files for the endpoints we defined above:
Create
src/notes/create.ts
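A sketch of the create handler - the request body shape (userId + content) and the generated noteId/createdAt fields are assumptions:

```ts
import { APIGatewayProxyHandlerV2 } from "aws-lambda";
import { DynamoDB } from "aws-sdk";
import { v4 as uuid } from "uuid";
import { Note } from "./Note";
import { successResponse } from "../responses/successResponse";
import { badRequestResponse } from "../responses/badRequestResponse";
import { internalErrorResponse } from "../responses/internalErrorResponse";

const db = new DynamoDB.DocumentClient();

// save a Note to the table
const create = async (note: Note): Promise<Note> => {
  await db
    .put({ TableName: process.env.tableName as string, Item: note })
    .promise();

  return note;
};

export const handler: APIGatewayProxyHandlerV2 = async (event) => {
  if (!event.body) {
    return badRequestResponse("a request body is required");
  }

  try {
    const data = JSON.parse(event.body);

    const note = await create({
      userId: data.userId,
      noteId: uuid(),
      content: data.content,
      createdAt: Date.now(),
    });

    return successResponse(note);
  } catch (err) {
    return internalErrorResponse("unable to create note");
  }
};
```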
Get
src/notes/get.ts
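A sketch, assuming the GET /notes/{userId}/{noteId} route from the stack above:

```ts
import { APIGatewayProxyHandlerV2 } from "aws-lambda";
import { DynamoDB } from "aws-sdk";
import { Note } from "./Note";
import { successResponse } from "../responses/successResponse";
import { badRequestResponse } from "../responses/badRequestResponse";
import { internalErrorResponse } from "../responses/internalErrorResponse";

const db = new DynamoDB.DocumentClient();

export const handler: APIGatewayProxyHandlerV2 = async (event) => {
  const { userId, noteId } = event.pathParameters ?? {};

  if (!userId || !noteId) {
    return badRequestResponse("userId and noteId are required");
  }

  try {
    const result = await db
      .get({
        TableName: process.env.tableName as string,
        Key: { userId, noteId },
      })
      .promise();

    return successResponse(result.Item as Note | undefined);
  } catch (err) {
    return internalErrorResponse("unable to get note");
  }
};
```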
GetAll
src/notes/getAll.ts
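A sketch, assuming the GET /notes/{userId} route:

```ts
import { APIGatewayProxyHandlerV2 } from "aws-lambda";
import { DynamoDB } from "aws-sdk";
import { Note } from "./Note";
import { successResponse } from "../responses/successResponse";
import { badRequestResponse } from "../responses/badRequestResponse";
import { internalErrorResponse } from "../responses/internalErrorResponse";

const db = new DynamoDB.DocumentClient();

export const handler: APIGatewayProxyHandlerV2 = async (event) => {
  const userId = event.pathParameters?.userId;

  if (!userId) {
    return badRequestResponse("userId is required");
  }

  try {
    const result = await db
      .query({
        TableName: process.env.tableName as string,
        KeyConditionExpression: "userId = :userId",
        ExpressionAttributeValues: { ":userId": userId },
      })
      .promise();

    return successResponse((result.Items ?? []) as Note[]);
  } catch (err) {
    return internalErrorResponse("unable to get notes");
  }
};
```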
Testing
Once we’ve got all the above completed, we can actually test our endpoints and create and read back data
create:
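For example (a sketch - $API_URL stands in for the endpoint that npm start prints out, and the body matches the Note fields assumed above):

```sh
curl -X POST "$API_URL/notes" \
  -H "Content-Type: application/json" \
  -d '{"userId": "user-1", "content": "my first note"}'
```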
Which responds with:
get:
getAll:
Creating Notes Using a Queue
When working with microservices, a common pattern is to use a message queue for any operations that can happen asynchronously. We can create an SQS queue which we can use to stage messages and then save them separately, at a rate that we’re able to process them
In order to implement this kind of logic we’re going to break up our create data flow - at the moment it’s this:
We’re going to turn it into this:
This kind of pattern becomes especially useful if we’re doing a lot more with the data than just the single DB operation, and it also allows us to retry operations like saving to the DB if we run into errors
A more complex data flow could look something like this (not what we’re implementing):
Create Queue
SST provides us with the sst.Queue class that we can use for this purpose
To create a Queue you can use the following in the stack:
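A sketch using the SST 0.x API - the consumer handler path is the one we implement below:

```ts
// queue whose consumer function saves notes to the table
const queue = new sst.Queue(this, "NotesQueue", {
  consumer: "src/consumers/createNote.handler",
});

// give the queue's consumer permission to access the table
queue.attachPermissions([table]);

// add the tableName environment variable to the queue's consumerFunction
queue.consumerFunction?.addEnvironment(
  "tableName",
  table.dynamodbTable.tableName
);
```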
The above code does the following:
Create a queue
Give the queue permission to access the table
Add the tableName environment variable to the queue’s consumerFunction
We will also need to grant the API permission to access the queue so that our create handler is able to add messages to it:
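Something like the following - this also assumes the queue's URL is exposed to the create function as a queueUrl environment variable (for example alongside tableName in the Api's defaultFunctionProps), which the updated handler checks for below:

```ts
// allow the API's functions (specifically create.handler) to send messages to the queue
api.attachPermissions([queue]);
```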
Which means our Stack now looks like this:
lib/MyStack.ts
Update the Create Handler
Since we plan to create notes via a queue, we will update our create function in the handler to create a new message in the queue instead. This is done using the SQS class from aws-sdk:
src/notes/create.ts
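```ts
import { SQS } from "aws-sdk";

// SQS client used to enqueue new notes
const queue = new SQS();
```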
Once we’ve got our instance, the create function is implemented using the queue.sendMessage function:
src/notes/create.ts
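A sketch - queueUrl comes from the environment we configured in the stack, and the Note type is the one already imported in this file:

```ts
// enqueue the note rather than writing it to the table directly
const create = async (note: Note): Promise<Note> => {
  await queue
    .sendMessage({
      QueueUrl: process.env.queueUrl as string,
      MessageBody: JSON.stringify(note),
    })
    .promise();

  return note;
};
```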
Lastly, our handler remains mostly the same with the exception of some additional validation to check that we have the queue connection information in the environment:
src/notes/create.ts
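For example, something like this inside the handler before we try to enqueue anything:

```ts
// fail fast if the queue connection information isn't configured
if (!process.env.queueUrl) {
  return internalErrorResponse("queue connection information is missing");
}
```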
Implementing the above into the create handler means that our create.ts file now looks like this:
src/notes/create.ts
Add Queue-Based Create Handler
Now that we’ve updated our logic to save the notes into the queue, we need to add the logic for the src/consumers/createNote.handler consumer function we specified above. This handler will be sent an SQSEvent and will make use of the DynamoDB table we gave it permission to use
First, we take the create function that was previously in the create.ts file for saving to the DB:
src/consumers/createNote.ts
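A sketch of that function in its new home, with its own imports and DB client:

```ts
import { DynamoDB } from "aws-sdk";
import { Note } from "../notes/Note";

const db = new DynamoDB.DocumentClient();

// same as before: write the Note to the table named in the environment
const create = async (note: Note): Promise<Note> => {
  await db
    .put({ TableName: process.env.tableName as string, Item: note })
    .promise();

  return note;
};
```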
We’ll also need a function for parsing the SQSRecord object into a Note:
src/consumers/createNote.ts
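Since the message body is just the JSON we passed to sendMessage, this can be a small helper (using the Note type imported above):

```ts
import { SQSRecord } from "aws-lambda";

// each SQSRecord body is the JSON string we enqueued from the create handler
const parseNote = (record: SQSRecord): Note => JSON.parse(record.body) as Note;
```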
And finally we consume the above through the handler. In the code below we iterate over the event.Records array because the SQSEvent groups incoming messages into it - we can also configure batching on our queue so that the handler is only triggered after n events instead of each time, and although that isn’t happening in our case, the handler should still account for it:
src/consumers/createNote.ts
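A sketch of the handler, using the create and parseNote functions defined above in the same file:

```ts
import { SQSHandler } from "aws-lambda";

// the event may contain a batch of records, so save each one
export const handler: SQSHandler = async (event) => {
  await Promise.all(
    event.Records.map(async (record) => {
      const note = parseNote(record);
      await create(note);
    })
  );
};
```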
Putting all the above together, our createNote.ts file now has the following code:
This completes the implementation of the asynchronous saving mechanism for notes. As far as a consumer of our API is concerned nothing has changed, and they will still be able to use the API exactly as we did in the Testing section above
Deploy
Thus far, we’ve just been running our API in debug mode via the npm run start command. While useful for testing, this adds a lot of code to make debugging possible and isn’t something we’d want in our final deployed code
Deploying using sst is still very easy; all we need to do is run the npm run deploy command and this will update our Lambdas to use a production build of the code instead:
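```sh
npm run deploy
```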
Teardown
Lastly, the sst CLI also provides us with a command to tear down our start/deploy resources. So once you’re done playing around you can use this to tear down all your deployed services:
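```sh
npm run remove
```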
Note that running the remove command will not delete the DB tables; you will need to do this manually