Building Serverless Apps using the Serverless Stack Framework
17 June 2021
Updated: 03 September 2023
Prior to doing any of the below, you will need your `~/.aws/credentials` file to be configured with the credentials for your AWS account.
Serverless Stack Framework
The SST Framework is built on top of the AWS CDK for working with Lambdas and other CDK constructs. It provides easy CDK setup, a streamlined debug and deploy process, and even integrates with the VSCode debugger so you can debug stacks running on AWS.
Init Project
To init a new project, use the following command:

```shell
npx create-serverless-stack@latest my-sst-app --language typescript
```

This will create a Serverless Stack application using TypeScript.
Run the App
You can run the created project using the config defined in the `sst.json` file:

```json
{
  "name": "my-sst-app",
  "stage": "dev",
  "region": "us-east-1",
  "lint": true,
  "typeCheck": true
}
```
The following command will build and then deploy a dev stack, allowing you to interact with it via AWS/browser/Postman/etc.:

```shell
npm run start
```

Running the above also starts the application with hot reloading enabled, so when you save files the corresponding AWS resources are redeployed and you can continue testing.
The Files
The application is structured like a relatively normal Lambda/CDK app, with `lib` containing the following CDK code:
Stack
lib/index.ts
```typescript
import MyStack from './MyStack'
import * as sst from '@serverless-stack/resources'

export default function main(app: sst.App): void {
  // Set default runtime for all functions
  app.setDefaultFunctionProps({
    runtime: 'nodejs12.x',
  })

  new MyStack(app, 'my-stack')

  // Add more stacks
}
```
lib/MyStack.ts
```typescript
import * as sst from '@serverless-stack/resources'

export default class MyStack extends sst.Stack {
  constructor(scope: sst.App, id: string, props?: sst.StackProps) {
    super(scope, id, props)

    // Create the HTTP API
    const api = new sst.Api(this, 'Api', {
      routes: {
        'GET /': 'src/lambda.handler',
      },
    })

    // Show API endpoint in output
    this.addOutputs({
      ApiEndpoint: api.httpApi.apiEndpoint,
    })
  }
}
```
And `src`, which contains the Lambda code:
src/lambda.ts
```typescript
import { APIGatewayProxyEventV2, APIGatewayProxyHandlerV2 } from 'aws-lambda'

export const handler: APIGatewayProxyHandlerV2 = async (
  event: APIGatewayProxyEventV2
) => {
  return {
    statusCode: 200,
    headers: { 'Content-Type': 'text/plain' },
    body: `Hello, World! Your request was received at ${event.requestContext.time}.`,
  }
}
```
Add a new Endpoint
Using the defined constructs, it’s really easy to add an additional endpoint:
src/hello.ts
```typescript
import { APIGatewayProxyEventV2, APIGatewayProxyHandlerV2 } from 'aws-lambda'

export const handler: APIGatewayProxyHandlerV2 = async (
  event: APIGatewayProxyEventV2
) => {
  const response = {
    data: 'Hello, World! This is another lambda but with JSON',
  }

  return {
    statusCode: 200,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(response),
  }
}
```
And then in the stack we just update the routes:
lib/MyStack.ts
```typescript
const api = new sst.Api(this, 'Api', {
  routes: {
    'GET /': 'src/lambda.handler',
    'GET /hello': 'src/hello.handler', // new endpoint handler
  },
})
```
So that the full stack looks like this:
lib/MyStack.ts
```typescript
import * as sst from '@serverless-stack/resources'

export default class MyStack extends sst.Stack {
  constructor(scope: sst.App, id: string, props?: sst.StackProps) {
    super(scope, id, props)

    // Create the HTTP API
    const api = new sst.Api(this, 'Api', {
      routes: {
        'GET /': 'src/lambda.handler',
        'GET /hello': 'src/hello.handler',
      },
    })

    // Show API endpoint in output
    this.addOutputs({
      ApiEndpoint: api.httpApi.apiEndpoint,
    })
  }
}
```
VSCode Debugging
SST supports VSCode debugging; all that’s required is to create a `.vscode/launch.json` file with the following content:
.vscode/launch.json
```json
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Debug SST Start",
      "type": "node",
      "request": "launch",
      "runtimeExecutable": "npm",
      "runtimeArgs": ["start"],
      "port": 9229,
      "skipFiles": ["<node_internals>/**"]
    },
    {
      "name": "Debug SST Tests",
      "type": "node",
      "request": "launch",
      "runtimeExecutable": "${workspaceRoot}/node_modules/.bin/sst",
      "args": ["test", "--runInBand", "--no-cache", "--watchAll=false"],
      "cwd": "${workspaceRoot}",
      "protocol": "inspector",
      "console": "integratedTerminal",
      "internalConsoleOptions": "neverOpen",
      "env": { "CI": "true" },
      "disableOptimisticBPs": true
    }
  ]
}
```
This allows you to run Debug SST Start, which configures the AWS resources using the `npm start` command and connects the debugger to the instance, so you can debug your functions locally as well as make use of the automated function deployment.
Add a DB
From these docs, we can define our table using the `sst.Table` class:
```typescript
const table = new sst.Table(this, 'Notes', {
  fields: {
    userId: sst.TableFieldType.STRING,
    noteId: sst.TableFieldType.STRING,
  },
  primaryIndex: {
    partitionKey: 'userId',
    sortKey: 'noteId',
  },
})
```
Next, we can add some endpoint definitions for the functions we’ll create as well as access to the table name via the environment:
```typescript
const api = new sst.Api(this, 'Api', {
  defaultFunctionProps: {
    timeout: 60, // increase timeout so we can debug
    environment: {
      tableName: table.dynamodbTable.tableName,
    },
  },
  routes: {
    // .. other routes
    'GET /notes': 'src/notes/getAll.handler', // userId in query
    'GET /notes/{noteId}': 'src/notes/get.handler', // userId in query
    'POST /notes': 'src/notes/create.handler',
  },
})
```
And lastly, we can grant our `api` permission to access the table:

```typescript
api.attachPermissions([table])
```
Adding the above to the `MyStack.ts` file results in the following:
```typescript
import * as sst from '@serverless-stack/resources'

export default class MyStack extends sst.Stack {
  constructor(scope: sst.App, id: string, props?: sst.StackProps) {
    super(scope, id, props)

    const table = new sst.Table(this, 'Notes', {
      fields: {
        userId: sst.TableFieldType.STRING,
        noteId: sst.TableFieldType.STRING,
      },
      primaryIndex: {
        partitionKey: 'userId',
        sortKey: 'noteId',
      },
    })

    // Create the HTTP API
    const api = new sst.Api(this, 'Api', {
      defaultFunctionProps: {
        timeout: 60, // increase timeout so we can debug
        environment: {
          tableName: table.dynamodbTable.tableName,
        },
      },
      routes: {
        // .. other routes
        'GET /notes': 'src/notes/getAll.handler', // userId in query
        'GET /notes/{noteId}': 'src/notes/get.handler', // userId in query
        'POST /notes': 'src/notes/create.handler',
      },
    })

    api.attachPermissions([table])

    // Show API endpoint in output
    this.addOutputs({
      ApiEndpoint: api.httpApi.apiEndpoint,
    })
  }
}
```
Before we go any further, we need to install some dependencies in our app: `uuid` for generating unique IDs for notes, and `aws-sdk` for talking to AWS services. We can install them with:

```shell
npm install uuid
npm install aws-sdk
```
Define Common Structures
We’ll also create some general helper functions for returning responses of different types. You can view their files below; these just wrap the response in a status code and headers, and stringify the body where needed.
src/responses/successResponse.ts
```typescript
const successResponse = <T>(item: T) => {
  return {
    statusCode: 200,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(item),
  }
}

export default successResponse
```
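As an example of what these helpers produce, calling `successResponse` with a sample object yields a ready-to-return API Gateway result (the helper is reproduced here so the snippet stands alone):

```typescript
// successResponse reproduced from above so this snippet is self-contained
const successResponse = <T>(item: T) => {
  return {
    statusCode: 200,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(item),
  }
}

const res = successResponse({ userId: 'USER_ID', content: 'Hello world' })
console.log(res.statusCode) // 200
console.log(res.body) // {"userId":"USER_ID","content":"Hello world"}
```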
src/responses/badRequestResponse.ts
```typescript
const badRequestResponse = (msg: string) => {
  return {
    statusCode: 400,
    headers: { 'Content-Type': 'text/plain' },
    body: msg,
  }
}

export default badRequestResponse
```
src/responses/internalErrorResponse.ts
```typescript
const internalErrorResponse = (msg: string) => {
  console.error(msg)
  return {
    statusCode: 500,
    headers: { 'Content-Type': 'text/plain' },
    body: 'internal error',
  }
}

export default internalErrorResponse
```
And we’ve also got a `Note` type, which is the data that gets stored/retrieved:
src/notes/Note.ts
```typescript
type Note = {
  userId: string
  noteId: string
  content?: string
  createdAt: number
}

export default Note
```
Access DB
Once we’ve got a DB table defined as above, we can access it to execute different queries. We create a DB client instance using:

```typescript
import { DynamoDB } from 'aws-sdk'

const db = new DynamoDB.DocumentClient()
```
Create
A `create` is the simplest of the database functions to implement; it uses the `db.put` function with the `Item` to save, which is of type `Note`:

```typescript
const create = async (tableName: string, item: Note) => {
  await db.put({ TableName: tableName, Item: item }).promise()
}
```
Get
We can implement a `getOne` function using `db.get` and providing the full `Key`, consisting of the `userId` and `noteId`:

```typescript
const getOne = async (tableName: string, noteId: string, userId: string) => {
  const result = await db
    .get({
      TableName: tableName,
      Key: {
        userId: userId,
        noteId: noteId,
      },
    })
    .promise()

  return result.Item
}
```
GetAll
We can implement a `getByUserId` function which uses `db.query`, with `ExpressionAttributeValues` populating the `KeyConditionExpression`, as seen below:

```typescript
const getByUserId = async (tableName: string, userId: string) => {
  const result = await db
    .query({
      TableName: tableName,
      KeyConditionExpression: 'userId = :userId',
      ExpressionAttributeValues: {
        ':userId': userId,
      },
    })
    .promise()

  return result.Items
}
```
Define Lambdas
Now that we know how to read and write Dynamo data, we can implement the following files for the endpoints we defined above:
Create
src/notes/create.ts
```typescript
import { APIGatewayProxyEventV2, APIGatewayProxyHandlerV2 } from 'aws-lambda'
import { DynamoDB } from 'aws-sdk'
import { v1 } from 'uuid'
import internalErrorResponse from '../responses/internalErrorResponse'
import successResponse from '../responses/successResponse'
import badRequestResponse from '../responses/badRequestResponse'
import Note from './Note'

const db = new DynamoDB.DocumentClient()

const toItem = (data: string, content: string): Note => {
  return {
    userId: data,
    noteId: v1(),
    content: content,
    createdAt: Date.now(),
  }
}

const parseBody = (event: APIGatewayProxyEventV2) => {
  const data = JSON.parse(event.body || '{}')

  return {
    userId: data.userId,
    content: data.content,
  }
}

const isValid = (data: Partial<Note>) =>
  typeof data.userId !== 'undefined' && typeof data.content !== 'undefined'

const create = async (tableName: string, item: Note) => {
  await db.put({ TableName: tableName, Item: item }).promise()
}

export const handler: APIGatewayProxyHandlerV2 = async (
  event: APIGatewayProxyEventV2
) => {
  if (typeof process.env.tableName === 'undefined')
    return internalErrorResponse('tableName is undefined')

  const tableName = process.env.tableName
  const data = parseBody(event)

  if (!isValid(data))
    return badRequestResponse('userId and content are required')

  const item = toItem(data.userId, data.content)
  await create(tableName, item)

  return successResponse(item)
}
```
Get
src/notes/get.ts
```typescript
import { APIGatewayProxyEventV2, APIGatewayProxyHandlerV2 } from 'aws-lambda'
import { DynamoDB } from 'aws-sdk'
import badRequestResponse from '../responses/badRequestResponse'
import internalErrorResponse from '../responses/internalErrorResponse'
import successResponse from '../responses/successResponse'

type RequestParams = {
  noteId?: string
  userId?: string
}

const db = new DynamoDB.DocumentClient()

const parseBody = (event: APIGatewayProxyEventV2): RequestParams => {
  const pathData = event.pathParameters
  const queryData = event.queryStringParameters

  return {
    noteId: pathData?.noteId,
    userId: queryData?.userId,
  }
}

const isValid = (data: RequestParams) =>
  typeof data.noteId !== 'undefined' && typeof data.userId !== 'undefined'

const getOne = async (tableName: string, noteId: string, userId: string) => {
  const result = await db
    .get({
      TableName: tableName,
      Key: {
        userId: userId,
        noteId: noteId,
      },
    })
    .promise()

  return result.Item
}

export const handler: APIGatewayProxyHandlerV2 = async (
  event: APIGatewayProxyEventV2
) => {
  const data = parseBody(event)

  if (typeof process.env.tableName === 'undefined')
    return internalErrorResponse('tableName is undefined')

  const tableName = process.env.tableName

  if (!isValid(data))
    return badRequestResponse(
      'noteId is required in path, userId is required in query'
    )

  const items = await getOne(
    tableName,
    data.noteId as string,
    data.userId as string
  )

  return successResponse(items)
}
```
GetAll
src/notes/getAll.ts
```typescript
import { APIGatewayProxyEventV2, APIGatewayProxyHandlerV2 } from 'aws-lambda'
import { DynamoDB } from 'aws-sdk'
import badRequestResponse from '../responses/badRequestResponse'
import internalErrorResponse from '../responses/internalErrorResponse'
import successResponse from '../responses/successResponse'

type PathParams = {
  userId?: string
}

const db = new DynamoDB.DocumentClient()

const parseBody = (event: APIGatewayProxyEventV2): PathParams => {
  const data = event.queryStringParameters

  return {
    userId: data?.userId,
  }
}

const isValid = (data: PathParams) => typeof data.userId !== 'undefined'

const getByUserId = async (tableName: string, userId: string) => {
  const result = await db
    .query({
      TableName: tableName,
      KeyConditionExpression: 'userId = :userId',
      ExpressionAttributeValues: {
        ':userId': userId,
      },
    })
    .promise()

  return result.Items
}

export const handler: APIGatewayProxyHandlerV2 = async (
  event: APIGatewayProxyEventV2
) => {
  const data = parseBody(event)

  if (typeof process.env.tableName === 'undefined')
    return internalErrorResponse('tableName is undefined')

  const tableName = process.env.tableName

  if (!isValid(data)) return badRequestResponse('userId is required in query')

  const items = await getByUserId(tableName, data.userId as string)

  return successResponse(items)
}
```
Testing
Once we’ve got all of the above completed, we can test our endpoints by creating and reading back data.

`create`:
```text
POST https://AWS_ENDPOINT_HERE/notes

{
  "userId": "USER_ID",
  "content": "Hello world"
}
```
Which responds with:
```text
200

{
  "content": "Hello world",
  "createdAt": 1619177078298,
  "noteId": "NOTE_ID_UUID",
  "userId": "USER_ID"
}
```
`get`:
```text
GET https://AWS_ENDPOINT_HERE/notes/NOTE_ID_UUID?userId=USER_ID
```
```text
200

{
  "content": "Hello world",
  "createdAt": 1619177078298,
  "noteId": "NOTE_ID_UUID",
  "userId": "USER_ID"
}
```
`getAll`:

```text
GET https://AWS_ENDPOINT_HERE/notes?userId=USER_ID
```
```text
200

[
  {
    "content": "Hello world",
    "createdAt": 1619177078298,
    "noteId": "NOTE_ID_UUID",
    "userId": "USER_ID"
  }
]
```
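If you prefer the command line, the same requests can be made with curl; the endpoint and IDs below are placeholders for your own values:

```shell
# Placeholder endpoint and IDs; substitute the ApiEndpoint output from your deploy
curl -X POST "https://AWS_ENDPOINT_HERE/notes" \
  -H "Content-Type: application/json" \
  -d '{"userId": "USER_ID", "content": "Hello world"}'

curl "https://AWS_ENDPOINT_HERE/notes/NOTE_ID_UUID?userId=USER_ID"

curl "https://AWS_ENDPOINT_HERE/notes?userId=USER_ID"
```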
Creating Notes Using a Queue
When working with microservices, a common pattern is to use a message queue for any operations that can happen asynchronously. We can create an SQS queue to stage messages, then save them separately at a rate we’re able to process.

To implement this, we’re going to break up our `create` data flow. At the moment it’s this:
```text
lambda -> dynamo
return <-
```
We’re going to turn it into this:
```text
lambda1 -> sqs
  return <-

sqs -> lambda2 -> dynamo
```
This kind of pattern becomes especially useful if we’re doing a lot more with the data than just the single DB operation, and it also allows us to retry operations like saving to the DB if we hit errors.
A more complex data flow could look something like this (not what we’re implementing):
```text
lambda1 -> sqs
  return <-

sqs -> lambda2 -> dynamo // save to db
    -> lambda3 -> s3     // generate a report
sqs <-

sqs -> lambda4 // send an email
```
Create Queue
SST provides us with the `sst.Queue` class for this purpose. To create a queue, add the following to the stack:
```typescript
const queue = new sst.Queue(this, 'NotesQueue', {
  consumer: 'src/consumers/createNote.handler',
})

queue.attachPermissions([table])
queue.consumerFunction?.addEnvironment(
  'tableName',
  table.dynamodbTable.tableName
)
```
The above code does the following:
- Create a `queue`
- Give the queue permission to access the `table`
- Add the `tableName` environment variable to the `queue`’s `consumerFunction`
We will also need to grant the API permission to access the `queue` so that our `create` handler is able to add messages to it:

```typescript
api.attachPermissions([table, queue])
```
Which means our Stack now looks like this:
lib/MyStack.ts
```typescript
import * as sst from '@serverless-stack/resources'

export default class MyStack extends sst.Stack {
  constructor(scope: sst.App, id: string, props?: sst.StackProps) {
    super(scope, id, props)

    const table = new sst.Table(this, 'Notes', {
      fields: {
        userId: sst.TableFieldType.STRING,
        noteId: sst.TableFieldType.STRING,
      },
      primaryIndex: {
        partitionKey: 'userId',
        sortKey: 'noteId',
      },
    })

    const queue = new sst.Queue(this, 'NotesQueue', {
      consumer: 'src/consumers/createNote.handler',
    })

    queue.attachPermissions([table])
    queue.consumerFunction?.addEnvironment(
      'tableName',
      table.dynamodbTable.tableName
    )

    // Create the HTTP API
    const api = new sst.Api(this, 'Api', {
      defaultFunctionProps: {
        timeout: 60, // increase timeout so we can debug
        environment: {
          tableName: table.dynamodbTable.tableName,
          queueUrl: queue.sqsQueue.queueUrl,
        },
      },
      routes: {
        'GET /': 'src/lambda.handler',
        'GET /hello': 'src/hello.handler',
        'GET /notes': 'src/notes/getAll.handler',
        'POST /notes': 'src/notes/create.handler',
        'GET /notes/{noteId}': 'src/notes/get.handler',
      },
    })

    api.attachPermissions([table, queue])

    // Show API endpoint in output
    this.addOutputs({
      ApiEndpoint: api.httpApi.apiEndpoint,
    })
  }
}
```
Update the Create Handler
Since we plan to create notes via a queue, we will update the `create` function in the handler to create a new message in the `queue`. This is done using the `SQS` class from `aws-sdk`:
src/notes/create.ts
```typescript
import { SQS } from 'aws-sdk'

const queue = new SQS()
```
Once we’ve got our instance, the `create` function uses the `queue.sendMessage` function:
src/notes/create.ts
```typescript
const create = async (queueUrl: string, item: Note) => {
  return await queue
    .sendMessage({
      QueueUrl: queueUrl,
      DelaySeconds: 0,
      MessageBody: JSON.stringify(item),
    })
    .promise()
}
```
Lastly, our `handler` remains mostly the same, with the exception of some additional validation to check that we have the `queue` connection information in the environment:
src/notes/create.ts
```typescript
export const handler: APIGatewayProxyHandlerV2 = async (
  event: APIGatewayProxyEventV2
) => {
  // pre-save validation
  if (typeof process.env.queueUrl === 'undefined')
    return internalErrorResponse('queueUrl is undefined')

  const queueUrl = process.env.queueUrl

  const data = parseBody(event)

  if (!isValid(data))
    return badRequestResponse('userId and content are required')

  // save process
  const item = toItem(data.userId, data.content)
  const createResult = await create(queueUrl, item)

  if (!createResult.MessageId)
    return internalErrorResponse('MessageId is undefined')

  return successResponse(item)
}
```
Implementing the above in the `create` handler means that our `create.ts` file now looks like this:
src/notes/create.ts
```typescript
import { APIGatewayProxyEventV2, APIGatewayProxyHandlerV2 } from 'aws-lambda'
import { v1 } from 'uuid'
import internalErrorResponse from '../responses/internalErrorResponse'
import successResponse from '../responses/successResponse'
import badRequestResponse from '../responses/badRequestResponse'
import Note from './Note'
import { SQS } from 'aws-sdk'

const queue = new SQS()

// helper functions start

const toItem = (data: string, content: string): Note => {
  return {
    userId: data,
    noteId: v1(),
    content: content,
    createdAt: Date.now(),
  }
}

const parseBody = (event: APIGatewayProxyEventV2) => {
  const data = JSON.parse(event.body || '{}')

  return {
    userId: data.userId,
    content: data.content,
  }
}

const isValid = (data: Partial<Note>) =>
  typeof data.userId !== 'undefined' && typeof data.content !== 'undefined'

// helper functions end

const create = async (queueUrl: string, item: Note) => {
  return await queue
    .sendMessage({
      QueueUrl: queueUrl,
      DelaySeconds: 0,
      MessageBody: JSON.stringify(item),
    })
    .promise()
}

export const handler: APIGatewayProxyHandlerV2 = async (
  event: APIGatewayProxyEventV2
) => {
  // pre-save validation
  if (typeof process.env.queueUrl === 'undefined')
    return internalErrorResponse('queueUrl is undefined')

  const queueUrl = process.env.queueUrl

  const data = parseBody(event)

  if (!isValid(data))
    return badRequestResponse('userId and content are required')

  // save process
  const item = toItem(data.userId, data.content)
  const createResult = await create(queueUrl, item)

  if (!createResult.MessageId)
    return internalErrorResponse('MessageId is undefined')

  return successResponse(item)
}
```
Add Queue-Based Create Handler
Now that we’ve updated our logic to put notes onto the `queue`, we need to add the logic for the `src/consumers/createNote.handler` consumer function we specified above. This handler receives an `SQSEvent` and uses the DynamoDB table we gave it permission to access.
First, we take the `create` function that was previously in the `create.ts` file for saving to the DB:
src/consumers/createNote.ts
```typescript
import { DynamoDB } from 'aws-sdk'

const db = new DynamoDB.DocumentClient()

const create = async (tableName: string, item: Note) => {
  const createResult = await db
    .put({ TableName: tableName, Item: item })
    .promise()
  if (!createResult) throw new Error('create failed')

  return createResult
}
```
We’ll also need a function for parsing the `SQSRecord` body into a `Note`:
src/consumers/createNote.ts
```typescript
const parseBody = (record: SQSRecord): Note => {
  const { noteId, userId, content, createdAt } = JSON.parse(record.body) as Note

  // do this to ensure we only extract information we need
  return {
    noteId,
    userId,
    content,
    createdAt,
  }
}
```
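To illustrate why the destructure-and-rebuild step matters, here is a standalone sketch (with the `Note` type inlined and a hypothetical `isAdmin` field standing in for unwanted input) showing that extra fields smuggled into the message body are dropped:

```typescript
// Inlined Note type and parse step so this snippet runs on its own
type Note = {
  userId: string
  noteId: string
  content?: string
  createdAt: number
}

// A message body carrying an extra field a caller shouldn't be able to inject
const body = JSON.stringify({
  userId: 'USER_ID',
  noteId: 'NOTE_ID_UUID',
  content: 'Hello world',
  createdAt: 1619177078298,
  isAdmin: true, // hypothetical junk field, not part of Note
})

const { noteId, userId, content, createdAt } = JSON.parse(body) as Note
const item: Note = { noteId, userId, content, createdAt }

console.log('isAdmin' in item) // false
```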
And finally, we consume the above in the `handler`. In the code below we iterate over the `event.Records` array because an `SQSEvent` can carry multiple records: a queue can be configured to batch messages so that the handler is only triggered after `n` events instead of each time. Although we aren’t batching in our case, the handler should still account for it:
src/consumers/createNote.ts
```typescript
export const handler: SQSHandler = async (event) => {
  // pre-save environment check
  if (typeof process.env.tableName === 'undefined')
    throw new Error('tableName is undefined')

  const tableName = process.env.tableName

  for (let i = 0; i < event.Records.length; i++) {
    const r = event.Records[i]
    const item = parseBody(r)
    console.log(item)

    const result = await create(tableName, item)
    console.log(result)
  }
}
```
Putting all of the above together, our `createNote.ts` file now has the following code:
```typescript
import { SQSHandler, SQSRecord } from 'aws-lambda'
import Note from '../notes/Note'
import { DynamoDB } from 'aws-sdk'

const db = new DynamoDB.DocumentClient()

const create = async (tableName: string, item: Note) => {
  const createResult = await db
    .put({ TableName: tableName, Item: item })
    .promise()
  if (!createResult) throw new Error('create failed')

  return createResult
}

const parseBody = (record: SQSRecord): Note => {
  const { noteId, userId, content, createdAt } = JSON.parse(record.body) as Note

  // do this to ensure we only extract information we need
  return {
    noteId,
    userId,
    content,
    createdAt,
  }
}

export const handler: SQSHandler = async (event) => {
  if (typeof process.env.tableName === 'undefined')
    throw new Error('tableName is undefined')

  const tableName = process.env.tableName

  for (let i = 0; i < event.Records.length; i++) {
    const r = event.Records[i]
    const item = parseBody(r)
    console.log(item)

    const result = await create(tableName, item)
    console.log(result)
  }
}
```
This completes the implementation of the asynchronous saving mechanism for notes. As far as a consumer of our API is concerned, nothing has changed; they can still use the API exactly as in the Testing section above.
Deploy
Thus far, we’ve just been running our API in `debug` mode via the `npm run start` command. While useful for testing, this adds a lot of code to make debugging possible, which isn’t something we’d want in our final deployed code.

Deploying using `sst` is still very easy: run the `npm run deploy` command, which will update our Lambdas to use a production build of the code instead:

```shell
npm run deploy
```
Teardown
Lastly, the `sst` CLI also provides a way to tear down our `start`/`deploy` resources. Once you’re done playing around, you can remove all your deployed services with:

```shell
npm run remove
```
Note that running the `remove` command will not delete the DB tables; you will need to do this manually.
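One way to delete the table manually is via the AWS CLI; the sketch below assumes a generated table name, so list your tables first to find the real one:

```shell
# Find the generated table name (SST prefixes it with the stage and app name)
aws dynamodb list-tables

# Hypothetical table name; substitute the one from the listing above
aws dynamodb delete-table --table-name dev-my-sst-app-Notes
```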