Many of you probably have already heard about the Infrastructure as Code (IaC) process that employs a declarative approach to manage and provision system infrastructure via machine-readable definition files.
Now you can go to your project folder and find the generated SAM template. Application Composer even defined a DynamoDBCrudPolicy execution policy for my Lambda functions, narrowed down to the ToDosTable resource (least-privilege principle, how cool is that?). However, I still recommend manually changing it to DynamoDBReadPolicy for our GetToDos function, since that function only ever reads from the table.
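For reference, the tightened function definition in the SAM template might look roughly like the fragment below. The resource and table names come from the example above; the handler path and runtime are placeholders, so treat this as a sketch rather than the exact generated output:

```yaml
GetToDosFunction:
  Type: AWS::Serverless::Function
  Properties:
    Handler: src/get-todos.handler   # placeholder path
    Runtime: nodejs18.x
    Environment:
      Variables:
        TABLE_NAME: !Ref ToDosTable
    Policies:
      # Read-only access is enough for a function that only scans the table
      - DynamoDBReadPolicy:
          TableName: !Ref ToDosTable
```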
Let’s define our DynamoDB mapper first (I used the @aws/dynamodb-data-mapper library for this):
const {
  DynamoDbSchema,
  DynamoDbTable,
  DataMapper,
} = require("@aws/dynamodb-data-mapper");

class ToDoItem {
  get [DynamoDbTable]() {
    return process.env.TABLE_NAME; // Table name will be passed via environment variables
  }

  get [DynamoDbSchema]() {
    return {
      Id: {
        type: "String",
        keyType: "HASH",
      },
      Title: { type: "String" },
      CreatedAt: { type: "Number" },
      ModifiedAt: { type: "Number" },
      CompletedAt: { type: "Number" },
    };
  }
}

class ToDoItemMapper {
  constructor(client) {
    this.mapper = new DataMapper({
      client, // the SDK client used to execute operations
    });
  }

  scan() {
    return this.mapper.scan(ToDoItem);
  }

  getById(id) {
    const item = new ToDoItem();
    item.Id = id;
    return this.mapper.get(item);
  }

  put(item) {
    return this.mapper.put(item);
  }

  update(item) {
    return this.mapper.update(item);
  }
}
const mapper = new ToDoItemMapper(ddbClient);

exports.handler = async () => {
  const iterator = mapper.scan();
  const todoItems = [];
  for await (const record of iterator) {
    todoItems.push(transform(record));
  }
  return {
    statusCode: 200,
    body: JSON.stringify(todoItems),
  };
};
const mapper = new ToDoItemMapper(ddbClient);

const createToDo = async ({ title }) => {
  if (!title) {
    throw new Error("InvalidParameterException: title attribute is required");
  }
  const item = new ToDoItem();
  const now = Date.now();
  item.Id = uuid.v4();
  item.Title = title;
  item.CreatedAt = now;
  item.ModifiedAt = now;
  const persisted = await mapper.put(item);
  return transformToModel(persisted);
};

const updateToDo = async (item) => {
  if (!item.id) {
    throw new Error("InvalidParameterException: id attribute is required");
  }
  const itemToUpdate = await mapper.getById(item.id);
  itemToUpdate.ModifiedAt = Date.now();
  itemToUpdate.Title = item.title;
  itemToUpdate.CompletedAt = item.isCompleted === true ? Date.now() : undefined;
  const persisted = await mapper.put(itemToUpdate);
  return transformToModel(persisted);
};
exports.handler = async (event) => {
  if (event.requestContext.httpMethod === "POST") {
    const newItem = await createToDo(JSON.parse(event.body));
    return {
      statusCode: 200,
      body: JSON.stringify(newItem),
    };
  }
  if (event.requestContext.httpMethod === "PUT") {
    const id = event.pathParameters.id;
    const requestPayload = JSON.parse(event.body);
    const updatedItem = await updateToDo({ ...requestPayload, id });
    return {
      statusCode: 200,
      body: JSON.stringify(updatedItem),
    };
  }
  return {
    statusCode: 405,
    body: "Method not supported",
  };
};
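The transform / transformToModel helpers are not shown in the snippets above. A minimal sketch, assuming they simply map the stored item to the API response shape seen later in the curl output, could be:

```javascript
// Hypothetical helper: maps a persisted ToDoItem (DynamoDB schema field
// names: Id, Title, CompletedAt) to the API response model. An item is
// considered completed once CompletedAt has been set.
const transformToModel = (item) => ({
  id: item.Id,
  title: item.Title,
  isCompleted: item.CompletedAt !== undefined,
});
```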
You can find the full example on my GitHub.
Now let’s try to run our application locally before deploying it to production. SAM already comes with a start-api command that starts a local API Gateway instance routing requests to local Lambda runtimes. However, we need to persist our data somewhere. The simplest solution would be to connect our local Lambdas to DynamoDB running in the cloud (for instance, if you have a staging environment that replicates production). But for our example, let's assume we don't have any environment set up yet and run an in-memory DynamoDB database locally:
docker run -p 8000:8000 amazon/dynamodb-local
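Note that DynamoDB Local starts empty, so the table has to be created before the first request. One way to do it is with the AWS CLI; the table name and key definition below follow the schema used in this example:

```shell
# Create the ToDosTable with the string hash key "Id" from the mapper schema
aws dynamodb create-table \
  --endpoint-url http://localhost:8000 \
  --table-name ToDosTable \
  --attribute-definitions AttributeName=Id,AttributeType=S \
  --key-schema AttributeName=Id,KeyType=HASH \
  --billing-mode PAY_PER_REQUEST
```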
And here comes the first challenge: SAM also uses Docker to run the local API Gateway and Lambda functions, and inside a Lambda container localhost refers to that container itself, not to the host machine where DynamoDB Local is listening. Any request to http://localhost:8000 from within a Lambda function will therefore fail. The fix is to create a shared Docker network and attach both DynamoDB Local and the SAM containers to it:
docker network create sam-demo-net
docker run -p 8000:8000 --network sam-demo-net --name ddblocal amazon/dynamodb-local
sam local start-api --env-vars json/env.json --docker-network sam-demo-net
Now we can use Docker’s service discovery feature and access the DynamoDB Local endpoint using the container name (ddblocal):
const ddbClient = new DynamoDb({
  ...(process.env.AWS_SAM_LOCAL === "true"
    ? { endpoint: "http://ddblocal:8000" }
    : {}),
});
curl -X POST -d '{"title":"test ToDo"}' http://127.0.0.1:3000/todos
{"id":"25962e09-7f16-4ab9-ac88-64f8c4a20710","title":"test ToDo","isCompleted":false}

curl http://127.0.0.1:3000/todos
[{"id":"25962e09-7f16-4ab9-ac88-64f8c4a20710","title":"test ToDo","isCompleted":false}]

curl -X PUT -d '{"title":"test ToDo (completed)", "isCompleted": true}' http://127.0.0.1:3000/todos/25962e09-7f16-4ab9-ac88-64f8c4a20710
{"id":"25962e09-7f16-4ab9-ac88-64f8c4a20710","title":"test ToDo (completed)","isCompleted":true}
sam deploy --guided
The --guided flag will launch a wizard that helps you configure deployment options (AWS CloudFormation stack name, AWS region, etc.). Once you complete this wizard the very first time, you will be offered to save the deployment setup and reuse it in upcoming deployments.
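The saved settings end up in a samconfig.toml file at the project root. A typical result looks roughly like the fragment below (stack name and region are placeholders):

```toml
version = 0.1

[default.deploy.parameters]
stack_name = "sam-todo-demo"      # placeholder stack name
region = "us-east-1"              # placeholder region
capabilities = "CAPABILITY_IAM"
confirm_changeset = true
```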
Originally published on January 4, 2023.