When developing a web application, you have to think about where to store your data, how to back it up, the types of data you want to store (such as images, music, and videos), application hosting, data archiving, and disaster recovery. AWS Simple Storage Service (S3) provides a solution for all of these cases. S3 is one of the core services of the AWS cloud infrastructure. It's object storage that acts much like a regular file system on your personal computer. S3 scales virtually infinitely, with no limit on the amount of data you store.
In this tutorial, we'll learn how to use AWS S3. First, we'll cover what S3 is and its core parts: buckets, access points, and objects. Then we'll get to practice by using the AWS SDK for Node.js. Finally, we'll provide a cheat sheet of AWS S3 CLI commands.
Let's start with the core concepts of S3: buckets, access points, and objects.
To upload your data to S3, you must create an S3 bucket in one of the AWS Regions; within one bucket, you can upload many objects. In implementation terms, buckets and objects are resources, and S3 provides APIs for you to manage them. There are several ways to create buckets:
Amazon S3 Console
Follow the guide in the AWS documentation to create your first bucket with the S3 console.
REST API
To create buckets using the REST API, you must authenticate your requests; follow the instructions in the S3 API reference. However, it's recommended to use the AWS Management Console or AWS SDKs instead.
AWS SDK
To create buckets with the SDK, you first create a client and then use that client to send a request to create a bucket. Note: when creating the client and the bucket, use the same region. Here is a demonstration of creating and using AWS S3 buckets.
To access the data that you store on S3, you need an S3 Access Point. Access points are endpoints attached to buckets that you use to perform S3 object operations.
Each access point has distinct permissions and network controls that S3 applies to any request made through that access point. Access points are used to perform operations on objects, but not on buckets. See the AWS documentation to learn how to manage data access with S3 access points.
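As a small illustration (the region, account ID, and access point name below are made-up placeholders, not real resources), requests through an access point identify it by its ARN, which follows the format arn:aws:s3:&lt;region&gt;:&lt;account-id&gt;:accesspoint/&lt;name&gt;:

```javascript
// Build the ARN for an S3 access point. The account ID and name used
// in the example call are placeholders for illustration only.
function accessPointArn(region, accountId, name) {
  return "arn:aws:s3:" + region + ":" + accountId + ":accesspoint/" + name;
}

var arn = accessPointArn("us-east-1", "123456789012", "my-access-point");
// With the AWS SDK, an access point ARN can be passed where a bucket
// name is expected for object operations, e.g.:
//   s3.getObject({ Bucket: arn, Key: "index.txt" }, callback)
console.log(arn);
```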
We'll use the AWS SDK with Node.js to create S3 buckets, upload an object to a specified bucket, and delete that bucket afterward. We'll also provide a How-To on S3 section where you can learn commands for different S3 use cases.
To continue, you must install the AWS SDK for Node.js:
npm install aws-sdk --save
// Load the AWS SDK for Node.js
var AWS = require("aws-sdk");
// Set the region
AWS.config.update({ region: "us-east-1" });
// Create S3 service object
var s3 = new AWS.S3({ apiVersion: "2006-03-01" });
// Create the parameters for calling createBucket -- with this part we'll take the bucket name we'll create
var bucketParams = {
Bucket: process.argv[2],
};
// Call S3 to create the bucket
s3.createBucket(bucketParams, function(err, data) {
err ? console.log("Error", err) : console.log("Success", data.Location);
});
node createBucket.js webiny-s3-bucket-testing
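Bucket creation fails if the name breaks S3's naming rules: 3 to 63 characters, lowercase letters, numbers, hyphens, and dots, and it must start and end with a letter or number. A quick client-side check like the sketch below (our own helper, not part of the SDK, and it doesn't cover every rule, such as names that look like IP addresses) can catch obvious mistakes before the API call:

```javascript
// Rough client-side check against the core S3 bucket naming rules.
// This is a convenience sketch, not an official SDK validator.
function isValidBucketName(name) {
  // 3-63 chars; lowercase letters, digits, dots, hyphens;
  // must start and end with a letter or digit.
  return /^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$/.test(name);
}
```

For example, isValidBucketName("webiny-s3-bucket-testing") returns true, while a name with uppercase or underscores is rejected.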
// Load the AWS SDK for Node.js
var AWS = require("aws-sdk");
// Set the region
AWS.config.update({ region: "us-east-1" });
// Create S3 service object
var s3 = new AWS.S3({ apiVersion: "2006-03-01" });
// Set up the parameters for uploading a file to the specified bucket
var uploadParams = { Bucket: process.argv[2], Key: "", Body: "" };
var file = process.argv[3];
// Configure the file stream and obtain the upload parameters
// The node.js file system module allows you to work (read, create, update, delete, rename files)
// with the file system on your computer.
var fs = require("fs");
var readingFile = fs.createReadStream(file);
readingFile.on("error", function(err) {
console.log("File Error", err);
});
uploadParams.Body = readingFile;
// The path module provides utilities for working with file and directory paths.
// We can access by using this:
var path = require("path");
uploadParams.Key = path.basename(file);
// Call S3 to upload the file to the specified bucket
s3.upload(uploadParams, function(err, data) {
err ? console.log("Error", err) : console.log("Upload Success!", data.Location);
});
node upload.js webiny-s3-bucket-testing index.txt
// Load the AWS SDK for Node.js
var AWS = require("aws-sdk");
// Set the region
AWS.config.update({ region: "us-east-1" });
// Create S3 service object
var s3 = new AWS.S3({ apiVersion: "2006-03-01" });
// Create the parameters for calling listObjects method
var bucketParams = {
// in here we'll provide the bucket name we created earlier
Bucket: "webiny-s3-bucket-testing",
};
// Call S3 to obtain a list of the objects in the bucket
s3.listObjects(bucketParams, function(err, data) {
err ? console.log("Error", err) : console.log("Success", data);
});
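Note that a single listObjects call returns at most 1,000 keys, so larger buckets need pagination. The sketch below keeps requesting pages with listObjectsV2's ContinuationToken until no NextContinuationToken is returned; the client is injected as a parameter so the loop itself can be exercised without AWS credentials:

```javascript
// Collect every key in a bucket by following continuation tokens.
// `s3Client` is any object exposing listObjectsV2(params).promise(),
// e.g. an AWS.S3 instance.
async function listAllKeys(s3Client, bucket) {
  var keys = [];
  var token;
  do {
    var page = await s3Client
      .listObjectsV2({ Bucket: bucket, ContinuationToken: token })
      .promise();
    (page.Contents || []).forEach(function (obj) {
      keys.push(obj.Key);
    });
    token = page.NextContinuationToken;
  } while (token);
  return keys;
}
```

Usage sketch: listAllKeys(s3, "webiny-s3-bucket-testing").then(function (keys) { console.log(keys); });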
// Load the AWS SDK for Node.js
var AWS = require("aws-sdk");
// Set the region
AWS.config.update({ region: "us-east-1" });
// Create S3 service object
var s3 = new AWS.S3({ apiVersion: "2006-03-01" });
// Create params for S3.deleteBucket
var bucketParams = {
// here you'll provide the name of the bucket you want to delete
Bucket: "webiny-s3-bucket-testing",
};
// We'll first empty the bucket
async function emptyS3Bucket(bucket) {
const listParams = {
Bucket: bucket,
// Prefix: dir,
};
const listedObjects = await s3.listObjectsV2(listParams).promise();
if (listedObjects.Contents.length === 0) return;
const deleteParams = {
Bucket: bucket,
Delete: { Objects: [] },
};
listedObjects.Contents.forEach(({ Key }) => {
deleteParams.Delete.Objects.push({ Key });
});
await s3.deleteObjects(deleteParams).promise();
if (listedObjects.IsTruncated) await emptyS3Bucket(bucket);
}
// Empty the bucket first, then delete it
emptyS3Bucket(bucketParams.Bucket).then(function() {
// Call S3 to delete the bucket
s3.deleteBucket(bucketParams, function(err, data) {
err ? console.log("Error", err) : console.log("Success", data);
});
});
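One more detail worth knowing: deleteObjects accepts at most 1,000 keys per request, so emptying a very large bucket means deleting in batches. A generic chunking helper (our own utility, not part of the SDK) makes that straightforward:

```javascript
// Split an array into chunks of at most `size` elements. S3's
// deleteObjects call caps each request at 1,000 keys, so key lists
// should be chunked before deletion.
function chunk(items, size) {
  var chunks = [];
  for (var i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}

// Usage sketch: for each batch of keys ({ Key } objects), call
//   s3.deleteObjects({ Bucket: bucket, Delete: { Objects: batch } }).promise()
```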
sudo easy_install awscli
# or
sudo pip install awscli
# or
brew install awscli
1. Download an entire AWS S3 bucket
aws s3 sync s3://<source_bucket> <local_destination>
Example:
aws s3 sync s3://mybucket .
This will download all the objects in mybucket to the current directory, and will output:
download: s3://mybucket/test.txt to test.txt
2. Are AWS S3 buckets region-specific?
The user interface shows all your buckets across all regions, but each bucket exists in a specific region, and you need to specify that region when you create the bucket.
3. How to Configure SSL for AWS S3 bucket?
3.1 You can access your bucket over HTTPS through the S3 endpoint.
3.2 If you use a custom domain for your bucket, you can use S3 and CloudFront together with your own SSL certificate (or generate a free one via AWS Certificate Manager).
4. Delete AWS S3 buckets
4.1 Remove an empty bucket:
aws s3 rb s3://bucket-name
4.2 If the bucket is not empty, delete it together with its contents by adding the --force option:
aws s3 rb s3://bucket-name --force
5. Rename AWS S3 Bucket name
5.1 There is no rename-bucket functionality in S3, so to "rename" a bucket we have to create a new one, copy every file over, and then delete the old bucket:
aws s3 mb s3://[new-bucket] # 1. Create a new bucket
aws s3 sync s3://[old-bucket] s3://[new-bucket] # 2. Copy files over
aws s3 rb --force s3://[old-bucket] # 3. Delete the old bucket
6. Quick way to list all files in AWS S3 bucket
aws s3 ls s3://bucket-name --recursive
7. AWS S3 copy files and folders between two buckets
aws s3 sync s3://DOC-EXAMPLE-BUCKET-SOURCE s3://DOC-EXAMPLE-BUCKET-TARGET
8. Is it better to have multiple s3 buckets or one bucket with subfolders?
Object Storage: also known as object-based storage, this is a strategy that manages and manipulates data storage as distinct units, called objects. There are three key components of an object: the content of the object (the data stored in the object, such as a file or directory), the unique object identifier (ID), and metadata. The metadata is stored as key-value pairs and contains information such as name, size, date, security attributes, content type, and URL. Each object has an access control list (ACL) that configures who may access the object.
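To make those three components concrete, here is a toy in-memory model (purely illustrative, unrelated to the AWS SDK) where each stored object bundles a unique ID, its content, and a metadata map:

```javascript
// Toy object store illustrating the object-storage model: each object
// carries content, a unique identifier, and key-value metadata.
function ObjectStore() {
  this.objects = {};
  this.nextId = 1;
}

// Store content with optional metadata; returns the new object's ID
ObjectStore.prototype.put = function (content, metadata) {
  var id = String(this.nextId++);
  this.objects[id] = { id: id, content: content, metadata: metadata || {} };
  return id;
};

// Retrieve an object (content + metadata) by its ID
ObjectStore.prototype.get = function (id) {
  return this.objects[id];
};
```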
Now that you've used the AWS SDK for S3, you can code the solutions that the AWS S3 Console provides in a few clicks. The console is faster, but with the SDK you can keep developing your applications against AWS services directly from code, which is a significant advantage for anyone building applications on AWS. In this tutorial, we used the AWS SDK for JavaScript in Node.js to create buckets, upload data, list data in buckets, and empty and delete buckets.
If you learned something new today and want to follow up on our blogs, subscribe to our newsletter and we'll bring you the best content of the serverless world!
Thanks for reading! My name is  and I work as a developer relations engineer at . I enjoy learning new tech and building communities around them =) If you have questions or just want to say hi, reach out to me via .
Previously published at