Uploading to and downloading from S3 in Node.js using AWS SDK

AWS S3 is one of the most frequently used AWS services. The SDK provides methods to manage buckets and objects at the application level, so we can write code that programmatically accesses S3. In this post, I'll use the upload() and getObject() methods to interact with S3.

S3 is a core service AWS offers, and as such, it’s often used in applications. AWS provides a rich interface to interact with it, both in the SDK and in the Command Line Interface (CLI).

The prerequisites for uploading objects to and downloading them from S3 are credentials for programmatic access, and to follow along with this post, the CLI should also be installed.
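If the credentials for programmatic access haven’t been set up yet, one way to configure them is the aws configure command (assuming an access key pair has already been created in IAM):

aws configure

The command prompts for the access key ID, the secret access key, the default region name and the default output format.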

1. The file

I’ll upload and download the Architecting for the Cloud - AWS Best Practices whitepaper. At a bit more than 600 kB, it’s a medium-sized file, and with its 50 pages, it’s also a good read.

To start, let’s save the pdf file to the root folder; it will be read from there when it’s uploaded to S3.

2. Create the bucket - select a unique name

As a first step, you’ll need a bucket where the files are uploaded to and downloaded from.

The bucket can be created using the CLI. The bucket name must be globally unique, so a creative name like s3-upload-and-download-suad123 can save the day (choose something else, because this one is already taken by me).

The following CLI command can be used from the terminal:

aws s3api create-bucket --bucket s3-upload-and-download-suad123 --region us-west-2 --create-bucket-configuration LocationConstraint=us-west-2

The region is us-west-2, and the --create-bucket-configuration option is also needed for any region other than us-east-1.

If the bucket name is unique, the CLI should return a confirmation response containing the name of the bucket, otherwise it will return an error message.
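For the bucket above, the confirmation should look something like this:

{
  "Location": "http://s3-upload-and-download-suad123.s3.amazonaws.com/"
}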

3. Install the SDK

First, we’ll need to install the SDK. Navigate to the root folder of the project, run npm init -y in the terminal, and then install the SDK with npm install aws-sdk.
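That is, in the terminal:

npm init -y
npm install aws-sdk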

I won’t bother creating folders here; all files will be placed in the root folder.

4. Upload the file to S3 using streams

Both the upload and the download code will be in separate modules, so they can easily be reused in application code or attached to an endpoint as a handler (a sketch of the latter is at the end of this post).

In the file called s3-upload.js, we can have the following code:

const fs = require('fs')
const path = require('path')
const AWS = require('aws-sdk')

const s3 = new AWS.S3({
  region: 'us-west-2',
})

const BUCKET_NAME = 's3-upload-and-download-suad123'
const KEY = 'AWS_Cloud_Best_Practices.pdf'

const uploadToS3 = async () => {
  const pdfStream = fs.createReadStream(path.join(__dirname, KEY))
  const params = {
    Bucket: BUCKET_NAME,
    Key: KEY,
    Body: pdfStream,
  }

  const response = await s3.upload(params).promise()
  return response
}

The region is specified at the service level. This is useful if the application works with multiple services, because each of them can be configured with a different region.
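For example (a hypothetical setup, assuming the app also used DynamoDB), each client can get its own region:

const AWS = require('aws-sdk')

const s3 = new AWS.S3({ region: 'us-west-2' })
const dynamoDb = new AWS.DynamoDB.DocumentClient({ region: 'eu-west-1' })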

Inside the uploadToS3 function, we first create a readable stream using the createReadStream method of the fs core Node module. Streams are useful when larger files are processed, and for our purposes, this pdf counts as a larger file.

The params object contains the parameters needed to upload the file to AWS. Bucket is straightforward, and Key is going to be the name of the file we want to upload (the whitepaper pdf). If we wanted to place the file inside a folder in the bucket, the folder would also be specified here, as part of the key. For example, if we wanted to save the file in the whitepaper folder, the value of the Key property would be whitepaper/AWS_Cloud_Best_Practices.pdf.
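A hypothetical variant of the params object for that case (BUCKET_NAME and pdfStream are the same as above):

const paramsWithFolder = {
  Bucket: BUCKET_NAME,
  // the "folder" is just a prefix in the object key
  Key: 'whitepaper/AWS_Cloud_Best_Practices.pdf',
  Body: pdfStream,
}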

Body is the actual file content we want to upload. The most commonly used types are stream (like in our case) or buffer, but it can also be a blob, a string or a typed array.
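As an example, a hypothetical variant of the params object that reads the whole file into a buffer first (fine for smaller files, but it loads the entire content into memory):

const params = {
  Bucket: BUCKET_NAME,
  Key: KEY,
  // read the whole file into memory instead of streaming it
  Body: fs.readFileSync(path.join(__dirname, KEY)),
}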

The name of the method is upload, and it accepts params as an argument. If we chain the promise method to upload, we can await it, given that the uploadToS3 function is marked as async.

We can now call uploadToS3:

uploadToS3()
  .then((data) => {
    console.log(data)
  }).catch((e) => {
    console.log(e)
  })

The return value should be similar to this:

{ ETag: '"285956f63ea996a14ecd6eee0bd6c357"',
  Location:
   'https://s3-upload-and-download-suad123.s3.us-west-2.amazonaws.com/AWS_Cloud_Best_Practices.pdf',
  key: 'AWS_Cloud_Best_Practices.pdf',
  Key: 'AWS_Cloud_Best_Practices.pdf',
  Bucket: 's3-upload-and-download-suad123' }

This is a good sign, but we can quickly double-check that the file exists using the list-objects command in the CLI, where the only option to specify is the name of the bucket:

aws s3api list-objects --bucket s3-upload-and-download-suad123

If the upload was successful, the response looks like this:

{
  "Contents": [
    {
      "Key": "AWS_Cloud_Best_Practices.pdf",
      "LastModified": "2019-05-27T12:13:33.000Z",
      "ETag": "\"285956f63ea996a14ecd6eee0bd6c357\"",
      "Size": 670541,
      "StorageClass": "STANDARD",
      "Owner": {
        "DisplayName": "YOUR DISPLAY NAME",
        "ID": "YOUR UNIQUE ID"
      }
    }
  ]
}

You can also check that the file is uploaded by logging in to the AWS console and navigating to S3.

5. Download the file

Downloading an existing object from S3 is easy. This is how it looks in a file called downloadObjectFromS3.js:

const fs = require('fs').promises
const path = require('path')
const AWS = require('aws-sdk')

const s3 = new AWS.S3({
  region: 'us-west-2',
})

const BUCKET_NAME = 's3-upload-and-download-suad123'
const KEY = 'AWS_Cloud_Best_Practices.pdf'

const downloadFromS3 = async () => {
  const params = {
    Bucket: BUCKET_NAME,
    Key: KEY,
  }

  // Body is a buffer that holds the content of the downloaded object
  const { Body } = await s3.getObject(params).promise()
  await fs.writeFile(path.join(__dirname, 'downloaded-file.pdf'), Body)

  return Body
}
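As with the upload, we can call the function:

downloadFromS3()
  .then(() => {
    console.log('File downloaded')
  }).catch((e) => {
    console.log(e)
  })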

The download code is very similar to the upload above. The name of the method is getObject, and we can also call the promise method on it.

The Body property of the response contains the actual downloaded data in the form of a buffer.

We can write this file to the root folder using the fs.writeFile method, where the first argument is the path ending in the name of the downloaded file, and the second argument is the buffer we received from AWS.

fs.writeFile is an asynchronous method, and by default, it comes with a callback. But in 2019, callbacks are not really popular (for good reason), and we can transform it into a Promise.

From Node v10, we can use an experimental feature of the fs module called promises. All we have to do is assign the promises property of the module to fs (as in the code above), and the asynchronous writeFile method can be awaited as is.

Alternatively, the promisify method from the util module can also be used:

const fs = require('fs') // note: the callback-based module here, not fs.promises
const util = require('util')

const writeFilePromise = util.promisify(fs.writeFile)

promisify converts a callback-based method into one that returns a promise. In this case, we have to call writeFilePromise with the same arguments:

await writeFilePromise(path.join(__dirname, 'downloaded-file.pdf'), Body)

Either way, the downloaded-file.pdf file should be created in the root folder, and it should be identical to the whitepaper we uploaded earlier.
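One quick way to verify this (assuming a Unix-like shell, and a single-part upload like ours, in which case the ETag is the MD5 hash of the file):

md5sum AWS_Cloud_Best_Practices.pdf downloaded-file.pdf

Both lines of the output should show the same hash as the ETag returned by the upload.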

6. Conclusion

Two very useful methods of the AWS SDK for S3 are upload and getObject.

The first method uploads a file to the specified bucket, and the second one downloads it from there. Both methods can be used with async/await if the SDK’s built-in promise method is chained to them.
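As mentioned above, the two modules can also back HTTP endpoints. A minimal sketch, assuming Express is installed and both modules export their functions with module.exports:

// hypothetical wiring, assuming the modules export uploadToS3 and downloadFromS3
const express = require('express')
const uploadToS3 = require('./s3-upload')
const downloadFromS3 = require('./downloadObjectFromS3')

const app = express()

app.post('/upload', async (req, res) => {
  try {
    // uploads the whitepaper from the root folder to the bucket
    const data = await uploadToS3()
    res.json(data)
  } catch (e) {
    res.status(500).json({ error: e.message })
  }
})

app.get('/download', async (req, res) => {
  try {
    // Body is the buffer received from S3
    const body = await downloadFromS3()
    res.type('application/pdf').send(body)
  } catch (e) {
    res.status(500).json({ error: e.message })
  }
})

app.listen(3000)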

Thanks for reading, and see you next time.