Upload a binary file to S3 using AWS SDK for Node.js

Tags: javascript, node.js, amazon-s3, amazon-web-services

Javascript Problem Overview


Update: For future reference, Amazon has now updated the documentation from what was there at the time of asking. As per @Loren Segal's comment below:

> We've corrected the docs in the latest preview release to document this parameter properly. Sorry about the mixup!


I'm trying out the developer preview of the AWS SDK for Node.js and want to upload a zipped tarball to S3 using putObject.

According to the documentation, the Body parameter should be...

> Body - (Base64 Encoded Data)

...therefore, I'm trying out the following code...

var AWS = require('aws-sdk'),
    fs = require('fs');

// For dev purposes only
AWS.config.update({ accessKeyId: 'key', secretAccessKey: 'secret' });

// Read in the file, convert it to base64, store to S3
fs.readFile('myarchive.tgz', function (err, data) {
  if (err) { throw err; }

  var base64data = Buffer.from(data).toString('base64');

  var s3 = new AWS.S3();
  s3.client.putObject({
    Bucket: 'mybucketname',
    Key: 'myarchive.tgz',
    Body: base64data
  }).done(function (resp) {
    console.log('Successfully uploaded package.');
  });
  
});

Whilst I can then see the file in S3, if I download it and attempt to decompress it I get an error that the file is corrupted. Therefore it seems that my method for 'base64 encoded data' is off.

Can someone please help me to upload a binary file using putObject?

Javascript Solutions


Solution 1 - Javascript

You don't need to convert the buffer to a base64 string. Just set `Body` to `data` (the raw Buffer that `fs.readFile` hands back) and it will work.

Solution 2 - Javascript

Here is a way to send a file using streams, which might be necessary for large files and will generally reduce memory overhead:

var AWS = require('aws-sdk'),
    fs = require('fs');

// For dev purposes only
AWS.config.update({ accessKeyId: 'key', secretAccessKey: 'secret' });

// Stream the file straight to S3 -- no base64 conversion needed
var fileStream = fs.createReadStream('myarchive.tgz');
fileStream.on('error', function (err) {
  throw err;
});
fileStream.on('open', function () {
  var s3 = new AWS.S3();
  s3.putObject({
    Bucket: 'mybucketname',
    Key: 'myarchive.tgz',
    Body: fileStream
  }, function (err) {
    if (err) { throw err; }
  });
});

Solution 3 - Javascript

I was able to upload my binary file this way.

var AWS = require('aws-sdk'),
    fs = require('fs');

var s3 = new AWS.S3();

var fileStream = fs.createReadStream("F:/directory/fileName.ext");
var putParams = {
  Bucket: s3bucket,
  Key: s3key,
  Body: fileStream
};
s3.putObject(putParams, function (putErr, putData) {
  if (putErr) {
    console.error(putErr);
  } else {
    console.log(putData);
  }
});

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

| Content Type | Original Author | Original Content on Stackoverflow |
| --- | --- | --- |
| Question | isNaN1247 | View Question on Stackoverflow |
| Solution 1 - Javascript | AndyD | View Answer on Stackoverflow |
| Solution 2 - Javascript | CaptEmulation | View Answer on Stackoverflow |
| Solution 3 - Javascript | shaun | View Answer on Stackoverflow |