Node.js AWS SDK S3 Generate Presigned URL

Tags: node.js, amazon-web-services, amazon-s3, aws-sdk-js

Problem Overview
I am using the NodeJS AWS SDK to generate a presigned S3 URL. The docs give an example of generating a presigned URL.
Here is my exact code (with sensitive info omitted):
const AWS = require('aws-sdk')
const s3 = new AWS.S3()

AWS.config.update({accessKeyId: 'id-omitted', secretAccessKey: 'key-omitted'})

// Tried with and without this. Since s3 is not region-specific, I don't
// think it should be necessary.
// AWS.config.update({region: 'us-west-2'})

const myBucket = 'bucket-name'
const myKey = 'file-name.pdf'
const signedUrlExpireSeconds = 60 * 5

const url = s3.getSignedUrl('getObject', {
    Bucket: myBucket,
    Key: myKey,
    Expires: signedUrlExpireSeconds
})

console.log(url)
The URL it generates looks like this:
https://bucket-name.s3-us-west-2.amazonaws.com/file-name.pdf?AWSAccessKeyId=[access-key-omitted]&Expires=1470666057&Signature=[signature-omitted]
I am copying that URL into my browser and getting the following response:
<Error>
<Code>NoSuchBucket</Code>
<Message>The specified bucket does not exist</Message>
<BucketName>[bucket-name-omitted]</BucketName>
<RequestId>D1A358D276305A5C</RequestId>
<HostId>
bz2OxmZcEM2173kXEDbKIZrlX508qSv+CVydHz3w6FFPFwC0CtaCa/TqDQYDmHQdI1oMlc07wWk=
</HostId>
</Error>
I know the bucket exists. When I navigate to this item via the AWS Web GUI and double click on it, it opens the object with URL and works just fine:
https://s3-us-west-2.amazonaws.com/[bucket-name-omitted]/[file-name-omitted].pdf?X-Amz-Date=20160808T141832Z&X-Amz-Expires=300&X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Signature=[signature-omitted]&X-Amz-Credential=ASIAJKXDBR5CW3XXF5VQ/20160808/us-west-2/s3/aws4_request&X-Amz-SignedHeaders=Host&x-amz-security-token=[really-long-key]
So I am led to believe that I must be doing something wrong with how I'm using the SDK.
node.js Solutions
Solution 1 - node.js
Dustin,

Your code is correct; double-check the following:

- Your bucket access policy.
- Your bucket permissions for your API key.
- Your API key and secret.
- Your bucket name and key.
Solution 2 - node.js
Since this question is very popular and the most popular answer says the code is correct, there is still a subtle problem in that code which can lead to a frustrating issue. So, here is working code:
const AWS = require('aws-sdk');

AWS.config.update({
    accessKeyId: ':)))',
    secretAccessKey: ':DDDD',
    region: 'ap-south-1',
    signatureVersion: 'v4'
});

const s3 = new AWS.S3();
const myBucket = ':)))))';
const myKey = ':DDDDDD';
const signedUrlExpireSeconds = 60 * 5;

const url = s3.getSignedUrl('getObject', {
    Bucket: myBucket,
    Key: myKey,
    Expires: signedUrlExpireSeconds
});

console.log(url);
> The noticeable difference is that the s3 object is created after the config update; without this, the config does not take effect and the generated URL doesn't work.
Solution 3 - node.js
Here is the complete code for generating a pre-signed (putObject) URL for any type of file in S3.

- If you want, you can include an expiration time using the Expires parameter in params.
- The code below will upload any type of file, e.g. Excel (xlsx), pdf, or jpeg.
const AWS = require('aws-sdk');
const fs = require('fs');
const axios = require('axios');
const s3 = new AWS.S3();
const filePath = 'C:/Users/XXXXXX/Downloads/invoice.pdf';
var params = {
    Bucket: 'testing-presigned-url-dev',
    Key: 'dummy.pdf',
    ContentType: 'application/octet-stream'
};

s3.getSignedUrl('putObject', params, function (err, url) {
    console.log('The URL is', url);
    fs.writeFileSync('./url.txt', url);

    axios({
        method: 'put',
        url,
        data: fs.readFileSync(filePath),
        headers: {
            'Content-Type': 'application/octet-stream'
        }
    }).then((result) => {
        console.log('result', result);
    }).catch((err) => {
        console.log('err', err);
    });
});
Solution 4 - node.js
I had a use case in Node.js where I wanted to get an object from S3, download it to a temporary location, and then pass it as an attachment to a third-party service. This is how I broke down the task:

- get a signed URL from S3
- make a REST call to get the object
- write it to a local location
It may help anyone with the same use case; check out the link below: https://medium.com/@prateekgawarle183/fetch-file-from-aws-s3-using-pre-signed-url-and-store-it-into-local-system-879194bfdcf4
Solution 5 - node.js
For me, I was getting a 403 because the IAM role I used to generate the signed URL was missing the s3:GetObject permission for the bucket/object in question. Once I added this permission to the IAM role, the signed URL began to work correctly.
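For reference, a minimal IAM policy statement granting that permission might look like this (the bucket name is a placeholder):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::bucket-name/*"
    }
  ]
}
```

Note the `/*` on the resource ARN: s3:GetObject applies to objects, not to the bucket itself.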
Solution 6 - node.js
Try this function, which wraps getSignedUrl in a promise.
const AWS = require("aws-sdk");

const s3 = new AWS.S3({
    accessKeyId: 'AK--------------6U',
    secretAccessKey: 'kz---------------------------oGp'
});

const getSignedUrlAsync = async () => {
    const params = {
        Bucket: 'bucket-name',
        Key: 'file-name.pdf',
        Expires: 60 * 5
    };

    try {
        const url = await new Promise((resolve, reject) => {
            s3.getSignedUrl('getObject', params, (err, url) => {
                err ? reject(err) : resolve(url);
            });
        });
        console.log(url);
    } catch (err) {
        console.log(err);
    }
};

getSignedUrlAsync();