Amazon S3 downloads index.html instead of serving
Amazon S3 Problem Overview
I've set up Amazon S3 to serve my static site, speakeasylinguistics.com. All of the DNS stuff seems to be working okay, because dig +recurse +trace www.speakeasylinguistics.com outputs the correct DNS info.
But when you visit the site in a browser using the endpoint, the index.html page downloads instead of being served. How do I fix this?
I've tried Chrome, Safari, FF. It happens on all of them. I used Amazon's walkthrough on hosting a custom domain to a T.
Amazon S3 Solutions
Solution 1 - Amazon S3
Running curl -I against the URL you posted gives the following result:
curl -I http://speakeasylinguistics.com.s3-website-us-east-1.amazonaws.com/
HTTP/1.1 200 OK
x-amz-id-2: DmfUpbglWQ/evhF3pTiXYf6c+gIE8j0F6mw7VmATOpfc29V5tb5YTeojC68jE7Rd
x-amz-request-id: E233603809AF9956
Date: Sun, 18 Aug 2013 07:58:55 GMT
Content-Disposition: attachment
Last-Modified: Sun, 18 Aug 2013 07:05:20 GMT
ETag: "eacded76ceb4831aaeae2805c892fa1c"
Content-Type: text/html
Content-Length: 2585
Server: AmazonS3
This line is the culprit:
Content-Disposition: attachment
If you are using the AWS console, I believe this can be changed by selecting the file in S3 and modifying its meta data by removing this property.
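If you'd rather script the fix than use the console, one option is to copy the object over itself while replacing its metadata. A minimal boto3 sketch (bucket and key names below are placeholders):

```python
# Sketch only: bucket/key names are placeholders. Copying an object over
# itself with MetadataDirective='REPLACE' rewrites its metadata, which
# drops the stored Content-Disposition: attachment header.

def replace_metadata_kwargs(bucket, key, content_type="text/html"):
    """Build copy_object arguments that copy an object onto itself,
    replacing its metadata instead of copying it along."""
    return {
        "Bucket": bucket,
        "Key": key,
        "CopySource": {"Bucket": bucket, "Key": key},
        "MetadataDirective": "REPLACE",
        "ContentType": content_type,
    }

def fix_object(bucket, key):
    import boto3  # needs AWS credentials to actually run
    s3 = boto3.client("s3")
    s3.copy_object(**replace_metadata_kwargs(bucket, key))
```

Note that S3 metadata is immutable after upload, which is why the "edit" is really a self-copy with REPLACE.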
Solution 2 - Amazon S3
If you are using HashiCorp Terraform, you can specify the content_type on an aws_s3_bucket_object as follows:
resource "aws_s3_bucket_object" "index" {
  bucket       = "yourbucketnamehere"
  key          = "index.html"
  content      = "<h1>Hello, world</h1>"
  content_type = "text/html"
}
This should serve your content appropriately in the browser.
Edit 24/05/22: As mentioned in the comments on this answer, Terraform now has a module to help with uploading files and setting their content_type attribute correctly.
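Assuming the module in question is hashicorp/dir/template (which scans a directory and infers each file's content type), the pattern might look like this sketch; the bucket name and directory path are placeholders:

```hcl
# Sketch only: bucket name and base_dir are placeholders.
module "site_files" {
  source   = "hashicorp/dir/template"
  base_dir = "${path.module}/site"
}

resource "aws_s3_bucket_object" "site" {
  for_each = module.site_files.files

  bucket       = "yourbucketnamehere"
  key          = each.key
  source       = each.value.source_path
  content      = each.value.content
  content_type = each.value.content_type  # inferred from the file extension
}
```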
Solution 3 - Amazon S3
If you are doing this programmatically, you can set the ContentType and/or ContentDisposition params in your upload.
[PHP Example]
$output = $s3->putObject(array(
    'Bucket'      => $bucket,
    'Key'         => md5($share) . '.html',
    'ContentType' => 'text/html',
    'Body'        => $share,
));
Solution 4 - Amazon S3
If you are trying to upload with Boto3 on Python 3.7 or above, set the Content-Type like this:
s3 = boto3.client('s3')
s3.upload_file(local_file, bucket, s3_file, ExtraArgs={'ContentType': 'text/html'})
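If you are uploading many file types, the standard-library mimetypes module can pick the Content-Type for you instead of hard-coding text/html. A sketch, with placeholder bucket and key names:

```python
# Sketch: infer the Content-Type from the file name with the standard
# library, falling back to a generic binary type for unknown extensions.
import mimetypes

def guess_content_type(filename, default="binary/octet-stream"):
    """Return a MIME type for the file, or a generic default."""
    ctype, _ = mimetypes.guess_type(filename)
    return ctype or default

def upload_with_type(local_file, bucket, s3_key):
    import boto3  # needs AWS credentials to actually run
    s3 = boto3.client("s3")
    s3.upload_file(
        local_file, bucket, s3_key,
        ExtraArgs={"ContentType": guess_content_type(local_file)},
    )
```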
Solution 5 - Amazon S3
For anyone else facing this issue: there may be a typo in the URL you find under Properties > Static website hosting. For instance, the URL provided is
http://{bucket}.s3-website-{region}.amazonaws.com
but it should be
http://{bucket}.s3-website.{region}.amazonaws.com
Note the . between website and the region. (Which separator is correct depends on the region; some regions use a dash, others a dot.)
Solution 6 - Amazon S3
I have recently had the same issue pop up. The problem was a change of behavior between CloudFront and the S3 origin: if your S3 bucket is configured to serve a static website, your CloudFront origin needs to be the bucket's website endpoint instead of the S3 origin picked from the dropdown. If you are using Terraform, your origin should be aws_s3_bucket.var.website_endpoint instead of aws_s3_bucket.var.bucket_domain_name.
Refer to the AWS documentation on using Amazon S3 website endpoints as CloudFront origins.
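In Terraform, that distinction might look like the following sketch (resource names are placeholders; the website endpoint is plain HTTP, so it must be declared as a custom origin rather than an S3 origin):

```hcl
# Sketch only: resource names are placeholders, and the remaining
# required distribution settings are elided.
resource "aws_cloudfront_distribution" "site" {
  origin {
    domain_name = aws_s3_bucket.site.website_endpoint
    origin_id   = "s3-website"

    custom_origin_config {
      http_port              = 80
      https_port             = 443
      origin_protocol_policy = "http-only"  # website endpoints are HTTP-only
      origin_ssl_protocols   = ["TLSv1.2"]
    }
  }
  # ... default_cache_behavior, viewer_certificate, etc. elided ...
}
```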
Solution 7 - Amazon S3
I recently came across this issue and the root cause seems to be that object versioning was enabled. After disabling versioning on the bucket the index HTML was served as expected.
Solution 8 - Amazon S3
I had the same problem when uploading to an S3 static site from NodeJS. As others have mentioned, the issue was caused by missing the content-type
when uploading the file. When using the web interface, the content-type
is automatically applied for you; however, when manually uploading you will need to specify it. List of S3 Content Types.
In NodeJS, you can attach the content type like so:
const { extname } = require('path');
const { createReadStream } = require('fs');
const AWS = require('aws-sdk');

const s3 = new AWS.S3();

// add more types as needed
const getMimeType = ext => {
  switch (ext) {
    case '.js':
      return 'application/javascript';
    case '.html':
      return 'text/html';
    case '.txt':
      return 'text/plain';
    case '.json':
      return 'application/json';
    case '.ico':
      return 'image/x-icon';
    case '.svg':
      return 'image/svg+xml';
    case '.css':
      return 'text/css';
    case '.jpg':
    case '.jpeg':
      return 'image/jpeg';
    case '.png':
      return 'image/png';
    case '.webp':
      return 'image/webp';
    case '.map':
      return 'binary/octet-stream';
    default:
      return 'application/octet-stream';
  }
};

(async () => {
  const file = 'index.html';
  const params = {
    Bucket: 'myBucket',
    Key: file,
    Body: createReadStream(file),
    ContentType: getMimeType(extname(file)),
  };
  await s3.putObject(params).promise();
})();
Solution 9 - Amazon S3
If you are using AWS S3 with Bitbucket Pipelines (Python), add a content_type parameter as follows:
s3_upload.py
def upload_to_s3(bucket, artefact, bucket_key, content_type):
    ...

def main():
    ...
    parser.add_argument("content_type", help="Content Type File")
    ...
    if not upload_to_s3(args.bucket, args.artefact, args.bucket_key, args.content_type):
and modify bitbucket-pipelines.yml as follows:
...
- python s3_upload.py bucket_name file key content_type
...
Where the content_type param can be one of the MIME types (IANA media types).
Solution 10 - Amazon S3
I've been through the same issue and I have resolved this way. At S3 Bucket, click o index.html checkbox, click con Actions tab, Edit Metadata, and you will notice that in Metadata options says "Type: System defined, Key: Content-Type, Value: binary/octet-stream". Change Value and put "html" and save the changes. Then click at index.html, "Open" button. That worked for me.