Error "Read-only file system" in AWS Lambda when downloading a file from S3

Python, Amazon S3, AWS Lambda, Boto3

Python Problem Overview


I'm seeing the error below from my Lambda function when I drop a file.csv into an S3 bucket. The file is not large, and I even added a 60-second sleep prior to opening the file for reading, but for some reason the file name has the extra ".6CEdFe7C" appended to it. Why is that?

[Errno 30] Read-only file system: u'/file.csv.6CEdFe7C': IOError
Traceback (most recent call last):
File "/var/task/lambda_function.py", line 75, in lambda_handler
s3.download_file(bucket, key, filepath)
File "/var/runtime/boto3/s3/inject.py", line 104, in download_file
extra_args=ExtraArgs, callback=Callback)
File "/var/runtime/boto3/s3/transfer.py", line 670, in download_file
extra_args, callback)
File "/var/runtime/boto3/s3/transfer.py", line 685, in _download_file
self._get_object(bucket, key, filename, extra_args, callback)
File "/var/runtime/boto3/s3/transfer.py", line 709, in _get_object
extra_args, callback)
File "/var/runtime/boto3/s3/transfer.py", line 723, in _do_get_object
with self._osutil.open(filename, 'wb') as f:
File "/var/runtime/boto3/s3/transfer.py", line 332, in open
return open(filename, mode)
IOError: [Errno 30] Read-only file system: u'/file.csv.6CEdFe7C'

Code:

import boto3

s3 = boto3.client('s3')

def lambda_handler(event, context):

    s3_response = {}
    counter = 0
    event_records = event.get("Records", [])

    s3_items = []
    for event_record in event_records:
        if "s3" in event_record:
            bucket = event_record["s3"]["bucket"]["name"]
            key = event_record["s3"]["object"]["key"]
            filepath = '/' + key
            print(bucket)
            print(key)
            print(filepath)
            s3.download_file(bucket, key, filepath)

The result of the above is:

mytestbucket
file.csv
/file.csv
[Errno 30] Read-only file system: u'/file.csv.6CEdFe7C'

If the key/file is "file.csv", then why does the s3.download_file method try to download "file.csv.6CEdFe7C"? I'm guessing that when the function is triggered, the file is file.csv.xxxxx, but by the time it gets to line 75, it has been renamed to file.csv?

Python Solutions


Solution 1 - Python

Only /tmp seems to be writable in AWS Lambda.

Therefore this would work:

filepath = '/tmp/' + key

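A minimal sketch of the corrected handler, assuming a Python 3 runtime and a boto3 client named s3 as in the question (the basename and unquote handling are additions for illustration, not part of the original answer):

import os
import urllib.parse

import boto3

s3 = boto3.client('s3')

def lambda_handler(event, context):
    for record in event.get("Records", []):
        if "s3" not in record:
            continue
        bucket = record["s3"]["bucket"]["name"]
        # S3 event notifications URL-encode the key (e.g. spaces become '+')
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        # /tmp is the only writable path in the Lambda execution environment
        filepath = os.path.join('/tmp', os.path.basename(key))
        s3.download_file(bucket, key, filepath)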

Solution 2 - Python

According to http://boto3.readthedocs.io/en/latest/guide/s3-example-download-file.html, the example shows that one parameter of download_file is the object's name on S3 (the key) and the other is the local path to download it to.

On the other hand, the Amazon Lambda docs note that the function's /tmp directory provides 512 MB of storage.

Thus, we have 512 MB in which to create files. Here is the code from my AWS Lambda function; it works like a charm for me.

.download_file(Key=nombre_archivo,Filename='/tmp/{}'.format(nuevo_nombre))
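For context, a sketch of how that call might be wired up, assuming a boto3 Bucket resource; the bucket name and file names here are placeholders, not from the original answer:

import boto3

s3 = boto3.resource('s3')
bucket = s3.Bucket('mytestbucket')  # placeholder bucket name, taken from the question

nombre_archivo = 'file.csv'   # the object's key on S3
nuevo_nombre = 'file.csv'     # the local file name to save under /tmp
# Bucket.download_file takes the S3 key and the local destination path
bucket.download_file(Key=nombre_archivo, Filename='/tmp/{}'.format(nuevo_nombre))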

Solution 3 - Python

I noticed that when I uploaded the code for my Lambda directly as a zip file, I was able to write only to the /tmp folder, but when I uploaded the code from S3, I was able to write to the project root folder too.

Solution 4 - Python

This also works perfectly in C#:

// 'str' is the source stream and 'fName' the target file name; only /tmp is writable
using (var fileStream = File.Create("/tmp/" + fName))
{
   str.Seek(0, SeekOrigin.Begin);
   str.CopyTo(fileStream);
}

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type        | Original Author | Original Content on Stackoverflow
Question            | user1530318     | View Question on Stackoverflow
Solution 1 - Python | joonas.fi       | View Answer on Stackoverflow
Solution 2 - Python | Egalicia        | View Answer on Stackoverflow
Solution 3 - Python | Pavlo Kovalchuk | View Answer on Stackoverflow
Solution 4 - Python | Nigrimmist      | View Answer on Stackoverflow