Here's my setup:
from boto3.session import Session
AwsSession = Session(aws_access_key_id='ACCESS_ID', aws_secret_access_key='ACCESS_SECRET')
S3 = AwsSession.resource("s3")
And here's where I'm trying to put an object with an Expires value (a datetime) of 5 minutes from now:
import datetime
now = datetime.datetime.now()
expires = now + datetime.timedelta(minutes=5)
S3.Bucket('BUCKET_NAME').put_object(Key=image_name, Body=image_data, Expires=expires, ACL='public-read', ContentType="image", ContentDisposition="inline")
Running that code throws this error:
Traceback (most recent call last):
File "aws_test.py", line 43, in <module>
Main()
File "aws_test.py", line 11, in __init__
self.main()
File "aws_test.py", line 34, in main
img = self.upload_image_datetime(f, "test_image_{}".format(t))
File "aws_test.py", line 19, in upload_image_datetime
self.S3.Bucket('BUCKET_NAME').put_object(Key=image_name, Body=image_data, Expires=expires, ACL='public-read', ContentType="image", ContentDisposition="inline")
File "/usr/local/lib/python2.7/site-packages/boto3/resources/factory.py", line 344, in do_action
response = action(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/boto3/resources/action.py", line 77, in __call__
response = getattr(parent.meta.client, operation_name)(**params)
File "/usr/local/lib/python2.7/site-packages/botocore/client.py", line 269, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/usr/local/lib/python2.7/site-packages/botocore/client.py", line 310, in _make_api_call
api_params, operation_model)
File "/usr/local/lib/python2.7/site-packages/botocore/client.py", line 351, in _convert_to_request_dict
api_params, operation_model)
File "/usr/local/lib/python2.7/site-packages/botocore/validate.py", line 275, in serialize_to_request
operation_model)
File "/usr/local/lib/python2.7/site-packages/botocore/serialize.py", line 400, in serialize_to_request
shape_members)
File "/usr/local/lib/python2.7/site-packages/botocore/serialize.py", line 478, in _partition_parameters
value = self._convert_header_value(shape, param_value)
File "/usr/local/lib/python2.7/site-packages/botocore/serialize.py", line 504, in _convert_header_value
datetime_obj = parse_timestamp(value)
File "/usr/local/lib/python2.7/site-packages/botocore/utils.py", line 317, in parse_timestamp
return dateutil.parser.parse(value)
File "/usr/local/lib/python2.7/site-packages/dateutil/parser.py", line 1008, in parse
return DEFAULTPARSER.parse(timestr, **kwargs)
File "/usr/local/lib/python2.7/site-packages/dateutil/parser.py", line 392, in parse
res = self._parse(timestr, **kwargs)
File "/usr/local/lib/python2.7/site-packages/dateutil/parser.py", line 492, in _parse
l = _timelex.split(timestr) # Splits the timestr into tokens
File "/usr/local/lib/python2.7/site-packages/dateutil/parser.py", line 174, in split
return list(cls(s))
File "/usr/local/lib/python2.7/site-packages/dateutil/parser.py", line 171, in next
return self.__next__() # Python 2.x support
File "/usr/local/lib/python2.7/site-packages/dateutil/parser.py", line 164, in __next__
token = self.get_token()
File "/usr/local/lib/python2.7/site-packages/dateutil/parser.py", line 82, in get_token
nextchar = self.instream.read(1)
AttributeError: 'datetime.datetime' object has no attribute 'read'
It looks like we can't pass a datetime.datetime object for Expires. However, the documentation (http://boto3.readthedocs.org/en/latest/reference/services/s3.html#S3.Client.put_object) says to give it a datetime object:
Expires (datetime) -- The date and time at which the object is no longer cacheable.
Now, I also tried passing Expires a string containing the ISO format of the datetime object:
now = datetime.datetime.now()
expires = now + datetime.timedelta(minutes=5)
expires = expires.isoformat()
S3.Bucket('BUCKET_NAME').put_object(Key=image_name, Body=image_data, Expires=expires, ACL='public-read', ContentType="image", ContentDisposition="inline")
So, this _sort of_ works. It doesn't error out, but the object also never expires on the Amazon side. That makes me think there may be something I need to do on my end to make/let it actually expire (and be removed).
I know Amazon doesn't immediately remove expired files, but even going back to things I uploaded days ago (also given a 5-minute expiration period), I find that they still exist and have not expired.
So, I'm not really sure if this is a documentation problem, or a legitimate code problem. I'd really like to be able to upload a file which lasts only ~5 minutes. Any ideas?
Thanks for your time
EDIT:
Looks like giving it a timestamp in the past (isoformat) doesn't do anything either:
now = datetime.datetime.now()
expires = now - datetime.timedelta(minutes=30)
expires = expires.isoformat()
S3.Bucket('BUCKET_NAME').put_object(Key=image_name, Body=image_data, Expires=expires, ACL='public-read', ContentType="image", ContentDisposition="inline")
That still goes through and shows up fine. Curious.
Also, I tried passing it time.time(), and while it didn't error, it also didn't expire the files.
Looking through the code, everything I'm doing _is_ supported, which leads me to think the problem is either timezones or bucket settings. Or maybe I'm just misunderstanding how Expires _actually_ works.
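For what it's worth, here's a sketch (not a confirmed fix) that rules out format/timezone ambiguity: the Expires value ultimately travels as an HTTP-date header, so building the string explicitly from UTC in RFC 1123 form avoids the problem that isoformat() on a naive local datetime carries no timezone at all.

```python
import datetime

# Build an RFC 1123 HTTP-date from UTC, e.g. "Mon, 24 Aug 2015 15:39:49 GMT".
# A naive local datetime's isoformat() has no timezone marker, so S3 would
# have to guess what it means; this form is unambiguous.
expires = datetime.datetime.utcnow() + datetime.timedelta(minutes=5)
expires_header = expires.strftime('%a, %d %b %Y %H:%M:%S GMT')
```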
Interesting. I was not able to reproduce the issue using Python 2.7.5. Here is what I tried:
import datetime
from boto3.session import Session
aws_session = Session()
s3 = aws_session.resource("s3")
bucket = 'mybucketfoo'
key = 'mykey'
body = b'foo'
now = datetime.datetime.now()
expires = now + datetime.timedelta(minutes=5)
response = s3.Bucket(bucket).put_object(
    Key=key, Body=body, Expires=expires, ACL='public-read',
    ContentType="image", ContentDisposition="inline")
Then to check that expires was applied, I used the CLI:
$ aws s3api head-object --bucket mybucketfoo --key mykey
{
    "AcceptRanges": "bytes",
    "ContentType": "image",
    "LastModified": "Mon, 24 Aug 2015 22:34:51 GMT",
    "ContentLength": 3,
    "Expires": "Mon, 24 Aug 2015 15:39:49 GMT",
    "ETag": "\"acbd18db4cc2f85cedef654fccc4a4d8\"",
    "ContentDisposition": "inline",
    "Metadata": {}
}
Could you try running the code and see if you get the same issue? I also have a feeling it may be a dependency issue as the error is encountered in dateutil. Could you tell me what version of botocore, boto3, and python-dateutil you are using? I am running botocore 1.1.9, boto3 1.1.1, and python-dateutil 2.4.2.
Hey, @kyleknap
Thanks for the response.
I'm using botocore 1.1.5, boto3 1.1.1, and python-dateutil 2.4.2.
I've updated botocore and I'm running some tests again. It no longer errors out when I give it a datetime object, but quick 5-minute tests don't seem to be expiring.
Did your test uploads actually expire on the time that they were set to?
I'll do some longer tests and get back to you before tomorrow morning.
EDIT: Confirmed that there's actually an 'Expires' header on the uploaded objects.
EDIT2: Looks like things still aren't expiring... Boto seems to be doing what it should, but my objects just aren't actually expiring.
I have the same issue: botocore v1.1.12, boto3 v1.1.2, dateutil v2.4.2.
Here's what I get: no errors, but no Expiry Date in the header of the file on S3 (screenshot omitted), just a metadata item called Expires. As far as I can tell, these objects do not actually expire (unless I'm not waiting long enough?).
They do have an Expires item as reported by the API, however:
$ aws s3api head-object --bucket ageobot --key foo
{
    "Expires": "Fri, 04 Sep 2015 15:45:54 GMT",
    "Metadata": {},
    "ContentLength": 5562296,
    "LastModified": "Fri, 04 Sep 2015 14:45:55 GMT",
    "ContentType": "binary/octet-stream",
    "AcceptRanges": "bytes",
    "ETag": "\"0ede703bbd635ddabfef1caa5cc1cb2c\""
}
Possibly related... I notice that the Last Modified date has a timezone (GMT-3, which is right), but the Expires key does not. So if it's 11 am, say, and I pass one hour from now as Expires, then I get a creation time of 11 am GMT-3, but an expiry time of 12 pm GMT, which is actually 2 hours _ago_ in my timezone. So the way I got the apparent 1-hour expiry in the example above was to compute the expiry datetime with 240 minutes.
Here's my code:
session = boto3.session.Session(aws_access_key_id=KEY,
                                aws_secret_access_key=SECRET,
                                region_name='us-east-1')
client = session.client('s3')
now = datetime.datetime.now()
expires = now + datetime.timedelta(minutes=240)
params = {'Body': databytes,
          'Expires': expires,
          'Bucket': bucket,
          'Key': key,
          'ACL': acl,
          }
r = client.put_object(**params)
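One way to sidestep the local-time skew described above is to compute the expiry as a timezone-aware UTC datetime, so the serialized value and the GMT time S3 reports back agree. This is a sketch assuming Python 3's datetime.timezone (the thread above is on Python 2.7, where pytz would be needed instead):

```python
import datetime

# An aware UTC datetime carries an explicit offset, so the serialized
# Expires matches the GMT value S3 reports back, instead of a naive
# local time being interpreted as GMT.
now = datetime.datetime.now(datetime.timezone.utc)
expires = now + datetime.timedelta(minutes=60)
```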
I have the exact same issue as @kwinkunks and @kyleknap. Using a datetime object or a formatted string doesn't help. The Expires Date field in S3 shows None, whereas a new header shows up with the correct expiry date.
If anyone has found a solution please update this issue.
It seems that S3 does not support per object expiration.
The Expires header is not meant to set the TTL of the object. The S3 documentation says "Expires: The date and time at which the object is no longer cacheable" (http://docs.aws.amazon.com/AmazonS3/latest/API/RESTObjectPUT.html).
To get your files automatically deleted by S3 you need to create Lifecycle rules on your bucket.
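To illustrate, here's a minimal sketch of such a lifecycle rule. The rule ID, bucket name, and key prefix are hypothetical, the finest granularity is whole days (not minutes), and the API call itself is shown commented out since it requires boto3 and live credentials:

```python
# Lifecycle rules apply per bucket/prefix with day granularity -- S3 has
# no per-object, minute-level TTL.
lifecycle_config = {
    'Rules': [{
        'ID': 'expire-temp-uploads',      # hypothetical rule name
        'Filter': {'Prefix': 'temp/'},    # hypothetical key prefix
        'Status': 'Enabled',
        'Expiration': {'Days': 1},        # delete objects about 1 day old
    }]
}

# To apply it (requires boto3 and credentials):
# import boto3
# boto3.client('s3').put_bucket_lifecycle_configuration(
#     Bucket='BUCKET_NAME', LifecycleConfiguration=lifecycle_config)
```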
Oh, I suppose if that's true then this issue is less than helpful :)
I'll close it now.