**Is your feature request related to a problem? Please describe.**
I saw that the S3 bucket for my project is getting rather large. I looked in there and it appears that all of my old builds are persisted in there.
**Describe the solution you'd like**
Is it possible to auto-purge old builds? A similar setup I’ve seen in other tools is something like “Only keep the 10 most recent builds”.
**Describe alternatives you've considered**
I could manually delete old build files. I don’t know if this might cause issues within the AWS Amplify Console or some other Amplify-related tools.
I think this could be solved by adding a versioning question to the hosting configure flow. Then the build files could be uploaded without a hash.
@mrcoles are you aware of S3 lifecycle policies, by the way? This can't be done through the CLI currently, but you can set a lifecycle policy to automatically delete objects, or archive them to cheaper storage, based on upload timestamps.
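For anyone landing here, a lifecycle rule like the one below expires objects by age (not by count) and can be attached to the bucket with `aws s3api put-bucket-lifecycle-configuration`. This is just a sketch; the empty prefix applies the rule to the whole bucket, and 30 days is an arbitrary placeholder, not anything Amplify recommends:

```json
{
  "Rules": [
    {
      "ID": "expire-old-builds",
      "Filter": { "Prefix": "" },
      "Status": "Enabled",
      "Expiration": { "Days": 30 }
    }
  ]
}
```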
@jkeys-ecg-nmsu thanks for sharing! Do you know if anything in the Amplify console/website is at risk of breaking if I purge stuff too fast?
It makes more sense to me to keep the n most recent builds and delete the rest, and I don’t see that as an option with S3 lifecycle policies. I threw together this Node script that does that, which I could presumably run as a cron job: clean-amplify-s3-builds.js
Any thoughts? This feels like an overcomplicated solution to me.
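For reference, the core of a "keep the n most recent" script can be a pure function like the sketch below, separated from the AWS calls so it's easy to test. The object shape (`Key`, `LastModified`) matches what S3's ListObjectsV2 returns; everything else here is my own naming, not from the script above:

```javascript
// Given a list of S3 objects ({ Key, LastModified }), return the keys of
// everything EXCEPT the `keep` newest objects, i.e. the builds to delete.
function selectStaleBuilds(objects, keep) {
  return objects
    .slice() // don't mutate the caller's array
    .sort((a, b) => b.LastModified - a.LastModified) // newest first
    .slice(keep) // drop the `keep` newest
    .map((obj) => obj.Key);
}
```

In a real cron job you would feed this the pages from ListObjectsV2 and pass the returned keys to DeleteObjects, but keeping the selection logic pure means the age/count policy can be unit-tested without any AWS credentials.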
Amplify team, please prioritize this - the files just keep growing!!!
Up
Also running into this problem! Always hitting the 100-buckets-per-project limit.
Here is my +1 too, I would really like this to be implemented.
+111111