The "Delete all content" feature in Labs times out with a collection exceeding 3400 stories. I haven't explored where the exact limit is; 3400 is simply the situation I'm in.
Import 3400+ stories into Ghost, then try to delete all content.
I had to batch this import, as the full 5.5 MB file also times out, though that is probably down to my connection speed.
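For anyone hitting the same import timeout, here is a rough sketch of how the export file could be split into batches. The `{ db: [{ data: { posts } }] }` shape is an assumption about the Ghost JSON export format, and the file names and batch size are made up; verify against your own export (ideally on a throwaway instance) before relying on it.

```ts
import { readFileSync, writeFileSync } from 'node:fs';

const BATCH_SIZE = 500; // arbitrary; pick whatever your connection tolerates

// Assumed export shape: { db: [{ data: { posts: [...], ... } }] }
const dump = JSON.parse(readFileSync('ghost-export.json', 'utf8'));
const posts = dump.db[0].data.posts;

for (let i = 0; i < posts.length; i += BATCH_SIZE) {
  // Copy the whole export, then swap in just one slice of posts.
  // Note: non-post tables (tags, users, ...) repeat in every batch;
  // whether the importer deduplicates them is not verified here.
  const batch = structuredClone(dump);
  batch.db[0].data.posts = posts.slice(i, i + BATCH_SIZE);
  writeFileSync(`batch-${i / BATCH_SIZE}.json`, JSON.stringify(batch));
}
```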
Any other info, e.g.: Why do you consider this to be a bug? What did you expect to happen instead?
I expected the request not to time out.
This is a development environment, and I wouldn't expect this operation to be carried out under normal production circumstances. Of course, I can nuke the instance and start again with no issues.
Just thought I would let you know about the issue.
Hey @Bouncey :wave: I was able to reproduce the timeout with just over 1k posts. Not marking it as a bug, as the content gets removed correctly, but we could definitely use some help making this process snappier :+1:
This could probably be done using the same abstract polling class (and very similar implementation) that's being done in #10340
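For illustration, here is roughly what that could look like from the client side. Everything below is hypothetical: the endpoint paths, response shapes, and names are made up and are not Ghost's actual Admin API or the class from #10340. The idea is that the delete becomes an async job and the client polls a status endpoint, instead of holding one HTTP request open until it times out.

```ts
type JobStatus = 'queued' | 'running' | 'finished' | 'failed';

interface Job {
  id: string;
  status: JobStatus;
}

const sleep = (ms: number) =>
  new Promise<void>((resolve) => setTimeout(resolve, ms));

async function deleteAllContent(baseUrl: string): Promise<void> {
  // 1. Kick off the job; the server answers immediately with a job id
  //    instead of blocking until every post is gone.
  const started = await fetch(`${baseUrl}/db/`, { method: 'DELETE' });
  const { id }: Job = await started.json();

  // 2. Poll the job status with exponential backoff, capped at 10s.
  for (let delay = 500; ; delay = Math.min(delay * 2, 10_000)) {
    await sleep(delay);
    const res = await fetch(`${baseUrl}/jobs/${id}/`);
    const job: Job = await res.json();
    if (job.status === 'finished') return;
    if (job.status === 'failed') throw new Error(`Delete job ${id} failed`);
  }
}
```

The benefit is that no single request has to outlive the whole deletion, so the operation is bounded by the job itself rather than by HTTP timeouts.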
@gargol completely understand you not marking it as a bug. I'm glad you managed to reproduce this issue :+1:
Keep up the great work Ghost people!
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Not stale
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.