Problem: project limit is 120 files and our project is already beyond that.
Solution: Paid versions could either increase the number of files or the size of the project in GB. This would make it more suitable for professional use, and it would generate revenue 💰
Really good one @TurboTobias! Also thanks a ton for becoming a patron 😄.
We will do this as soon as we've moved to Kubernetes, which should be within a month.
Aaand... we finally moved to Google Kubernetes Engine! :smiley:
Is this still happening? Would love this for my team.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
@CompuIves, what is the status on this?
The limit has been raised a couple of times now (it's currently at 250 modules). I'm considering making the max around ~1000 files for patrons. It's a bit harder to implement than expected, because our database validation doesn't have a notion of a user when running the validation on file count (it's an abstraction that allows us to swap files between sandboxes). I'll need to create the check somewhere else.
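To illustrate the idea of moving the check out of generic model validation and into a layer that knows who the user is, here is a minimal TypeScript sketch. All names, the `isPatron` flag, and the patron limit of 1000 are assumptions for illustration, not CodeSandbox's actual implementation; only the free-tier limit of 250 comes from the thread.

```typescript
// Hypothetical application-layer guard: unlike the database validation
// described above, this check can see which user owns the sandbox and
// apply a per-tier limit. Names and the patron limit are illustrative.

interface User {
  isPatron: boolean;
}

const FREE_FILE_LIMIT = 250;   // current limit mentioned in the thread
const PATRON_FILE_LIMIT = 1000; // proposed patron limit (assumption)

function maxFilesFor(user: User): number {
  return user.isPatron ? PATRON_FILE_LIMIT : FREE_FILE_LIMIT;
}

// Called before persisting a new module, instead of relying on a
// user-agnostic count validation at the database layer.
function canAddFile(user: User, currentFileCount: number): boolean {
  return currentFileCount < maxFilesFor(user);
}
```

The point of the sketch is only where the check lives: once it runs in application code with a `User` in scope, raising the ceiling for one tier no longer requires the storage layer to know about users.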
Have been a patron for a long time. However, this is a painful issue for us. We have lots of editable content (aka lessons) in the form of JSON files, and thus we hit the 250-file limit easily (we currently have about 300 files).
+1 with the use case of component library demos for documentation. If max files was around ~1000 for patrons we'd be good. Thank you!
GitPod is a good alternative for projects with a lot of files.
@ryanpcmcquen I appreciate the direction, but from what I can tell GitPod doesn't support embeds? That's a key feature as we're trying to publish embeds as part of library documentation.
@glue-eng-core, I don't know your use case, but you could flip the paradigm and run your docs through GitPod, making everything editable and giving anyone viewing the docs their own instance.
We currently have a limit of 250 files, we're actually working on raising this limit to be size limited instead of count limited by refactoring how we save files. We already have the backend service for this, and are now working on refactoring the client to support multiple data sources. This would also mean that you can choose where to save to (like Dropbox).
I did try to naively raise the limit to 1000, but that makes forking quite slow (can take a literal minute). So let's get this refactor of the files done as soon as possible!
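The comment above describes two ideas: a size limit instead of a count limit, and pluggable data sources (e.g. Dropbox). A minimal TypeScript sketch of what that shape could look like follows; the interface, names, and byte budget are assumptions for illustration, not the actual backend API.

```typescript
// Hypothetical "multiple data sources" shape: each source enforces a
// byte-size budget rather than a file count. All names are illustrative.

interface ModuleFile {
  path: string;
  code: string;
}

interface DataSource {
  name: string;
  maxBytes: number; // size-limited instead of count-limited
  save(files: ModuleFile[]): Promise<void>;
}

function totalBytes(files: ModuleFile[]): number {
  // Measure actual encoded size, not character count.
  return files.reduce(
    (sum, f) => sum + new TextEncoder().encode(f.code).length,
    0
  );
}

async function saveSandbox(
  source: DataSource,
  files: ModuleFile[]
): Promise<void> {
  const size = totalBytes(files);
  if (size > source.maxBytes) {
    throw new Error(
      `Sandbox is ${size} bytes; ${source.name} allows at most ${source.maxBytes}`
    );
  }
  await source.save(files);
}
```

With this shape, "choose where to save to" is just passing a different `DataSource`, and the limit travels with the source rather than being a single global file count.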
File limit is now 500 and should be noticeably faster.
This issue is stale because it has been open many days with no activity. It will be closed soon unless the stale label is removed or a comment is made.
This issue has been automatically closed because there wasn't any activity after the previous notice or the stale label wasn't removed.