Documentation: Meta-Issue: Performance Testing

Created on 20 Sep 2018 · 5 comments · Source: Islandora/documentation

This is a meta-issue to track the progress of performance testing the CLAW stack. Please refer to this issue in any subsequent issues to link them.

Islandora CLAW's architecture was designed to be scalable, but we have yet to set up a complex distributed install and see how it performs under heavy load. Performance statistics can be slippery and misleading, so we'll want to isolate as many variables as possible. Things to consider are:

  • Performance between various installation setups (all-in-one vs. separate Solr and Fedora vs. load-balanced microservices, etc.)
  • Performance handling high volumes of objects (ingest 1000 vs 10000 vs 1000000 objects)
  • Performance handling large files (what does a 1 GB file do? A 1 TB file?)
  • Performance degradation as # of fields increases (added as per @mjordan and @DiegoPino's comments)
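Whatever harness we settle on, the ingest tests above boil down to timing many repeated object creations and summarizing the results. A minimal sketch of such a timing harness (the `ingest_one` callable here is a hypothetical stand-in; a real run would swap in, e.g., a POST against the Drupal REST endpoint for the content type under test):

```python
# Minimal ingest-timing harness sketch. `ingest_one` is a placeholder
# for whatever actually creates one object in the stack under test.
import time
import statistics


def measure_ingest(ingest_one, count):
    """Time `count` calls to ingest_one(i) and return summary stats in seconds."""
    durations = []
    for i in range(count):
        start = time.perf_counter()
        ingest_one(i)
        durations.append(time.perf_counter() - start)
    return {
        "count": count,
        "total": sum(durations),
        "mean": statistics.mean(durations),
        "median": statistics.median(durations),
        "max": max(durations),
    }


if __name__ == "__main__":
    # Stand-in ingest for demonstration only; replace with a real call.
    def fake_ingest(i):
        time.sleep(0.001)

    stats = measure_ingest(fake_ingest, 100)
    print(f"{stats['count']} ingests in {stats['total']:.2f}s "
          f"(mean {stats['mean'] * 1000:.1f} ms)")
```

Running the same harness at 1,000 / 10,000 / 1,000,000 objects, against each installation setup, would let us compare like with like.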
Labels: Meta Issue, Roadmap, architecture, documentation, help wanted

All 5 comments

Should we also be load testing Drupal itself for number of nodes and number of requests? Might be useful to have a baseline while testing the Islandora-specific components/configurations.

I agree, Mark. IMHO the key is testing the number of Node and Media entities against the number of Content types, Media types, and attached Fields that Islandora provides/suggests, including taxonomy. Each Field is one DB table, so that piece matters.
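To illustrate why per-field tables matter, here is a toy sketch (not Drupal's actual schema, just the same shape): each field lives in its own table keyed by entity id, so fully loading one entity means one join per field, and query cost grows with field count.

```python
# Toy model of per-field storage: one table per "field", joined back
# to the entity table on load. This is NOT Drupal's real schema.
import sqlite3


def build_schema(conn, n_fields, n_entities):
    conn.execute("CREATE TABLE entity (id INTEGER PRIMARY KEY)")
    conn.executemany("INSERT INTO entity VALUES (?)",
                     [(i,) for i in range(n_entities)])
    for f in range(n_fields):
        conn.execute(f"CREATE TABLE field_{f} (entity_id INTEGER, value TEXT)")
        conn.executemany(f"INSERT INTO field_{f} VALUES (?, ?)",
                         [(i, f"v{f}-{i}") for i in range(n_entities)])


def load_entity(conn, n_fields, entity_id):
    # One join per field, mirroring a full entity load.
    joins = " ".join(
        f"JOIN field_{f} ON field_{f}.entity_id = entity.id"
        for f in range(n_fields))
    cols = ", ".join(f"field_{f}.value" for f in range(n_fields))
    sql = f"SELECT {cols} FROM entity {joins} WHERE entity.id = ?"
    return conn.execute(sql, (entity_id,)).fetchone()


conn = sqlite3.connect(":memory:")
build_schema(conn, n_fields=20, n_entities=100)
row = load_entity(conn, 20, 42)
print(len(row))  # 20 field values for entity 42
```

Timing `load_entity` at increasing `n_fields` values would give a rough feel for the degradation curve the bullet above asks about.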

@mjordan @DiegoPino In some sense that would be tested with the "high number of objects" test, but testing # of fields is definitely a factor we should give careful consideration.

Edited the issue description to include that. Thanks guys.

In theory, we should be able to deploy to multiple machines just by modifying Ansible configurations. We need to test this out as well. I suggest we add the following to the scope of this ticket:

  • Distributed installation/deployment testing
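As a starting point, the distributed test could be driven by an Ansible inventory that splits the stack across hosts. The group and host names below are a hypothetical sketch, not the actual groups from the CLAW playbook:

```ini
# Hypothetical inventory sketch for a distributed deployment test.
[webserver]
drupal.example.org

[fedora]
fedora.example.org

[solr]
solr.example.org

[database]
db.example.org
```

Re-running the same ingest and load tests against this topology vs. the all-in-one install would isolate the cost (or benefit) of distribution itself.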