For a long time, we've wanted to be able to write end-to-end performance tests for AMP page rendering and interaction. This issue is meant to track the investigation of what technology to use, and following that, the implementation and deployment of these tests.
Ideally, these tests ought to be a part of continuous integration testing for PRs and push builds.
Based on https://github.com/travis-ci/travis-ci/issues/352 and https://github.com/travis-ci/travis-ci/issues/3313, there is no native support from Travis for performance tests, due to the lack of a guarantee that all tests are run on equally powerful VMs.
One option for a third party tool that integrates with Travis CI is here: https://loadimpact.com/integrations/performance-testing-travisci
Edit: Seems like this tool does a lot more than we really need. Sample test result here.
/cc @erwinmombay @cramforce @dvoytenko @choumx for comments on past efforts in this direction.
More options to consider:
browser-perf: https://github.com/axemclion/browser-perf
perfjankie: https://github.com/axemclion/perfjankie
phantomas: https://github.com/macbre/phantomas
It's also important to note that we write these events to the performance timeline. That can be read using the DevTools protocol or `window.performance.getEntriesByType('mark')` (from memory).
This issue hasn't been updated in awhile. @rsimha Do you have any updates?
This has been on the infra backlog for a while, but unfortunately, we don't yet have a solution.
@kristoferbaxter I recall you're having someone on your team tackle this during the summer. Moving to Performance. Let us know if there are any tooling needs -- happy to help.
Here's the plan to add `gulp performance` to our CI, and to run `gulp performance` as we go :)

👋 @ampproject/wg-performance folks! Bumping this old issue because I'm build cop this week and noticed that the performance tests currently take ~30 minutes on Travis with ~20% of the tests failing.
A couple of questions:
@rsimha I only started looking at this. What are the variances like on these numbers?
cc @kevinkimball
> What are the variances like on these numbers?
The variance is low (performance test runs consistently take ~30 minutes), while the failure rate is high (~20% of tests fail).
If they are consistently failing, I'm all for skipping them.
@kevinkimball Would it make sense to adjust or check against these performance tests per release?

I honestly have not looked at these at all during releases, but with our drive on performance, I wonder if we should give them more weight than we currently do.
I am working on figuring out what can be done to improve the stability of these tests. I will update by EOW.
In addition to the images test failing consistently, it seems like the browser occasionally fails to launch (see example). Not sure if we need a separate issue to track that.
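A generic retry wrapper is one common mitigation for flaky launches like this. A minimal sketch, where the flaky operation (e.g. a browser launch call) is passed in as a function; none of these names come from the actual test harness:

```javascript
// Retry an async operation a few times before giving up.
// `fn` would wrap whatever launcher the harness uses, e.g. a browser launch.
async function withRetries(fn, attempts = 3) {
  let lastErr;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      console.warn(`Attempt ${i + 1} failed: ${err.message}`);
    }
  }
  throw lastErr;
}
```

This keeps a one-off launch failure from failing the whole run, while still surfacing the error if every attempt fails.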