Originally brought up in https://github.com/nuxt-community/i18n-module/issues/815 and related to https://github.com/nuxt/nuxt.js/issues/6467
It's not always obvious what makes up script evaluation time in a live deployment, or how much of the initial performance impact comes from adding new modules. This is especially a problem when you're looking at your site through a non-interactive report generated by a perf tool like Lighthouse or sitespeed.io.
It would be nice if the User Timing API could be used to automatically add performance marks to all Nuxt modules. That way the performance impact of each module would be immediately obvious.
Even if it's not technically possible to add marks automatically at the Nuxt level because of all the different possibilities, it could at least be made standard practice in @nuxt and nuxt-community managed modules to add some level of user timing marks.
https://web.dev/user-timings/
https://developer.mozilla.org/en-US/docs/Web/API/User_Timing_API/Using_the_User_Timing_API
I guess you've made a mental shortcut, but "modules" run at build time, so timing them wouldn't be that important.
What Nuxt can and should time are the runtime steps, such as plugins and middleware.
For plugins and middleware it would probably need to track which module (if any) each one came from, so that one could tell by looking at the metrics.