Nuxt.js: Huge memory usage of nuxt-link

Created on 28 Sep 2017 · 23 Comments · Source: nuxt/nuxt.js

Sorry for the general description of the case, but I'm struggling with SSR leaks (probably in my own code). The project has only bootstrap-vue and axios as modules, no other Nuxt dependencies.

The actual code is as follows (without it there seem to be no leaks):

async asyncData ({ app, store }) {
    const [, fetchSecond] = await storeChecker(app,
      [
        {
          state: 'FirstData',
          check: current => current && (current.length > 0),
          save: (store, data) => store.commit('FirstData', Object.assign({}, data.data)),
          promiser: axios => axios.get('/FirstData')
        },
        {
          promiser: axios => axios.get('/SecondData?' + createFilter({
            filter: {
              limit: 20,
              order: 'createdAt desc',
              fields: ['name', 'id']
            }
          }))
        }
      ]
    )
    // next code is only grouping entities, tried without it with no luck
    const categories = convertCategoriesArrays([
      item => '/first/' + item.id,
      item => '/second/' + item.id
    ], store.state.FirstData)
    return {
      firstDataType1: categories[0],
      firstDataType2: categories[1],
      secondData: fetchSecond.data
    }
  }

and where storeChecker is defined as follows:

function storeChecker (app, promisesOptions) {
  const awaiters = []

  const axios = app.$axios
  const store = app.store

  for (const option of promisesOptions) {
    if (option.state) {
      // If the store already holds valid data, reuse it; otherwise fetch and commit.
      if (option.check(store.state[option.state])) {
        awaiters.push(store.state[option.state])
      } else {
        awaiters.push(option.promiser(axios).then(data => option.save(store, data)))
      }
    } else {
      // No store slot configured: always fetch.
      awaiters.push(option.promiser(axios))
    }
  }

  return Promise.all(awaiters)
}

Starting two clustered pm2 instances and benchmarking with different ApacheBench requests, such as

ab -n 100 -c 25 -r -k http://localhost:3000/

I see (using pm2 monit or whatever) memory growing linearly. Profiling one instance and comparing heap snapshots gives me a picture where the context seems to be copied on every request: between one request and twenty, I see a difference of twenty copies of the saved data strings in memory.

[heap dump comparison screenshot]

I have tried to avoid closures by putting all the code directly in the asyncData method, but there is no change in memory consumption. Searching around the code for a few days didn't give me a hint.

Using the latest Nuxt version (rc11). I suspect the SSR context is possibly copied via a global between requests, but I cannot prove it. The nearest line in the profile is the Vue.use call, but that isn't much of a hint either.

Any thoughts are appreciated. This is a real problem for me on the server side, because 50 requests (10 concurrent) need ~500 MB of memory each time, and the memory usage accumulates.

This question is available on Nuxt.js community (#c1572)
question

Most helpful comment

Update: found the issue to be unrelated to router-link; adding many router-links just increases the leak n times. The culprit was VeeValidate.

All 23 comments

Same problem!

Helplessly, I installed graceful-cluster to forcibly shut down leaking processes as a workaround.

installed graceful-cluster to forcibly shut down leaking processes as a workaround

Yes, it may be a solution for a while. pm2 also restarts processes when --max-old-space-size is exceeded for the node instance, but it does so via a signal, and performance gets laggy at the edge of the limit. graceful-cluster is better in this case, thanks.
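For reference, the pm2 side of that workaround can be expressed in an ecosystem file, roughly like this (a sketch; the app name, script path and the 500M threshold are assumptions from my setup, not recommendations):

// ecosystem.config.js
module.exports = {
  apps: [
    {
      name: 'nuxt-ssr',                          // hypothetical app name
      script: './node_modules/nuxt/bin/nuxt.js', // hypothetical path to the Nuxt CLI
      args: 'start',
      instances: 2,
      exec_mode: 'cluster',
      // restart a worker when it crosses this limit instead of letting it
      // lag at the edge of --max-old-space-size
      max_memory_restart: '500M'
    }
  ]
}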

But I haven't lost hope of finding the root cause. Heap dumps point me to the regenerator-runtime package and these words in its code:

// Rather than returning an object with a next method, we keep
// things simple and return the next function itself.

and I see many next()'s in the dump comparison... but I don't know what to do with that. Tried to change the Babel presets with no effect. Probably this issue can shed light on what is happening... Investigating.
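For reference, this is roughly the kind of Babel override I mean (a sketch, assuming Nuxt's build.babel option and babel-preset-env; targeting the current Node version should leave async/await native instead of going through regenerator-runtime, though in my case it changed nothing):

// nuxt.config.js
module.exports = {
  build: {
    babel: {
      presets: [
        ['env', { targets: { node: 'current' } }]
      ]
    }
  }
}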

In addition, I have tried to plug in the idle-gc module for forced GC... it does not collect anything. BTW, it would be nice if nuxt.config.js had a 'startup()' method (as Meteor has).
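What I mean by forced GC is, roughly, the crude equivalent below (a sketch; it assumes the node process is started with --expose-gc). Even collections forced this way did not reclaim the memory, which points to live references rather than a lazy garbage collector:

// start the server with: node --expose-gc ...
if (typeof global.gc === 'function') {
  setInterval(() => {
    global.gc()
    // log the heap after a forced collection to see whether anything was reclaimed
    console.log('heapUsed after gc:', process.memoryUsage().heapUsed)
  }, 30000)
}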

Digging through the Vue SSR issues didn't help. Tried to reduce the LRU cache size and age, but it's not the component cache, since the memory is not reused but accumulates on each request. Also did some experiments with runInNewContext: 'once', with no luck.
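For reference, that renderer experiment looks roughly like this in nuxt.config.js (a sketch; options under render.bundleRenderer are passed through to vue-server-renderer's createBundleRenderer):

// nuxt.config.js
module.exports = {
  render: {
    bundleRenderer: {
      // 'once' reuses a single context instead of creating a fresh one per request
      runInNewContext: 'once'
    }
  }
}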

Maybe at some point I will find more than an hour to profile and report back what the heck is going on.

Okay. The heap comparison _possibly_ cracked the case. The problem is in <nuxt-link/>. I have a very big catalog, which was rendered with nuxt (router-) links.

According to this, vue-router keeps a copy of the previous route, in SSR too.

When I changed all nuxt-links to plain HTML anchors, the additional memory usage went back to the initial startup values. Possibly this question could be marked as an issue. I have also noticed that the event loop runs at least three to four times faster, as does each request!
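Concretely, the change was of this shape (a sketch reusing names from my asyncData above; the item fields are assumptions; it trades client-side navigation for flat SSR memory):

<template>
  <ul>
    <li v-for="item in firstDataType1" :key="item.id">
      <!-- before: <nuxt-link :to="'/first/' + item.id">{{ item.name }}</nuxt-link> -->
      <a :href="'/first/' + item.id">{{ item.name }}</a>
    </li>
  </ul>
</template>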

[heap comparison screenshot]

It seems that this issue https://github.com/vuejs/vue-router/issues/1279 sheds some light on what happens.

Will try to update vue-router manually, if that is possible, and report back with a reproduction or a solution :)

Made separate pages with different cases. Actually, there are two separate things here:

  1. Large memory usage with nuxt-link (vue-router history mode on the server side), but it doesn't leak.
  2. A combination of a data fetch/creation promise whose result then flows into nuxt-links through a closure; this prevents the process memory from returning to its initial size.

The repository here: https://github.com/AndruxaSazonov/leak-repro

I tried your repo and reproduced the memory leak and the high CPU usage.
So <nuxt-link /> is the culprit! Great job!
You should change the title of this issue :)

but why??

but why??

Don't know. The router in Nuxt contains a reference to the app, and rendering a router-link may use some child data of the context. This is a hypothesis, nothing more.

Looking through the Nuxt code didn't help. Probably it is not a Nuxt problem but a Vue SSR one. No time for now to check router-link in a plain SSR setup.

One more thing: changing nuxt-link to router-link gives the same picture, hence the thoughts above.

More and more interesting!! Please check some additions in the repro:
https://github.com/AndruxaSazonov/leak-repro/commit/119b1ef91d449c8c9ce8e1aa498fa360d5dc6eff

It seems it is not actually the link, nor the rendering itself, that copies the context (if it is copied at all). The problem is when you nest components!... :(

And why is $router available in the included component...

Found time to test on vue-ssr-boilerplate with the latest dependency set in package.json... better if I hadn't :/

The same story with the behaviour of router-link versus an HTML anchor in the src/views/Home.vue tests... Anchors are 14 times faster.

This problem is a show-stopper in my projects anyway, since I have a very large number of internal site links on each page (it is a catalog).

Will use manual hooks on anchors as a workaround for the time being. And graceful-cluster too.
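The "manual hooks" I mean are roughly this (a sketch): keep plain anchors in the markup and restore SPA navigation on the client with one delegated click handler that goes through $router.push:

// catalog page component (sketch): one listener instead of hundreds of <nuxt-link> instances
export default {
  mounted () {
    this.$el.addEventListener('click', this.onLinkClick)
  },
  beforeDestroy () {
    this.$el.removeEventListener('click', this.onLinkClick)
  },
  methods: {
    onLinkClick (event) {
      const anchor = event.target.closest('a[href^="/"]') // internal links only
      if (!anchor) return
      event.preventDefault()
      this.$router.push(anchor.getAttribute('href'))
    }
  }
}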

Thanks @AndruxaSazonov for your detailed description of this issue. Did you ever find a solution?
Adding 600 router-links to my app brings it down very quickly.

Update: found the issue to be unrelated to router-link; adding many router-links just increases the leak n times. The culprit was VeeValidate.

I got the same issue, but no solution.

Finding memory leaks is one of the more difficult tasks when working with long running programs.
One of the best ways to see what type of objects are increasing over time is taking heap dumps as the original poster of this issue has done. There are many guides on how to do this, like this one: https://marmelab.com/blog/2018/04/03/how-to-track-and-fix-memory-leak-with-nodejs.html
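For example, snapshots can be taken programmatically with the heapdump package (a sketch; on Linux the module can also write a snapshot when the process receives SIGUSR2), and then compared in the Memory tab of Chrome DevTools:

// requires `npm install heapdump`
const heapdump = require('heapdump')

// write a snapshot every minute while the benchmark runs
setInterval(() => {
  heapdump.writeSnapshot('/tmp/nuxt-' + Date.now() + '.heapsnapshot', (err, filename) => {
    if (err) console.error(err)
    else console.log('heap snapshot written to', filename)
  })
}, 60000)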

In Nuxt.js it will probably be an ever-increasing number of Vue instances, which might not shed much light on the core issue. My advice would be to first find a way to test the issue quickly. This can be done with a tool like ApacheBench (again, as described in this issue) to send many requests per second to your server, while watching memory consumption over time with, for instance, htop -p <your node process id>, or a Node.js memory-logging module from npm. Make sure you disable caching, as it can prevent the leak from showing up. Next, disable third-party plugins one at a time and see if the problem disappears. If not, disable your own plugins or components.

I found this approach more practical than sifting through heap dumps. If you use git, of course first create a new branch, as your modifications to the project can quickly pile up.

A year ago this issue was fixed, and you are incorrect about the method.

Could you please explain how to fix this issue?

@AndruxaSazonov I've read most of your comments about the leak and the steps you took to find it, since now I am facing the same problem in production.

I am using Kubernetes on Google Cloud and this is how my Deployment looks:
[screenshot: Kubernetes Engine deployment memory graph, https://user-images.githubusercontent.com/25027019/44732012-bb832680-aae4-11e8-8adf-8b2aad65cd3e.png]

Note that it takes about 90 minutes from the initial start to the 4 GB memory limit, then the pod auto-restarts, and so on.

What is your suggestion for finding the memory leak in my case, or how did you solve the issue?
I am using Vuetify and I am not using axios-module.

Cheers

@besnikh I think this is the reason for the memory leak issue in vue-router:

Do you use beforeRouteEnter in your component? That path can get into an infinite loop in the poll function in vue-router.common.js.

I found (a year ago) that nuxt-links or router-links, when there are too many of them on the page (I had 200), produce that leak. Changing them to anchors made the issue go away. But now I'm also using Vuetify and the latest Nuxt, and I see no problems: long uptime without restarts.

@ryouaki I use Vuetify, which uses vue-router, but I do not use beforeRouteEnter. I use an auth middleware; I guess that's like beforeRouteEnter?
I do not use an official auth middleware, but something I created using Firebase auth.
Do you think this might be the problem?

@AndruxaSazonov I am also using the latest Nuxt and the latest Vuetify, but my instances' memory usage still goes all the way up and then down. It's maybe not an issue now, but it will be once I get a lot of traffic :(

Are you still using anchors? Btw, I also have a lot of links, about 100...

I really don't know what to do :(

@besnikh Yes, it is the problem with beforeRouteEnter.

When vue-router finds that the user has added a beforeRouteEnter function on an async component, it invokes the poll function and sets a timer callback every 16 ms until the async component loads successfully. Sometimes the async component fails to load, and then the poll cannot be broken out of. I am not sure why this happens.
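For clarity, the pattern being described is roughly this (a sketch): an in-component guard, on a page that is loaded as an async chunk, which passes a callback to next():

export default {
  beforeRouteEnter (to, from, next) {
    // a callback passed to next() needs the component instance; for async
    // components vue-router polls on a ~16 ms timer until the instance
    // exists, and if the chunk never loads the poll reportedly never stops
    next(vm => {
      vm.$emit('route-entered', from.fullPath) // hypothetical use of the instance
    })
  }
}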

I am investigating.

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
