Sails: Memory leak, or memory keeps increasing

Created on 21 Jul 2015 · 83 comments · Source: balderdashy/sails

I have created a new Sails.js app and did not change any code in it.

Then I used pm2 to launch two instances of that app. Memory keeps increasing without hitting the app even once.

I have done the same with a basic Express.js app using the following code, but its memory does not increase:

"
var express = require('express');
var app = express();

app.get('/', function (req, res) {
res.send('Hello World!');
});

var server = app.listen(3000, function () {
var host = server.address().address;
var port = server.address().port;

console.log('Example app listening at http://%s:%s', host, port);
});
"

screenshot from 2015-07-21 14 01 13
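To make comparisons like this less dependent on eyeballing pm2's display, it helps to log heap numbers from inside each process. A minimal sampler (generic Node.js, nothing Sails-specific assumed):

```javascript
// Minimal heap sampler: call it periodically and compare readings over hours.
function sampleMemory() {
  var m = process.memoryUsage();
  return 'rss=' + (m.rss / 1048576).toFixed(1) + 'MB' +
         ' heapUsed=' + (m.heapUsed / 1048576).toFixed(1) + 'MB';
}

console.log(sampleMemory());
// e.g. log a reading every 30 seconds:
// setInterval(function () { console.log(new Date().toISOString(), sampleMemory()); }, 30000);
```

A steadily climbing `heapUsed` across samples points at a JS-level leak; a climbing `rss` with flat `heapUsed` points more at buffers, native memory, or GC behavior.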

question

All 83 comments

Interesting. Is it running migrations?

It is not doing anything. All I did was:

```shell
sails new demo
cd demo
pm2 start app.js -i 2 --name "sailsApp"
```

And now it has increased further; I have not touched or used the app at all. Why would the memory increase?
screenshot from 2015-07-21 14 45 10

And memory has increased to:

screenshot from 2015-07-21 17 44 16

Without using it at all.

Can someone please help?

Approximately how long does it take for the memory to grow to that level? Are we talking minutes, hours, or days?

Does this only happen when in cluster mode? If not, what happens if you start by command line without pm2?

The levels above took about 2-3 hours to grow, but if you keep watching, you can see the memory growing by megabytes.

And following is the picture after leaving it overnight.

screenshot from 2015-07-22 09 11 44

I will now test forked mode to compare whether I get the same results.

And I will also run it without pm2.

@shumailarshad Thank you for going the extra mile! This is important information.

After testing fork mode for 3 hours, I found that the problem occurs only in cluster mode. It must be some problem with Sails.js, as Express works with pm2 in cluster mode absolutely fine.

screenshot from 2015-07-22 11 55 11

Maybe the grunt hook goes crazy in cluster mode? Try starting the Sails app in cluster mode with the grunt hook disabled in the .sailsrc file, and take a look.

@Josebaseba read my mind, I was wondering the same thing. :+1:

How do I disable the grunt hook? Can you please help? Or maybe show how to disable the Sails modules which I will not be using at all?

Change the .sailsrc file to:

```json
{
  "hooks": {
    "grunt": false
  }
}
```

I have disabled grunt and tried again, but it does not help.

screenshot from 2015-07-23 11 18 13

This is after running the app for about an hour. The memory used keeps increasing.

That's an important bug then, but I don't have any idea what the reason could be. It looks like an infinite loop; maybe we could get more info by starting the app in silly or verbose log mode and looking at the logs.

Following are the logs when I run the Sails app with --silly and pm2 cluster mode. There is some extra logging which might be the root cause. Can anyone please take a look at it?

sailsApp-out.0.log
"""""""""

verbose: Setting Node environment...
verbose: moduleloader hook loaded successfully.
verbose: Loading app config...
verbose: userconfig hook loaded successfully.
verbose: Exposing global variables... (you can disable this by modifying the properties in sails.config.globals)
verbose: Loading user hooks...
verbose: userhooks hook loaded successfully.
silly: Configured view engine, ejs
silly: Loading hook: logger
silly: Loading hook: request
silly: Loading hook: orm
silly: Loading hook: views
silly: Loading hook: blueprints
silly: Loading hook: responses
verbose: Loading runtime custom response definitions...
silly: Loading hook: controllers
silly: Loading hook: sockets
silly: Loading hook: pubsub
silly: Loading hook: policies
silly: Loading hook: services
verbose: Loading app services...
silly: Loading hook: csrf
silly: Loading hook: cors
silly: Loading hook: i18n
silly: Loading hook: session
silly: Loading hook: http
verbose: logger hook loaded successfully.
verbose: request hook loaded successfully.
verbose: Loading the app's models and adapters...
verbose: Loading app models...
verbose: Loading app adapters...
silly: Building action for view: 403
silly: Building action for view: 404
silly: Building action for view: 500
silly: Building action for view: homepage
silly: Building action for view: layout
verbose: Loading blueprint middleware...
verbose: blueprints hook loaded successfully.
verbose: responses hook loaded successfully.
verbose: controllers hook loaded successfully.
verbose: Loading policy modules from app...
verbose: Finished loading policy middleware logic.
verbose: policies hook loaded successfully.
verbose: services hook loaded successfully.
verbose: csrf hook loaded successfully.
verbose: cors hook loaded successfully.
verbose: i18n hook loaded successfully.
verbose: session hook loaded successfully.
verbose: http hook loaded successfully.
verbose: Overriding ejs engine config with ejslocals to implement layout support...
verbose: Preparing socket.io...
verbose: sockets hook loaded successfully.
verbose: views hook loaded successfully.
verbose: Setting default Express view engine to ejs...
verbose: Starting ORM...
verbose: orm hook loaded successfully.
verbose: pubsub hook loaded successfully.
verbose: Built-in hooks are ready.
verbose: Instantiating registry...
verbose: Loading router...
silly: Binding route :: all /* (REQUEST HOOK: addMixins)
silly: Binding route :: all /* (RESPONSES HOOK: addResponseMethods)
silly: Binding route :: all /* (I18N HOOK: addLocalizationMethod)
silly: Binding route :: all /* (VIEWS HOOK: addResViewMethod)
verbose: Policy-controller bindings complete!
silly: Binding route :: /* (CSRF HOOK: CSRF)
silly: Binding route :: all /* (CORS HOOK: clearHeaders)
silly: Binding route :: /csrfToken (CORS HOOK: sendHeaders)
silly: Binding route :: options /csrftoken (CORS HOOK: preflight)
silly: Binding route :: /csrfToken (FUNCTION: csrfToken)
silly: Binding route :: / (FUNCTION: rememberViewId)
silly: Binding route :: / (FUNCTION: serveView)
silly: Binding route :: get /**getcookie (SOCKETS HOOK)
silly: Binding route :: get /csrfToken (FUNCTION: csrfToken)
verbose: All hooks were loaded successfully.
verbose: Starting app at /home/shumailarshad/nodeProjects/sailsApp...
verbose: Running the setup logic in sails.config.bootstrap(cb)...
info:
info:                .-..-.
info:
info:    Sails              <|    .-..-.
info:    v0.12.0-rc3         |\
info:                       /|.\
info:                      / || \
info:                    ,'  |'  \
info:                 .-'.-==|/_--'
info:                 `--'-------'
info:    __---___--___---___--___---___--___
info:  ____---___--___---___--___---___--___-__
info:
info: Server lifted in `/home/shumailarshad/nodeProjects/sailsApp`
info: To see your app, visit http://localhost:1337
info: To shut down Sails, press <CTRL> + C at any time.

verbose: Could not fetch session, since connecting socket has no cookie (is this a cross-origin socket?)
Generated a one-time-use cookie: `sails.sid=s%3AtUAU7KVTXHSmFFNdpXU1hBAEcoS_DAjt.A%2FO91d3L2bcP2UNEVLya4mHvZkfzR5BrNl96CxIjUYs` and saved it on the socket handshake. This will start this socket off with an empty session, i.e. (req.session === {}). That "anonymous" section will only last until the socket is disconnected, unless you persist the session id in your database, or by setting the set-cookie response header for an HTTP request that you _know_ came from the same user (etc.) Alternatively, just make sure the socket sends a `cookie` header or query param when it initially connects.

verbose: Could not fetch session, since connecting socket has no cookie (is this a cross-origin socket?)
Generated a one-time-use cookie: `sails.sid=s%3Aj8uL6A3dddjggkakIYr1ep3cF7y8xT_t.uP9syaNPMiQjI33hNNiKuNP9XM8Hkpsg1CriYoSC7Qg` and saved it on the socket handshake. (Same session warning as above.)

verbose: Could not fetch session, since connecting socket has no cookie (is this a cross-origin socket?)
Generated a one-time-use cookie: `sails.sid=s%3AjeC-ps0Uf7HcBfie6rRFilDTGJUQhgfm.dMkoe%2FZe2NqJEvuqT0APaGDcS1xQF8QBbuw9xSjeVsM` and saved it on the socket handshake. (Same session warning as above.)

verbose: Could not fetch session, since connecting socket has no cookie (is this a cross-origin socket?)
Generated a one-time-use cookie: `sails.sid=s%3Afn2dDhcvQQU2OFK3AlULSSrdED0shTzc.5Wumr9U%2FuHqT2uLj%2BvGHL1A1sEQdMjccFDcV5ASoeCY` and saved it on the socket handshake.

""""""""""""""


sailsApp-out-1.log:
""""""""""""""

verbose: Setting Node environment...
verbose: moduleloader hook loaded successfully.
verbose: Loading app config...
verbose: userconfig hook loaded successfully.
verbose: Exposing global variables... (you can disable this by modifying the properties in sails.config.globals)
verbose: Loading user hooks...
verbose: userhooks hook loaded successfully.
silly: Configured view engine, ejs
silly: Loading hook: logger
silly: Loading hook: request
silly: Loading hook: orm
silly: Loading hook: views
silly: Loading hook: blueprints
silly: Loading hook: responses
verbose: Loading runtime custom response definitions...
silly: Loading hook: controllers
silly: Loading hook: sockets
silly: Loading hook: pubsub
silly: Loading hook: policies
silly: Loading hook: services
verbose: Loading app services...
silly: Loading hook: csrf
silly: Loading hook: cors
silly: Loading hook: i18n
silly: Loading hook: session
silly: Loading hook: http
verbose: logger hook loaded successfully.
verbose: request hook loaded successfully.
verbose: Loading the app's models and adapters...
verbose: Loading app models...
verbose: Loading app adapters...
silly: Building action for view: 403
silly: Building action for view: 404
silly: Building action for view: 500
silly: Building action for view: homepage
silly: Building action for view: layout
verbose: Loading blueprint middleware...
verbose: blueprints hook loaded successfully.
verbose: responses hook loaded successfully.
verbose: controllers hook loaded successfully.
verbose: Loading policy modules from app...
verbose: Finished loading policy middleware logic.
verbose: policies hook loaded successfully.
verbose: services hook loaded successfully.
verbose: csrf hook loaded successfully.
verbose: cors hook loaded successfully.
verbose: i18n hook loaded successfully.
verbose: session hook loaded successfully.
verbose: http hook loaded successfully.
verbose: Overriding ejs engine config with ejslocals to implement layout support...
verbose: Preparing socket.io...
verbose: sockets hook loaded successfully.
verbose: views hook loaded successfully.
verbose: Setting default Express view engine to ejs...
verbose: Starting ORM...
verbose: orm hook loaded successfully.
verbose: pubsub hook loaded successfully.
verbose: Built-in hooks are ready.
verbose: Instantiating registry...
verbose: Loading router...
silly: Binding route :: all /* (REQUEST HOOK: addMixins)
silly: Binding route :: all /* (RESPONSES HOOK: addResponseMethods)
silly: Binding route :: all /* (I18N HOOK: addLocalizationMethod)
silly: Binding route :: all /* (VIEWS HOOK: addResViewMethod)
verbose: Policy-controller bindings complete!
silly: Binding route :: /* (CSRF HOOK: CSRF)
silly: Binding route :: all /* (CORS HOOK: clearHeaders)
silly: Binding route :: /csrfToken (CORS HOOK: sendHeaders)
silly: Binding route :: options /csrftoken (CORS HOOK: preflight)
silly: Binding route :: /csrfToken (FUNCTION: csrfToken)
silly: Binding route :: / (FUNCTION: rememberViewId)
silly: Binding route :: / (FUNCTION: serveView)
silly: Binding route :: get /**getcookie (SOCKETS HOOK)
silly: Binding route :: get /csrfToken (FUNCTION: csrfToken)
verbose: All hooks were loaded successfully.
verbose: Starting app at /home/shumailarshad/nodeProjects/sailsApp...
verbose: Running the setup logic in sails.config.bootstrap(cb)...
info:
info:                .-..-.
info:
info:    Sails              <|    .-..-.
info:    v0.12.0-rc3         |\
info:                       /|.\
info:                      / || \
info:                    ,'  |'  \
info:                 .-'.-==|/_--'
info:                 `--'-------'
info:    __---___--___---___--___---___--___
info:  ____---___--___---___--___---___--___-__
info:
info: Server lifted in `/home/shumailarshad/nodeProjects/sailsApp`
info: To see your app, visit http://localhost:1337
info: To shut down Sails, press <CTRL> + C at any time.

verbose: Could not fetch session, since connecting socket has no cookie (is this a cross-origin socket?)
Generated a one-time-use cookie: `sails.sid=s%3AhZ37GCPEKTcQX_yWauJWIZdT-6ifP4-K.BV5Ae2aYePNF3xYWFyETAGYaQQDlOhG8%2BYLW0y7tfwQ` and saved it on the socket handshake. This will start this socket off with an empty session, i.e. (req.session === {}). That "anonymous" section will only last until the socket is disconnected, unless you persist the session id in your database, or by setting the set-cookie response header for an HTTP request that you _know_ came from the same user (etc.) Alternatively, just make sure the socket sends a `cookie` header or query param when it initially connects.

verbose: Could not fetch session, since connecting socket has no cookie (is this a cross-origin socket?)
Generated a one-time-use cookie: `sails.sid=s%3AfUrwmeyu-0OQwxytJfZFDhXLOZ4dJOzE.krmwG1mMKkj9Bwb90QyQJc%2FkNqhuKMJI3UHZaEuwdh8` and saved it on the socket handshake. (Same session warning as above.)

verbose: Could not fetch session, since connecting socket has no cookie (is this a cross-origin socket?)
Generated a one-time-use cookie: `sails.sid=s%3AzYFy363qGzLWc2VQvySDuRLCD__V0zpn.fkaSgJYvhBEIQuoZy%2F4ToP6bKK19qu%2BWA8Xnvzm8lN4` and saved it on the socket handshake. (Same session warning as above.)

verbose: Could not fetch session, since connecting socket has no cookie (is this a cross-origin socket?)
Generated a one-time-use cookie: `sails.sid=s%3AjF2Ze-C0DmsfyO2hhtEdMcxbkXhsFyIS.yJInQlHK7%2BF41uITREYvit1hPzr06ec3F%2B5SbrgCb9A` and saved it on the socket handshake.

"""""""""""""

Looks like something is opening a websocket connection over and over. Wondering if:

a) a websocket client is attempting to connect; or
b) PM2 is trying to connect for some reason (less likely); or
c) Sails/Socket.io is interpreting the cluster connections as websocket connections (is this even possible?)

You said you're not even hitting the app, so that should rule out a) above.

Given the high number of socket connections, the memory usage is probably coming from using sails-memory as the default socket/session adapter, so all this data is being stuffed into RAM rather than into a data store of some kind.

Anyone else have a hunch on what could be going on?
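For reference, moving sessions out of process memory in a 0.11/0.12-era Sails app meant pointing the session hook at Redis. A sketch along the lines of the docs of that era (the host/port/ttl values below are illustrative, not canonical):

```javascript
// config/session.js (sketch; option names follow the connect-redis
// conventions Sails used at the time -- treat the specifics as illustrative)
module.exports.session = {
  adapter: 'redis',      // back sessions with Redis instead of the in-memory store
  host: 'localhost',     // assumed Redis host
  port: 6379,
  ttl: 24 * 60 * 60,     // expire sessions after a day so they can't pile up forever
  prefix: 'sess:'
};
```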

It seems like the socket sessions are being saved over and over in a loop; I don't know why. Try disabling the sockets this time and check the memory (in the .sailsrc file again):

```json
{
  "hooks": {
    "sockets": false,
    "pubsub": false
  }
}
```

Right, after applying the above change, it is stable.

screenshot from 2015-07-23 17 04 26

Now we need to know how to fix this issue. What's happening?

@shumailarshad do you have a browser window open, pointed at http://localhost:1337? The default sails.io.js library attempts to reconnect sockets after Sails is lowered and re-lifted, which would account for your logs.

@sgress454 but 4GB of data from empty sessions is too much for just a browser connecting and disconnecting for a few hours... And it only happens in cluster mode; I can't figure out what's going on there.

Not sure if this is the solution, but I found this thread over at PM2: https://github.com/Unitech/pm2/issues/81

Using the memory adapter for socket sessions when clustered could cause issues, since your session would exist in memory of one process but not the others. Obviously you'd want to use a data store like Redis to solve this, but in @shumailarshad's case, he hasn't gotten far enough to play around with sockets and may not even want them yet.

He did say that he wasn't even hitting the URL, so if that's true then the sails.io.js file should not even be loaded and trying to connect. 4GB is still insane, though, but I wanted to mention the above issue in case it has something to do with this.

Actually, Sails apps using the latest version of sails-hook-sockets do make their own socket connection on the backend, so that in a clustered situation all of the Sails apps in the cluster can communicate over a message bus (see https://github.com/balderdashy/sails/issues/3008#issuecomment-111640710). So if the log only has one of those "Could not fetch session" messages per app in the cluster, that's expected. Of course it still doesn't explain the memory leak (or its growth over time).

I've done a few tests with Express, Sails and a basic http server: https://github.com/soyuka/nodejs-http-memtest. Sails seems to be leaking:

@soyuka take a look here: https://github.com/balderdashy/sails/issues/2779. You probably have that memory increase because of the session storage; Sails saves it in RAM. To fix it, disable the sessions or just use an adapter such as redis-session.

That's not the problem; the problem is that the memory increases in cluster mode with the sockets/pub-sub hooks. That's the weird part.

@Josebaseba what do you mean by _session storage_? Can you quote docs or some code, please?
I'm not familiar with Sails.js at all.
Anyway, I don't see why Sails.js would require almost 1.6GB of RAM just from spamming a single http route.

@soyuka take a look at this specific comment: https://github.com/balderdashy/sails/issues/2779#issuecomment-91488569

@Josebaseba I have read this comment but can't find docs on how I could disable sessions. I noticed the config/session but I don't want to use any adapter.

http://stackoverflow.com/questions/28015873/disable-some-built-in-functionality-in-sails-js/28017720#28017720 — try this, but disable just the session hook, and then tell us your results. Thanks!

OK, that's fine now.

Never mind, it's not the issue here; I was just checking whether it was a pm2 leak or not :).

Good luck with this one!

@Josebaseba thanks for your help on this, definitely an important thing to keep in mind when doing stress tests!

@sgress454 maybe someone could document this?

@soyuka you can send a pull request adding the info here: https://github.com/balderdashy/sails-docs/blob/master/concepts/Deployment/FAQ.md#performance-benchmarks

Does anybody know if pm2 does some sort of heartbeat or ping? If so, that could account for the memory growth, as each ping would create a new session.

As far as I know it's just forking a cluster and catching exit events to restart.
@jshkurti thoughts?

For the record, I had to change these two things to get rid of sessions:

```javascript
// .sailsrc
"hooks": {
  "session": false
}
```

```javascript
// config/http.js
module.exports.http = {
  middleware: {
    order: [
      'startRequestTimer',
      'cookieParser',
      // 'session',
      // etc.
    ],
  },
}
```

By heartbeat/ping, do you mean hitting http://localhost:1337 to check if it is online?

@jshkurti right

Then no, there is no such thing.

> Anybody know if pm2 does some sort of heartbeat or ping?

Yes it does: it uses a heartbeat mechanism of its own design. It just checks the process, though, and doesn't hit the web server.

Does anybody have a solution for this? I have the same issue with [email protected], [email protected], [email protected] (for session storage and socket storage), [email protected], cluster mode.
Memory usage increases by a few KB every second without ever hitting the app.

@sgress454 I found the origin of the leak.

Look at this line in the sockets hook: https://github.com/balderdashy/sails-hook-sockets/blob/master/lib/connect-sails-client.js#L30

When you lift Sails with PM2 in cluster mode, the connection fails and the `reconnect_attempt` and `reconnecting` events are triggered in an infinite loop:


```javascript
var socket = socketIOClient("http://localhost:" + app.config.port + "/sails", {multiplex: false});

// These two events log in a loop all the time, increasing the memory usage really fast...
socket.on('reconnect_attempt', function(err){
  console.log('reconnect_attempt!', err);
});

socket.on('reconnecting', function(err){
  console.log('reconnecting!', err);
});
```

Setting `reconnection: false` stops the memory leak, but that's not a solution.


```javascript
var socket = socketIOClient("http://localhost:" + app.config.port + "/sails", {multiplex: false, reconnection: false});
```
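Short of `reconnection: false`, socket.io-client also exposes backoff settings (`reconnectionDelay`, `reconnectionDelayMax`, `reconnectionAttempts`) that would at least bound the retry churn. The schedule those settings produce is roughly exponential with a cap, which can be sketched like this (simplified; the real client also adds randomized jitter):

```javascript
// Simplified exponential-backoff schedule: the delay doubles each
// attempt and is capped at `max` (all values in milliseconds).
function backoffDelays(base, max, attempts) {
  var delays = [];
  for (var i = 0; i < attempts; i++) {
    delays.push(Math.min(base * Math.pow(2, i), max));
  }
  return delays;
}

console.log(backoffDelays(1000, 30000, 6));
// [ 1000, 2000, 4000, 8000, 16000, 30000 ]
```

A bounded, slowing retry schedule would not fix the underlying failed connection, but it keeps the event spam (and whatever it allocates) from growing linearly with uptime.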

@Josebaseba thanks! Even with those events being continuously triggered, there shouldn't be a memory leak, but this is a huge help.

@sgress454 yes, I guess that the memory leak happens when it tries to reconnect over and over again.

For more detail, here is the error log:

```
{ [Error: xhr post error] type: 'TransportError', description: 400 }
```

Edit:

If you disable the 'polling' option in the transports array, the leak is fixed. The defaults should look this way:

```javascript
// config/sockets.js; make the same change in the sails.io.js file (line 952).
// Instead of ['polling', 'websocket']:
transports: ['websocket']
```

For now, that could serve as a hot fix.

Ah, I see. I'm able to reproduce the memory leak, but only when the transports list in config/sockets.js is locked to just "websocket". This is something we suggest in the scaling doc to deal with environments that don't implement sticky sessions. The solution is just to make sure that when the "admin socket" connects, it uses the same transports config as the app does.

Try that patch on for size: it stops the connection errors and plugs the leak for me.

Now, it still doesn't address _why_ the connection errors result in a memory leak, but one thing at a time!

PM2 doesn't support sticky sessions yet (https://github.com/Unitech/pm2/issues/389), and neither does Heroku. So to avoid the memory leak caused by the failed-connection loop, we have to set the transports to 'websocket' and use socket.io-redis. Check the docs for more detail: http://sailsjs.org/documentation/concepts/deployment/scaling

session

That's my conclusion to this issue, if I'm wrong please correct me!

@Josebaseba The patch I made to sails-hook-sockets should fix the issue with failed connections (and thus plug the memory leak) regardless of the transports setting in your Sails app. The bigger picture is that because PM2 doesn't support sticky sessions, _your app won't work as expected_ unless you set transports to 'websocket', as described in the scaling doc. That's the reason to update the transports setting.

If something else were to cause a problem with the socket connection, we'd still see this issue, so it's still worth looking into why the connection retries spawn a memory leak. It may have something to do with sessions being created, although on my test app I was using MongoDB as the session store and I still saw memory growth.

I am experiencing a memory leak on my production application, and my application does use websockets.

Can this in theory be the root cause of my memory leak?

I know I'm not providing much information, please let me know what else I can provide.

Are you using Redis for sockets and sessions? You have to use it for both, or the sessions will be saved in your process memory. If you are using PM2, remember to disable the 'polling' option in transports, leaving only the 'websocket' option.

With that configuration you won't have any problem.

I do use Redis for sessions, but not for sockets. Should I enable Redis for sockets as well?

Also I am not using PM2. Thanks

I have two Sails apps in production, both running in cluster mode with PM2 and both using the mongo adapter for session storage (not sails-mongo, because as far as I'm aware sails-mongo doesn't work for session storage).

One of the apps has the memory leak issue: I'm seeing lots of failing requests to /socket.io. I don't use sockets, though, and have them disabled on both apps, yet I only see this issue on one of them, which is very strange. The app I'm seeing it on was upgraded from Sails v0.10.5, whereas the one that doesn't have the problem was a fresh install of v0.11. I've just upgraded connect-mongo, as that was the only difference I could see between the two apps, but that hasn't made a difference.

I don't understand why I'm still seeing requests to /socket.io when I have disabled sockets in my .sailsrc:

```json
"hooks": {
  "sockets": false,
  "pubsub": false
},
```

Is there anything else I should be checking?

@gilesbutler The requests are probably coming from clients who still have the sails.io.js script included in their HTML. If you disable sockets on the back end, you'll want to remove the connection logic on the front end as well! By default this is included in your app's layout.ejs file via the Grunt task, so you can fix the issue by just removing the sails.io.js file from your assets/js folder (or assets/js/dependencies, depending on the version), and/or removing its reference from the tasks/pipeline.js file that Grunt uses to determine which assets to automatically inject into your HTML.
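Concretely, the Grunt-side change looks something like this (paths per a default 0.11/0.12-era app layout; treat this as a sketch of tasks/pipeline.js, not the exact file):

```javascript
// tasks/pipeline.js -- the JS files Grunt injects into layout.ejs, in order
var jsFilesToInject = [
  // 'js/dependencies/sails.io.js',  // removed: sockets are disabled on the back end too
  'js/dependencies/**/*.js',
  'js/**/*.js'
];
```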

@sgress454 thanks for that buddy, I thought I'd gotten rid of sails.io.js but you were right, it was still in tasks/pipeline.js :+1:

Hi, in my case we are having the same problem, but we do need to use sockets. For us, for now, it happens even in a single instance. Any ideas for solving this temporarily?

I just realised that I didn't restart my app in cluster mode, but only as a single instance. When I restart it using pm2 in cluster mode, one of the processes calls /socket.io even with sails.io.js removed and the hooks disabled, which is very strange behaviour.

We're running into the same issue after recently upgrading our Sails server. As part of this upgrade we've already made changes to accommodate the suggestions above, including:

  • Moving sockets into Redis instead of memory.
  • Disabling the polling transport method in both config/sockets.js and the sails.io.js file.

However, we're experiencing a slow memory leak that after ~8 hours requires our Heroku app to be restarted. You can see from the screenshot below that there is no corresponding increase in requests, however. In our case we cannot discontinue use of sockets entirely, so I'm mostly following this issue for visibility towards a fix, and I will contribute if I find anything in my own debugging attempts.

screen shot 2015-09-08 at 8 52 43 am

Hi @Ignigena! We are also using Heroku. In your case, are you using Heroku Redis? We have been doing tests with pm2 and Keymetrics, and it seems that Sails is not what is driving up the consumption. Thanks!

@AlejandroJL we're using a hosted Redis instance through Compose.io; I haven't had a chance to do very in-depth testing yet, so if you have any pointers on where to start based on your own findings, that would be appreciated!

I had ugly memory leaks with my sails app; upgrading to node 4.0.0 fixed everything!

Go go go go! Thanks @jacqueslareau !

WOW!

captura de pantalla 2015-09-11 a las 16 01 34

More info:

nodejs 400

Here's mine. Still closely monitoring memory. Still going up a bit but seems better.

omega sails memory

@jacqueslareau @AlejandroJL I know this is a stupid question, but when you say that upgrading Node fixed the problem, you are taking into account the fact that restarting your Sails app will drop the memory usage to zero (or, more accurately, to the baseline amount of memory usage), right? In both of your graphs, the memory starts going up again.

Hi @sgress454. The difference in my case is that now the memory is being freed little by little. Before, it would climb to quite large amounts even with very reduced traffic (we're in a testing phase). It's clear that this is not the solution, but at least an improvement can be seen. Now we need to watch the behaviour of Node.js 4.0.0 over time. My chart does take the reboot into account, and you can see RAM being freed bit by bit.

New graphs:

captura de pantalla 2015-09-11 a las 18 33 27

@sgress454 I took that into account, yes. Before 4.0.0, it didn't take long before the memory got consumed. See this graph:

omega sails memory-more

While it's true that I may have been over-excited, it still is a major improvement. We'll keep you posted.

@AlejandroJL @jacqueslareau I ran the same app on node 0.10.x with no memory leaks, then ran it on 0.12.x and saw the leak. I haven't tried it on io.js or node 4.0 yet.

@particlebanana fwiw my experience is identical. Previously my app was running on 0.10 and never exceeded about 200MB or so of memory. Once I upgraded my dyno to 0.12, the leak started. I've been able to mitigate this somewhat on Heroku by using the --max_old_space_size flag, since I noticed all my heap dumps were similarly sized at only 40-50MB.

Going to attempt a jump to 4.0 and see if this changes things...

Hey @particlebanana, we are currently experimenting with 4.0 and we see improvements. In the previous comments there are graphs with Node.js 4.0.

Yes, I have experienced the memory leak with sails on node 0.12.7, both before and after turning off hooks. Maybe I will try Node.js 4.0. I hope someone can solve this issue. Thanks.

> Once upgrading my dyno to 0.12 the leak started

The explanation is as simple as you'd think it is: node 0.12 simply uses more RAM and doesn't garbage collect as aggressively.

Out of curiosity, how are you folks getting sails to run with v4.0? Is there a workaround for the seg faults happening with socket.io?

I don't think nodejs 4.0 will solve the memory problem, having tried it.

@kennysmoothx I'm not getting the seg faults on v4, either locally or deployed on Heroku. I did have to make some updates to the .travis.yml to make the build run smoothly there, but otherwise I've had no issues.

@tjwebb I do think my particular issue (and potentially others' too) may be related to garbage collection in Node. I was tipped off to this because my heap dumps were nearly identical in size both at deployment and once memory was pegged out several hours later. I had also already done the suggested steps of moving sessions and sockets off of in-process memory storage.

Node v4 wasn't a magic-bullet fix for me. My solution has been to set the --max_old_space_size flag to around 50-60% of available memory (in MB) on the deployed instance. I'm still seeing shark teeth in the memory graphs, but I've at least stopped exceeding memory and experiencing the resulting performance degradation in production.
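For anyone trying the same mitigation: the flag is passed to node itself, or through pm2's node-args (a sketch; the 512 MB value is illustrative, size it to your instance):

```shell
# Cap V8's old-space heap (value in MB); the -e one-liner just verifies the flag is accepted.
node --max_old_space_size=512 -e "console.log('heap cap set')"

# When launching through pm2:
# pm2 start app.js --node-args="--max_old_space_size=512"
```

A lower old-space cap makes V8 run full GCs sooner, which trades some CPU for a bounded resident set.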

Here is a follow-up after 5 days. I observe the same thing on my other sails apps.

omega sails memory-suite

@jacqueslareau @Ignigena thanks for the follow-up, and please keep us posted. Keeping a close eye on this.

Following up on this: are we prepared to chalk this up to a Node issue and close, or should we still be looking at a possible issue in Sails? @jacqueslareau @Ignigena, have things continued to be stable?

Still no more memory leaks for me, but my observations are anecdotal at best. Maybe something else I didn't see played a role in my leaks, but I highly doubt it. It would require more debugging and tests.

OK, closing for now. Thanks again for everyone's hard work on this.

@sgress454 :+1:

Hi all,

I was facing this issue on my staging and production servers. Previously I was using node v0.12.4 with Passenger for deployment, and on a fresh restart the memory consumption went from 13-14% to 43% without any requests; using loadtest just to hit the server URL consumed even more memory, which I have yet to resolve (maybe it's the sockets and sessions using RAM for storage). We then migrated to node v4.0.24, which had a better GC policy and cleared off unused memory better than v0.12.4, and I switched from Passenger to PM2, which reduced memory consumption further. Now on a fresh start-up the consumption is just about 9-10%, and under full load it may increase to 14-18%, but not more than that.

If there is an issue with sockets or sessions using RAM, we would like to know about an alternative solution so that we can optimize our server more.

@kailashyogeshwar85 Howdy. Please open a new issue. Thanks!

Best, Irl

@kailashyogeshwar85 check out https://github.com/balderdashy/sails/issues/3638#issuecomment-198583303 for more exploration of your question about sessions.

More generally, I wrote up some more info/tips/best practices for diagnosing suspected memory leaks in a Node/Sails application here. If you notice issues after running through those troubleshooting steps, please open a new issue to make sure we see it, and we'll look into it ASAP.

Thanks again!

An update with more info and tips:
https://github.com/balderdashy/sails/issues/3782#issuecomment-238679052
