ml-agents: Leaking

Created on 1 Jun 2018 · 5 comments · Source: Unity-Technologies/ml-agents

Hello everyone,

We don't know whether we're facing a leak or just doing something wrong, but hopefully it should be easy to figure out, since we're not really doing much :) .

We're training a model to track a 3D object and we're exporting a .pb file which we then rename to .bytes. So far so good.

In Start() we're initializing the graph from a TextAsset:

    graph = new TFGraph();
    graph.Import(graphModel.bytes);

And in Update() we're doing inference:

        session = new TFSession(graph);
        var runner = session.GetRunner();
        TFTensor image_tensor = null;
        if (tfSetup.ExpectsFloatBuffer)
        {
            float[] byte_image = ImageToFloatBuffer(resized);
            image_tensor = TFTensor.FromBuffer(new TFShape(1, downsampled_size.x, downsampled_size.y, 3), byte_image, 0, byte_image.Length);
        }
        else
        {
            byte[] byte_image = ImageToByteBuffer(resized);
            image_tensor = TFTensor.FromBuffer(new TFShape(1, downsampled_size.x, downsampled_size.y, 3), byte_image, 0, byte_image.Length);
        }

        runner.AddInput(graph[tfSetup.InputNodeName][0], image_tensor);
        runner.Fetch(graph[tfSetup.OutputNodeName][0]);

        var results = runner.Run();
        var result = results[0].GetValue();
        session.CloseSession();
        session.DeleteSession();

Everything works fine but leaks a bit too much (about 500 KB/s). The leaks are recorded after we call runner.Run(): no matter what else we do, if we return before .Run(), no leaks are recorded, but if we return right after session.DeleteSession() (we also have some more code afterwards), we still leak.

The Unity Profiler shows that the leaks are not in Mono.
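As a language-agnostic illustration of the pattern at play (a toy sketch of assumed mechanics, not TensorFlowSharp internals): a wrapper whose native buffer is freed only by an explicit dispose call will accumulate memory the managed GC never sees, which matches a leak that shows up outside Mono:

```python
class NativeTensor:
    """Toy stand-in for a wrapper around an unmanaged buffer."""
    live_bytes = 0  # total "native" memory currently held

    def __init__(self, nbytes):
        self.nbytes = nbytes
        self.disposed = False
        NativeTensor.live_bytes += nbytes

    def dispose(self):
        if not self.disposed:
            NativeTensor.live_bytes -= self.nbytes
            self.disposed = True

# Without dispose(): one second of frames (60 fps, ~8 KB per input image)
# strands roughly the ~500 KB/s the Unity Profiler would report.
for _ in range(60):
    NativeTensor(8_000)
leaked = NativeTensor.live_bytes
print(leaked)  # 480000

# With dispose() called each frame, the native footprint stays flat.
for _ in range(60):
    NativeTensor(8_000).dispose()
print(NativeTensor.live_bytes - leaked)  # 0
```

The 8 KB-per-frame figure here is made up for illustration; the point is only that undisposed native allocations grow linearly with frame count.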

We are using this ML-Agents package : https://github.com/Unity-Technologies/ml-agents/pull/746

The problem couldn't be caused by our trained model, could it?

Thank you for your time !!!


All 5 comments

There is a bug in TFDisposable [1], which was fixed last week [2].
Try cloning or downloading the most recent master [3], building it, and then replacing TensorFlowSharp.dll with the newly built one (this worked for me).

Hope that helps!

[1] -- https://github.com/migueldeicaza/TensorFlowSharp/issues/270
[2] -- https://gist.github.com/alex-zu/ef94a19fd7282149295fdaccb2d1302b
[3] -- https://github.com/migueldeicaza/TensorFlowSharp

Thanks for the very detailed reply. Unfortunately the patch crashes Unity, BUT we're not in a hurry, so as long as we're not doing anything wrong (and I can see the issue has been addressed), we'll just wait for 1.8.

Cheers!!!

@Ulizai You must manually dispose of all input and output tensors; TensorFlowSharp does not track them. See the lines in bold (wrapped in **) below:

        var runner = session.GetRunner();
        TFTensor image_tensor = null;
        if (tfSetup.ExpectsFloatBuffer)
        {
            float[] byte_image = ImageToFloatBuffer(resized);
            image_tensor = TFTensor.FromBuffer(new TFShape(1, downsampled_size.x, downsampled_size.y, 3), byte_image, 0, byte_image.Length);
        }
        else
        {
            byte[] byte_image = ImageToByteBuffer(resized);
            image_tensor = TFTensor.FromBuffer(new TFShape(1, downsampled_size.x, downsampled_size.y, 3), byte_image, 0, byte_image.Length);
        }

        **using (image_tensor)**
        {
            runner.AddInput(graph[tfSetup.InputNodeName][0], image_tensor);
            runner.Fetch(graph[tfSetup.OutputNodeName][0]);

            var results = runner.Run();

            **using (var resultTensor = results[0])**
            {
                var result = resultTensor.GetValue();
                ...
            }
        }
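For readers more used to Python than C#, the using blocks above play the same role as a context manager: the tensor's native memory is released deterministically when the block exits, rather than whenever the GC happens to run. A rough analogy (illustrative only, not the TensorFlowSharp API):

```python
class Tensor:
    """Toy tensor owning a 'native' buffer that must be freed explicitly."""

    def __init__(self, data):
        self.data = data
        self.freed = False

    def get_value(self):
        return self.data

    # __enter__/__exit__ make `with Tensor(...) as t:` mirror C#'s `using`.
    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        self.freed = True  # deterministic release, even if an exception occurred
        return False

with Tensor([0.1, 0.2]) as image_tensor:
    result = image_tensor.get_value()

print(image_tensor.freed)  # True: released as soon as the block ends
```

In both languages the key point is the same: release happens at a known point in the frame, so per-frame allocations cannot pile up.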

Thanks @alex-zu !!! That did it !!! No more leaking now.

