I'm using the latest version of tfjs-node on npm:

```json
{
  "peerDependencies": {
    "@tensorflow/tfjs-core": "^2.4.0"
  },
  "dependencies": {
    "@tensorflow/tfjs-converter": "^2.4.0",
    "@tensorflow/tfjs-node": "^2.4.0"
  }
}
```
I get this error when loading a saved model with `loadSavedModel` and running `predict`:

```
TypeError: Cannot read property 'backend' of undefined
    at Engine.moveData (/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:3280:31)
    at DataStorage.get (/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:115:28)
    at NodeJSKernelBackend.getInputTensorIds (/node_modules/@tensorflow/tfjs-node/dist/nodejs_kernel_backend.js:153:43)
    at NodeJSKernelBackend.getMappedInputTensorIds (/node_modules/@tensorflow/tfjs-node/dist/nodejs_kernel_backend.js:1487:30)
    at NodeJSKernelBackend.runSavedModel (/node_modules/@tensorflow/tfjs-node/dist/nodejs_kernel_backend.js:1506:66)
    at TFSavedModel.predict (/node_modules/@tensorflow/tfjs-node/dist/saved_model.js:362:52)
    at /lib/tests/models/audio.js:44:22
```
```js
const tf = require('@tensorflow/tfjs-node');

(function () {
  const modelPath = '/root/saved_model/';
  // load model
  tf.node.loadSavedModel(modelPath)
    .then(model => {
      // it holds a waveform of an audio file
      const data = require('fs').readFileSync('/root/test.json');
      const waveform = JSON.parse(data).data;
      const inputTensor = tf.tensor2d(waveform, [waveform.length, 1], 'float32');
      const inputs = {
        audio_id: '',
        mix_spectrogram: null,
        mix_stft: null,
        waveform: inputTensor
      };
      return model.predict(inputs);
    })
    .then(output => {
      console.dir(output, { depth: null, maxArrayLength: null });
    })
    .catch(error => {
      console.error(error);
    });

  // load model metadata
  tf.node.getMetaGraphsFromSavedModel(modelPath)
    .then(modelInfo => {
      console.dir(modelInfo[0].signatureDefs.serving_default.outputs, { depth: null, maxArrayLength: null });
      console.dir(modelInfo[0].signatureDefs.serving_default.inputs, { depth: null, maxArrayLength: null });
    })
    .catch(error => {
      console.error(error);
    });
}).call(this);
```
I see you are using version 2.4. Can you try the latest version, 2.7, and let us know?
@rthadur thanks, I have updated to

```json
{
  "peerDependencies": {
    "@tensorflow/tfjs-core": "^2.7.0"
  },
  "dependencies": {
    "@tensorflow/tfjs-converter": "^2.7.0",
    "@tensorflow/tfjs-node": "^2.7.0"
  }
}
```

and I get exactly the same error. I have also verified that the installed version was 2.7.0:

Can you please check if this issue is related.
cc @tafsiri
Not sure it's related, because I'm not consuming the data (as in that example), so there should be no problem yielding those tensor data; but that's just my intuition and it may not be correct. In the meantime I dug a bit into the offending code here:
```js
Engine.prototype.moveData = function (backend, dataId) {
    console.log(backend, dataId);
    var info = this.state.tensorInfo.get(dataId);
    var srcBackend = info.backend; // <--- line 3280, TypeError: Cannot read property 'backend' of undefined
    var values = this.readSync(dataId);
```
and dumped the backend and dataId:
```
NodeJSKernelBackend {
  binding: {},
  isGPUPackage: false,
  isUsingGpuDevice: false,
  tensorMap: DataStorage {
    backend: [Circular],
    dataMover: Engine {
      ENV: [Environment],
      registry: [Object],
      registryFactory: [Object],
      pendingBackendInitId: 0,
      state: [EngineState],
      backendName: 'tensorflow',
      backendInstance: [Circular],
      profiler: [Profiler]
    },
    data: WeakMap { <items unknown> },
    dataIdsCount: 1
  }
} undefined
```
Hope this helps!
More help. Looking at the model metadata from `getMetaGraphsFromSavedModel`, `modelInfo[0].signatureDefs.serving_default.inputs` shows the following shape for the `waveform` input:
```
{
  dtype: 'float32',
  tfDtype: 'DT_FLOAT',
  name: 'Placeholder:0',
  shape: [
    {
      wrappers_: null,
      messageId_: undefined,
      arrayIndexOffset_: -1,
      array: [ -1 ],
      pivot_: 1.7976931348623157e+308,
      convertedPrimitiveFields_: {}
    },
    {
      wrappers_: null,
      messageId_: undefined,
      arrayIndexOffset_: -1,
      array: [ 2 ],
      pivot_: 1.7976931348623157e+308,
      convertedPrimitiveFields_: {}
    }
  ]
}
```
The two issues are related, though there may be different paths that produce the same behaviour. @loretoparisi, are you able to share the model you are using with us?
@tafsiri Yes you can download the saved model from here
Hi @tafsiri, do you need any additional info to check this issue? Thanks a lot!
Not yet, just got back from a long holiday in the US so will be able to take a closer look this week.
So the cause of the error you are seeing is that you are passing null values as inputs to the model. Your code has:

```js
const inputs = {
  audio_id: '',
  mix_spectrogram: null,
  mix_stft: null,
  waveform: inputTensor
};
```
All of those inputs need to be tensors. Here is the code I used to get past that point (just creating some random data):

```js
tf.node.loadSavedModel(modelPath, ['serve'], 'serving_default')
  .then(model => {
    const inputs = {
      audio_id: tf.tensor(['id']),
      mix_spectrogram: tf.randomNormal([2, 512, 1024, 2]),
      mix_stft: tf.randomNormal([2, 2049, 2]),
      waveform: tf.randomNormal([2, 2])
    };
    return model.predict(inputs);
  })
  .then(output => {
    console.dir(output, { depth: null, maxArrayLength: null });
  })
  .catch(error => {
    console.error(error);
  });
```
Beyond that I still ran into an issue: `Error: Session fail to run with error: Placeholder_1:0 is both fed and fetched.` Looking at some of the signature def info (see below), I noticed that `audio_id` is both an input and an output, and refers to the same placeholder. From what I can tell this isn't allowed; it appears one should at least call `tf.identity` on the input placeholder if you are trying to pass it through. Have you been able to execute this model successfully in Python (or do you otherwise know that it works in Python)?
```
inputs {
  audio_id: {
    dtype: 'string',
    tfDtype: 'DT_STRING',
    name: 'Placeholder_1:0',
    shape: []
  },
  mix_spectrogram: {
    dtype: 'float32',
    tfDtype: 'DT_FLOAT',
    name: 'strided_slice_3:0',
    shape: [ [Object], [Object], [Object], [Object] ]
  },
  mix_stft: {
    dtype: 'complex64',
    tfDtype: 'DT_COMPLEX64',
    name: 'transpose_1:0',
    shape: [ [Object], [Object], [Object] ]
  },
  waveform: {
    dtype: 'float32',
    tfDtype: 'DT_FLOAT',
    name: 'Placeholder:0',
    shape: [ [Object], [Object] ]
  }
}
outputs {
  accompaniment: {
    dtype: 'float32',
    tfDtype: 'DT_FLOAT',
    name: 'strided_slice_23:0',
    shape: [ [Object], [Object] ]
  },
  audio_id: {
    dtype: 'string',
    tfDtype: 'DT_STRING',
    name: 'Placeholder_1:0',
    shape: []
  },
  vocals: {
    dtype: 'float32',
    tfDtype: 'DT_FLOAT',
    name: 'strided_slice_13:0',
    shape: [ [Object], [Object] ]
  }
}
```
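The fed-and-fetched conflict is visible directly in a signature dump like the one above: `audio_id` maps to `Placeholder_1:0` on both sides. As a sketch of the idea (a hypothetical helper, not part of the tfjs API), one can cross-check the tensor names declared in a signature's inputs against its outputs:

```javascript
// Hypothetical helper: given plain signatureDef-like objects (as dumped by
// getMetaGraphsFromSavedModel), return the tensor names that appear as both
// an input and an output, i.e. candidates for "fed and fetched" errors.
function findFedAndFetched(inputs, outputs) {
  const inputNames = new Set(Object.values(inputs).map(info => info.name));
  return Object.values(outputs)
    .map(info => info.name)
    .filter(name => inputNames.has(name));
}

// Minimal reproduction of the signature above (shapes elided).
const inputs = {
  audio_id: { dtype: 'string', name: 'Placeholder_1:0' },
  waveform: { dtype: 'float32', name: 'Placeholder:0' }
};
const outputs = {
  accompaniment: { dtype: 'float32', name: 'strided_slice_23:0' },
  audio_id: { dtype: 'string', name: 'Placeholder_1:0' },
  vocals: { dtype: 'float32', name: 'strided_slice_13:0' }
};

console.log(findFedAndFetched(inputs, outputs)); // [ 'Placeholder_1:0' ]
```

Any name flagged by such a check would need to be decoupled on the export side (e.g. via `tf.identity`, as suggested above) before the same tensor can be safely fed and fetched.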
@tafsiri thanks for your analysis. Yes, the model works fine in Python 3.7 with TensorFlow 1.15. Looking at the code, at least one of `mix_spectrogram`, `mix_stft`, or `waveform` is required in the Python version (that is the audio source). I have then tried it this way, as you did:
```js
const inputTensor = tf.tensor2d(waveform.data, [waveform.data.length, 1], 'float32');
const inputs = {
  audio_id: tf.tensor(['id']),
  mix_spectrogram: null,
  mix_stft: null,
  waveform: inputTensor
};
```

where `waveform.data` contains a wave with shape `[88175, 2]` of float32, but now I get:
```
TypeError: Cannot read property 'dataId' of null
    at NodeJSKernelBackend.getInputTensorIds (/node_modules/@tensorflow/tfjs-node/dist/nodejs_kernel_backend.js:153:58)
    at NodeJSKernelBackend.getMappedInputTensorIds (/node_modules/@tensorflow/tfjs-node/dist/nodejs_kernel_backend.js:1487:30)
    at NodeJSKernelBackend.runSavedModel (/node_modules/@tensorflow/tfjs-node/dist/nodejs_kernel_backend.js:1506:66)
    at TFSavedModel.predict (/node_modules/@tensorflow/tfjs-node/dist/saved_model.js:362:52)
```
so this time the error is here (`@tensorflow/tfjs-node/dist/nodejs_kernel_backend.js`, line 153), in the `getInputTensorIds` function of the Node.js backend:
```js
// Prepares Tensor instances for Op execution.
NodeJSKernelBackend.prototype.getInputTensorIds = function (tensors) {
    console.log(tensors);
    var ids = [];
    for (var i = 0; i < tensors.length; i++) {
        if (tensors[i] instanceof int64_tensors_1.Int64Scalar) {
            // Then `tensors[i]` is a Int64Scalar, which we currently represent
            // using an `Int32Array`.
            var value = tensors[i].valueArray;
            var id = this.binding.createTensor([], this.binding.TF_INT64, value);
            ids.push(id);
        }
        else {
            var info = this.tensorMap.get(tensors[i].dataId);
            // TODO - what about ID in this case? Handle in write()??
            if (info.values != null) {
                // Values were delayed to write into the TensorHandle. Do that before
                // Op execution and clear stored values.
                info.id =
                    this.binding.createTensor(info.shape, info.dtype, info.values);
                info.values = null;
            }
            ids.push(info.id);
        }
    }
    return ids;
};
```
In my example the `tensors` list is the following:

```
0 Tensor {
  kept: false,
  isDisposedInternal: false,
  shape: [ 1 ],
  dtype: 'string',
  size: 1,
  strides: [],
  dataId: {},
  id: 0,
  rankType: '1'
}
1 null
```

and in fact we get a null for `tensors[1]`.
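Since both stack traces come from `null` entries reaching the native backend, a small guard before `predict` would turn this into a readable error. A minimal sketch (a hypothetical helper, not part of tfjs-node):

```javascript
// Hypothetical guard: reject a feed dict that contains null/undefined entries
// before it reaches the native backend, where it would otherwise surface as
// the opaque "Cannot read property 'dataId' of null".
function assertNoNullInputs(inputs) {
  const missing = Object.entries(inputs)
    .filter(([, value]) => value == null)
    .map(([name]) => name);
  if (missing.length > 0) {
    throw new Error(
      `All model inputs must be tensors; got null/undefined for: ${missing.join(', ')}`
    );
  }
  return inputs;
}

// Example with the feed dict from above (tensors stubbed with plain objects).
try {
  assertNoNullInputs({
    audio_id: { dtype: 'string' },
    mix_spectrogram: null,
    mix_stft: null,
    waveform: { dtype: 'float32' }
  });
} catch (e) {
  console.error(e.message);
  // All model inputs must be tensors; got null/undefined for: mix_spectrogram, mix_stft
}
```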
So when I did as you suggested:

```js
const inputTensor = tf.tensor2d(waveform.data, [waveform.data.length, 1], 'float32');
// const inputTensor = tf.tensor1d(waveform, 'float32');
const inputs = {
  audio_id: tf.tensor(['id']),
  mix_spectrogram: tf.randomNormal([2, 2]),
  mix_stft: tf.randomNormal([2, 2]),
  waveform: inputTensor
};
```

I end up with exactly the same `Error: Session fail to run with error: Placeholder_1:0 is both fed and fetched.`, even with a correct input (at least for `waveform`).
A question: could this be related to how the model was saved from the original checkpoint? We used [this](https://github.com/deezer/spleeter/blob/243b3236adaf0101b3c4ffd5ad37c2c4c731b04f/spleeter/utils/estimator.py#L75) script for conversion, which makes use of `tf.estimator.Estimator.export_saved_model`:
```python
def to_predictor(estimator, directory=DEFAULT_EXPORT_DIRECTORY):
    """ Exports given estimator as predictor into the given directory
        and returns associated tf.predictor instance.

        :param estimator: Estimator to export.
        :param directory: (Optional) path to write exported model into.
    """
    def receiver():
        shape = (None, estimator.params['n_channels'])
        features = {
            'waveform': tf.compat.v1.placeholder(tf.float32, shape=shape),
            'audio_id': tf.compat.v1.placeholder(tf.string)}
        return tf.estimator.export.ServingInputReceiver(features, features)
    estimator.export_saved_model(directory, receiver)
    versions = [
        model for model in Path(directory).iterdir()
        if model.is_dir() and 'temp' not in str(model)]
    latest = str(sorted(versions)[-1])
    return predictor.from_saved_model(latest)
```
I think the error is in `tf.estimator.export.ServingInputReceiver(features, features)`: passing `features` as both arguments does not seem correct, since the signature is

```python
tf.estimator.export.ServingInputReceiver(
    features, receiver_tensors, receiver_tensors_alternatives=None
)
```
It is possible that `tf.Estimator` adds some preprocessing that lives outside of the SavedModel (and which, for example, allows you to pass just one of the 3 audio-related inputs). We don't directly support estimator models in tfjs-node; some parts of that API are not part of the C++ API used under the hood (they exist only in the Python layer). You might be able to modify `tf.estimator.export.ServingInputReceiver(features, features)` to tweak the SavedModel to just have the features you plan to use, but I don't really know if that will work/is possible.
If you are able to execute the saved model in python _without_ instantiating an estimator instance, that may suggest a path to using this model with tfjs-node.
Also going to cc @pyu10055 who may know about estimator compatibility.
@tafsiri thank you, I appreciate it! So we made a step forward by using the checkpoint directly. @shoegazerstella is trying it this way:
```python
import os
import tensorflow as tf

trained_checkpoint_prefix = 'pretrained_models/2stems/model'
export_dir = os.path.join('export_dir', '0')

graph = tf.Graph()
with tf.compat.v1.Session(graph=graph) as sess:
    loader = tf.compat.v1.train.import_meta_graph(trained_checkpoint_prefix + '.meta')
    loader.restore(sess, trained_checkpoint_prefix)
    builder = tf.compat.v1.saved_model.builder.SavedModelBuilder(export_dir)
    builder.add_meta_graph_and_variables(sess,
                                         [tf.saved_model.TRAINING, tf.saved_model.SERVING],
                                         strip_default_attrs=True)
    builder.save()
```
At this point it should be done, but we now get `Error: The SavedModel does not have signature: serving_default`:
```
2020-12-04 08:08:46.918697: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version
Error: The SavedModel does not have signature: serving_default
    at getSignatureDefEntryFromMetaGraphInfo (/node_modules/@tensorflow/tfjs-node/dist/saved_model.js:210:23)
```
Thank you!
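Note that `add_meta_graph_and_variables` without a `signature_def_map` argument writes no signatures at all, so an export like the one above genuinely contains no `serving_default`, which is what tfjs-node is reporting. One way to confirm this before calling `loadSavedModel` is to inspect the metagraph info that `getMetaGraphsFromSavedModel` returns, as done earlier in this thread. A sketch of a helper over that structure (the shape of `modelInfo` is assumed from the dumps above; this is not a tfjs API):

```javascript
// Hypothetical helper: list the signature names present in the metagraph info
// returned by tf.node.getMetaGraphsFromSavedModel, so you can check whether
// 'serving_default' exists before passing it to loadSavedModel.
function listSignatureNames(modelInfo) {
  return modelInfo.map(metaGraph => Object.keys(metaGraph.signatureDefs || {}));
}

// Mocked modelInfo for a model exported with no signature_def_map:
const withoutSignature = [{ tags: ['train', 'serve'], signatureDefs: {} }];
console.log(listSignatureNames(withoutSignature)); // [ [] ]

// Mocked modelInfo for the earlier estimator export, which did have one:
const withSignature = [{ tags: ['serve'], signatureDefs: { serving_default: {} } }];
console.log(listSignatureNames(withSignature)); // [ [ 'serving_default' ] ]
```

To fix the export itself, `add_meta_graph_and_variables` accepts a `signature_def_map` keyword argument where a `serving_default` entry can be supplied.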