Tfjs: TF.JS on Node.js Error: Argument 'x' passed to 'slice' must be a Tensor, but got object

Created on 22 Jun 2018 · 6 Comments · Source: tensorflow/tfjs

TensorFlow.js version

"@tensorflow/tfjs": "^0.11.6",
"@tensorflow/tfjs-node": "^0.1.7"

Browser version

  • none

Node.js version

$ node --version
v8.11.2

Describe the problem or feature request

I'm trying to convert a TensorFlow.js (i.e. the tfjs package) model example to the Node.js version (i.e. the tfjs-node package).

My imports are the following:

const tf = require('@tensorflow/tfjs');
require('@tensorflow/tfjs-node');
tf.setBackend('tensorflow');

This should be enough to load tfjs and the tfjs-node bindings, with tensorflow as the default backend.
The code loads a pre-trained model stored in shards (the tfjs model format), and it's as simple as:

var fs = require('fs');
var performance = require('perf_hooks').performance;
const model_path = 'file://' + __dirname + '/model/model.json';
const model_metadata = __dirname + '/model/metadata.json';
var text = 'this is a bad day';
tf.loadModel(model_path)
    .then(model => {
        let sentimentMetadata = JSON.parse(fs.readFileSync(model_metadata));
        //console.log(sentimentMetadata);
        let indexFrom = sentimentMetadata['index_from'];
        let maxLen = sentimentMetadata['max_len'];
        let wordIndex = sentimentMetadata['word_index'];
        console.log('indexFrom = ' + indexFrom);
        console.log('maxLen = ' + maxLen);
        console.log('model_type', sentimentMetadata['model_type']);
        console.log('vocabulary_size', sentimentMetadata['vocabulary_size']);
        console.log('max_len', sentimentMetadata['max_len']);
        const inputText =
            text.trim().toLowerCase().replace(/(\.|\,|\!)/g, '').split(/\s+/g); // tokenized
        // Look up word indices.
        const inputBuffer = tf.buffer([1, maxLen], 'float32');
        for (let i = 0; i < inputText.length; ++i) {
            const word = inputText[i];
            if (typeof wordIndex[word] == 'undefined') { // TODO(cais): Deal with OOV words.
                console.log(word, wordIndex[word]);
            }
            inputBuffer.set(wordIndex[word] + indexFrom, 0, i);
        }
        const input = inputBuffer.toTensor();
        console.log(text, "\n", input);
        const beginMs = performance.now();
        const predictOut = model.predict(inputBuffer); // note: this passes the TensorBuffer, not the input Tensor built above
        const score = predictOut.dataSync()[0];
        predictOut.dispose();
        const endMs = performance.now();
        console.log({ score: score, elapsed: (endMs - beginMs) });
    })
    .catch(error => {
        console.error(error)
    })
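
Stripped of the tfjs calls, the preprocessing above (tokenize, look up word indices, offset by indexFrom, zero-pad to maxLen) can be sketched in plain Node.js. The wordIndex values and maxLen here are toy stand-ins; the real ones come from metadata.json:

```javascript
// Toy stand-ins for the values read from metadata.json.
const wordIndex = { this: 11, is: 6, a: 3, bad: 75, day: 248 };
const indexFrom = 3;
const maxLen = 10;

// Same tokenization as in the issue: strip punctuation, split on whitespace.
const text = 'this is a bad day';
const tokens = text.trim().toLowerCase().replace(/(\.|\,|\!)/g, '').split(/\s+/g);

// Build the zero-padded input row, left-aligned as in the original loop.
// An out-of-vocabulary word would yield undefined + indexFrom = NaN here,
// which is why the original code logs unknown words (OOV handling is a TODO).
const row = new Array(maxLen).fill(0);
tokens.slice(0, maxLen).forEach((word, i) => {
    row[i] = wordIndex[word] + indexFrom;
});

console.log(tokens); // [ 'this', 'is', 'a', 'bad', 'day' ]
console.log(row);    // [ 14, 9, 6, 78, 251, 0, 0, 0, 0, 0 ]
```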

I get this error while running:

Error: Argument 'x' passed to 'slice' must be a Tensor, but got object.

which suggests that my input object is not a Tensor instance, even though I can clearly see in the logs that I have

Tensor {
  isDisposedInternal: false,
  size: 100,
  shape: [ 1, 100 ],
  dtype: 'float32',
  strides: [ 100 ],
  dataId: {},
  id: 22,
  rankType: '2' }

a Tensor object instance after getting the tensor from the input buffer with const input = inputBuffer.toTensor();, which converts a TensorBuffer into a Tensor object. This seems not to work properly in Node.js even though it works in the browser, or else the type assertion check does not behave as expected in Node.js.
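
For what it's worth, the failed check can be reproduced without tfjs at all: in tfjs, TensorBuffer and Tensor are distinct classes, so handing the buffer to an op that type-checks for a Tensor fails even though the logged object above is a genuine Tensor — the log shows the result of toTensor(), while predict() in the snippet is given inputBuffer. A minimal mock (class names borrowed from tfjs; the bodies are illustrative only, not the library's implementation):

```javascript
// Mock of the two tfjs classes, just enough to show the instanceof distinction.
class Tensor {
    constructor(shape, values) {
        this.shape = shape;
        this.values = values;
    }
}

class TensorBuffer {
    constructor(shape) {
        this.shape = shape;
        this.values = new Float32Array(shape.reduce((a, b) => a * b, 1));
    }
    set(value, row, col) {
        this.values[row * this.shape[1] + col] = value;
    }
    toTensor() {
        // A new immutable view over the buffer's values, as in tfjs.
        return new Tensor(this.shape, this.values);
    }
}

const buf = new TensorBuffer([1, 4]);
buf.set(3, 0, 0);
console.log(buf.toTensor() instanceof Tensor); // true  — what the op's check expects
console.log(buf instanceof Tensor);            // false — what the snippet passes
```

Under this reading, swapping model.predict(inputBuffer) for model.predict(input) in the snippet above would hand predict() the Tensor its type check expects.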

Code to reproduce the bug / link to feature request

Full code to reproduce the error: https://github.com/loretoparisi/tensorflow-node-examples/blob/master/sentiment/sentiment.js

Most helpful comment

Yes, you can use @tensorflow/tfjs-node-gpu!

All 6 comments

cc @caisq

According to a TF.js comment on the forum here, I have updated the example code to use "@tensorflow/tfjs-node": "0.1.6", because Daniel found a bug in tfjs-node 0.1.7 - https://github.com/tensorflow/tfjs/issues/471
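
The pinned versions described above would look like this in package.json (a config sketch; exact versions follow the comment above):

```json
{
  "dependencies": {
    "@tensorflow/tfjs": "0.11.6",
    "@tensorflow/tfjs-node": "0.1.6"
  }
}
```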

I have then fixed model.predict to take the Tensor returned by toTensor() instead of the TensorBuffer object.
With this change the code no longer hits this specific issue, but I'm getting another warning:

$ node sentiment.js 
2018-06-27 11:01:25.185649: I tensorflow/core/platform/cpu_feature_guard.cc:140] Your CPU supports instructions that this TensorFlow binary was not compiled to use: SSE4.2 AVX AVX2 FMA
(node:76671) Warning: N-API is an experimental feature and could change at any time.

============================
Hi there 👋. Looks like you are running TensorFlow.js in Node.js. To speed things up dramatically, install our node backend, which binds to TensorFlow C++, by running npm i @tensorflow/tfjs-node, or npm i @tensorflow/tfjs-node-gpu if you have CUDA. Then call require('@tensorflow/tfjs-node'); (-gpu suffix for CUDA) at the start of your program. Visit https://github.com/tensorflow/tfjs-node for more details.
============================

indexFrom = 3
maxLen = 100
model_type cnn
vocabulary_size 20000
max_len 100
[ 'this', 'is', 'a', 'bad', 'day' ]
this is a bad day 
 Tensor {
  isDisposedInternal: false,
  size: 100,
  shape: [ 1, 100 ],
  dtype: 'float32',
  strides: [ 100 ],
  dataId: {},
  id: 22,
  rankType: '2' }
{ score: 0.018475884571671486, elapsed: 1594.1342370510101 }

That warning is to be expected!

@nsthorat OK, so it is confirmed that I'm using Node.js, which means on Linux I can expect the GPU to work too, right?

Yes, you can use @tensorflow/tfjs-node-gpu!

@nsthorat thanks a lot then, that's crazy!!!! 💯 👍
