I'm creating a chatbot with TensorFlow.js and Node.js.
This is source code : https://github.com/ran-j/ChatBotNodeJS/blob/master/routes/index.js#L184
When I try to predict, I get this error: '(node:26952) UnhandledPromiseRejectionWarning: Error: Error when checking : expected dense_Dense1_input to have 2 dimension(s), but got array with shape [48]'
my tfjs version https://github.com/ran-j/ChatBotNodeJS/blob/master/package.json#L9
This looks like a shape error at prediction time. model.predict expects a batch of items, even if there is only one element to predict. In your code, at line 66, the data you pass in appears to be rank-1, but rank-2 is expected ([batchSize, vocabSize]).
See API definition for model.predict here https://js.tensorflow.org/api/0.12.5/#tf.Model.predict
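To illustrate the shape fix in plain JavaScript (no tfjs needed to see it; the vector sizes here are just taken from the error message and are illustrative): a single bag-of-words vector is rank-1, and wrapping it in an outer array makes it a batch of one, which is the rank-2 shape model.predict expects.

```javascript
// A bag-of-words vector like bow(sentence, true) would return: rank-1, shape [48].
const bowVector = new Array(48).fill(0);
bowVector[3] = 1;  // mark a couple of (hypothetical) known words as present
bowVector[17] = 1;

// model.predict expects rank-2 data with shape [batchSize, vocabSize].
// Wrapping the vector in an outer array gives a batch of one: shape [1, 48].
const batched = [bowVector];

console.log(batched.length);    // batchSize: 1
console.log(batched[0].length); // vocabSize: 48
// In tfjs this corresponds to tf.tensor2d(bowVector, [1, bowVector.length]).
```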
I still don't get it. How do I fix this?
My xs tensor is: Tensor { isDisposedInternal: false, size: 1296, shape: [ 27, 48 ], dtype: 'float32', strides: [ 48 ], dataId: {}, id: 0, rankType: '2' }
So do I predict like this: model.predict(tf.tensor(bow(sentence, true)), {batchSize: 27}).print(); ?
Try
const bowData = bow(sentence, true);
var data = tf.tensor2d(bowData, [1, bowData.length]);
//generate probabilities from the model
var results = model.predict(data)[0];
Yes, that worked, but when I try:
const bowData = bow(sentence, true);
var data = tf.tensor2d(bowData, [1, bowData.length]);
//generate probabilities from the model
var results = model.predict(data)[0];
console.log(results);
Nothing
But
const bowData = bow(sentence, true);
var data = tf.tensor2d(bowData, [1, bowData.length]);
//generate probabilities from the model
var results = model.predict(data);
console.log(results);
I get
Tensor {
isDisposedInternal: false,
size: 9,
shape: [ 1, 9 ],
dtype: 'float32',
strides: [ 9 ],
dataId: {},
id: 41540,
rankType: '2' }
And if
const bowData = bow(sentence, true);
var data = tf.tensor2d(bowData, [1, bowData.length]);
//generate probabilities from the model
var results = model.predict(data).print();
console.log(results);
I get
[[0.1065254, 0.0964221, 0.0483172, 0.0340803, 0.0725285, 0.0854334, 0.0703956, 0.057641, 0.0382466],]
undefined
My results variable is coming back undefined.
Nice, looks like we are making progress! In general, tensors may only be available on the GPU. The programmer needs to instruct TensorFlow to make them available to the CPU before printing.
Rather than
console.log(results);
You should do one of the following:
results.print();
console.log(results.dataSync());
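This also explains the undefined in the earlier experiment: like console.log, print() is a side-effecting method with no return value, so assigning its result to a variable yields undefined. A minimal plain-JS sketch with a toy stand-in for the tensor API (fakeTensor is hypothetical, not part of tfjs):

```javascript
// Toy stand-in for a tfjs tensor: print() logs but returns undefined,
// while dataSync() returns the underlying values.
const fakeTensor = {
  values: [0.1, 0.7, 0.2],
  print() { console.log(this.values); }, // side effect only; no return value
  dataSync() { return this.values; },    // returns the data itself
};

const fromPrint = fakeTensor.print(); // logs [0.1, 0.7, 0.2]
const fromData = fakeTensor.dataSync();

console.log(fromPrint);   // undefined -- print() returns nothing
console.log(fromData[0]); // 0.1
```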
Sincerely,
Stan
--
Stan Bileschi Ph.D. | SWE | [email protected] | 617-230-8081
Yes, this works fine:
var results = model.predict(data).dataSync()[0];
Thanks!
You're welcome! Happy hacking!
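One caveat on the final fix: dataSync()[0] returns only the first probability in the output row, not necessarily the most likely class. To pick the winning intent you would typically take the index of the maximum probability instead. A plain-JS sketch of that step, reusing the probability row printed earlier in this thread:

```javascript
// Probabilities for one input, as model.predict(data).dataSync() returned them.
const probs = [0.1065254, 0.0964221, 0.0483172, 0.0340803, 0.0725285,
               0.0854334, 0.0703956, 0.057641, 0.0382466];

// Index of the highest probability = predicted class.
// (tfjs offers tf.argMax for the same operation on tensors.)
const predicted = probs.indexOf(Math.max(...probs));

console.log(predicted);        // 0 -- class 0 has the highest probability here
console.log(probs[predicted]); // 0.1065254
```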