tfjs 0.11.7
tfjs-node 0.1.7
node v8.7.0
Got the following error when trying to load local weights file:
(node:73865) UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 2): ReferenceError: fetch is not defined
const fs = require('fs');
const tf = require('@tensorflow/tfjs');
require('@tensorflow/tfjs-node');
(async function(){
let manifest = JSON.parse(fs.readFileSync('weights.json'));
let weights = await tf.io.loadWeights(manifest, 'file://./weights');
})();
@ubertao Why do you need to load weights individually (i.e., separately from the model architecture)? Can you load the entire model using tf.loadModel(), then get the weights from the model using Model.getWeights()?
@caisq The model was trained somewhere else, so far I only have access to the weights.
+1 on this. I cannot write tests for my models without being able to load the weights from a local file
I've had the same issue when using a library which loads the weights, rather than the full model, using that API.
Looks like the library uses an inline IO handler, which delegates to fetch, rather than using the IO Router?
@chrisdonahue @ubertao I did manage to work around this by stubbing a fetch global which reads from the filesystem
global.fetch = async (file) => {
  const buf = fs.readFileSync(file);
  // Slice so arrayBuffer() resolves to a real ArrayBuffer, not a Node Buffer view
  return { arrayBuffer: async () => buf.buffer.slice(buf.byteOffset, buf.byteOffset + buf.byteLength) }
}
@jthomas That workaround shouldn't be necessary. Are you importing (requiring) both @tensorflow/tfjs and @tensorflow/tfjs-node? If the latter is not imported, the module that handles the native file system won't be available.
@caisq Unless I'm missing something in the code, it doesn't seem to me that the tf.io.loadWeights method uses the IOHandler system to read files?
Let me give you the background: I'm using this library (face-api.js), which loads weights directly using the same snippet as this issue.
const manifest = await (await fetch(manifestUri)).json()
return tf.io.loadWeights(manifest, modelBaseUri)
Looking at the tf.io.loadWeights code, it reads all the paths from the manifest and then calls loadWeightsAsArrayBuffer. That function uses fetch to retrieve the URLs directly, rather than going through an IOHandler, which causes the issue above?
export async function loadWeightsAsArrayBuffer(
fetchURLs: string[], requestOptions?: RequestInit): Promise<ArrayBuffer[]> {
// Create the requests for all of the weights in parallel.
const requests = fetchURLs.map(fetchURL => fetch(fetchURL, requestOptions));
const responses = await Promise.all(requests);
const buffers =
await Promise.all(responses.map(response => response.arrayBuffer()));
return buffers;
}
The @tensorflow/tfjs-node module doesn't appear to overwrite or extend these functions. Have I missed something in the code?
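For what it's worth, the pattern is easy to reproduce outside of tfjs. A stand-in version of that parallel-fetch logic (names hypothetical, not the real tfjs code) shows everything hinges on a fetch function existing in the environment, which Node v8 doesn't provide:

```javascript
// Stand-in for the parallel-fetch pattern above. In Node v8 there is no
// built-in fetch, so the real function's first fetch(...) call throws
// ReferenceError, surfacing as the unhandled rejection in the report.
async function loadBuffers(urls, fetchFn) {
  const requests = urls.map((url) => fetchFn(url)); // fire all requests in parallel
  const responses = await Promise.all(requests);
  return Promise.all(responses.map((r) => r.arrayBuffer()));
}

// A stub fetch serving in-memory data instead of the network.
const fakeFetch = async (url) => ({
  arrayBuffer: async () => new ArrayBuffer(url.length)
});

loadBuffers(['group1-shard1of2', 'group1-shard2of2'], fakeFetch)
  .then((bufs) => console.log(bufs.map((b) => b.byteLength))); // [ 16, 16 ]
```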
James,
That's correct - tf.io.loadWeights doesn't use IOHandlers under the hood. It works only for http-based cases because it uses fetch.
What are you trying to achieve? Depending on the answer, I may be able to suggest solutions.
Best,
Shanqing
I can't speak for James but my use case is that I want to be able to write tests for my model which load model weights from a locally-stored manifest.json and compare to known input/output pairs from Python.
Currently I have an extremely undesirable workaround where I save the output of tf.io.loadWeights to a JSON file in the browser. Then, I load that JSON file in the test and pass the result to my model, where it would otherwise expect the result of tf.io.loadWeights. This shouldn't be necessary and the JSON files are huge because floating point values are serialized as strings.
Local testing of models has already helped me identify very subtle failure cases and should be supported.
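For the comparison step itself, an elementwise tolerance check against reference values exported from Python (e.g. via numpy.ndarray.tolist()) avoids string-serialized floats entirely. A minimal sketch; the function name and tolerance are illustrative, not any tfjs API:

```javascript
// Compare model output against reference outputs exported from Python,
// within a floating-point tolerance (both are plain arrays of numbers).
function allClose(actual, expected, atol = 1e-6) {
  if (actual.length !== expected.length) return false;
  return actual.every((v, i) => Math.abs(v - expected[i]) <= atol);
}

console.log(allClose([0.1 + 0.2, 1.0], [0.3, 1.0])); // true  (within tolerance)
console.log(allClose([0.30001, 1.0], [0.3, 1.0]));   // false (off by 1e-5)
```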
This is now fixed in the latest release (0.2.1) of @tensorflow/tfjs-node and @tensorflow/tfjs-node-gpu.