Tfjs: TypeError: NetworkError when attempting to fetch resource.

Created on 2 May 2018  ·  3 Comments  ·  Source: tensorflow/tfjs

Hi, a bit new to the tensorflow.js landscape, but I'm trying to load a model.json file into my javascript file (currently testing this locally with a standalone html/js file) and it doesn't seem to be working. The model.json was created following the instructions here. I've attached 3 files (issue.html, model.json, group1-shard1of1) to reproduce the error.

Possible problems this could be related to:

TensorFlow.js version

0.10.3

Browser version

Firefox 59.0.2

Code to reproduce the bug (issue.html)

<html>
<head>
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs@0.10.3/dist/tf.min.js" crossorigin=""></script>
</head>

<body>
<script>
async function loadModel() {
    console.log('Loading model');
    const model = await tf.loadModel('model.json');
    console.log('Model loaded');
    console.log(model);
    return model;
};
const model = loadModel();

// do other stuff

console.log(model);  //this line produces the error below

tensor = tf.randomNormal([1,256,256,1]);
model.predict(tensor).print();

</script>
</body>
</html>

Error message

Promise { "rejected" }
  <state>: "rejected"
  <reason>: TypeError
    message: "NetworkError when attempting to fetch resource."
    fileName: "https://cdn.jsdelivr.net/npm/@tensorflow/tfjs@0.10.3/dist/tf.min.js"
    stack: n.loadWeights → r → n.loadModelInternal @ tf.min.js:1 (minified)
  leaflet_tfjs.html:154:4

issue.zip

All 3 comments

Hi @weiji14,

When running your code you should see the following error:
Fetch API cannot load [...]/model.json. URL scheme must be "http" or "https" for CORS request.

You need to serve your model via an HTTP server that allows CORS requests in order to load it.

One simple way I found to serve a model while developing with tfjs is to use the http-server npm package:

$ wget https://github.com/tensorflow/tfjs/files/1965816/issue.zip
$ unzip issue.zip && cd issue
$ npm install http-server -g
$ http-server . --cors -o

The last line runs an HTTP server on the current folder that allows CORS requests; the -o flag opens the browser.

You'll be able to load your model by browsing to http://localhost:8080/issue.html, but you'll then hit a second issue: loadModel() returns a promise, so you can't call predict() on its return value directly.
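The underlying gotcha can be seen without TensorFlow.js at all: an async function always returns a Promise, never the resolved value itself. A minimal sketch in plain Node (loadThing and its predict method are made up for illustration):

```javascript
// An async function always returns a Promise, even when its body
// `return`s a plain value — so `const model = loadModel()` gives you
// a Promise, not the model.
async function loadThing() {
    return { predict: (x) => x * 2 };
}

const pending = loadThing();
console.log(pending instanceof Promise);  // true — there is no .predict() on it

// To get at the value, either await it inside an async context
// or chain .then():
loadThing().then((thing) => {
    console.log(thing.predict(21));  // 42
});
```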

One solution:

async function loadModel() {
    console.log('Loading model');
    const model = await tf.loadModel('model.json');
    console.log('Model loaded');
    console.log(model);
    return model;
}

loadModel()
    .then((model) => {
        // do other stuff

        console.log(model);  // now logs the loaded model

        const tensor = tf.randomNormal([1, 256, 256, 1]);
        model.predict(tensor).print();
    });

Should return:

Loading model
issue.html:11 Model loaded
issue.html:12 t {_callHook: null, _addedWeightNames: Array(0), _stateful: false, id: 2, activityRegularizer: null, …}
issue.html:20 t {_callHook: null, _addedWeightNames: Array(0), _stateful: false, id: 2, activityRegularizer: null, …}
tf.min.js:1 Tensor
    [[[[-156.7172852],
       [39.8644524  ],
       [50.7873459  ],
       ...,
       [-84.6782303 ],
       [25.3184032  ],
       [-89.818985  ]],

      [[-58.4090843 ],
       [-144.4514313],
       [-316.0537415],
       ...,
       [-70.7861176 ],
       [33.7812767  ],
       [32.5913696  ]],
...

Hi @timotheebernard. You're right, I tried serving it up online and it just works. I'll try your http-server trick, and thanks for showing how to handle the promise! I come from a Python background, so it takes a bit of effort to get used to all these asynchronous things.

Hi @weiji14, you're welcome! Don't hesitate to close the issue if it's good for you!
