Azure-docs: Use model in tensorflow.js

Created on 14 Apr 2018 · 17 comments · Source: MicrosoftDocs/azure-docs

I hope someone can give a hint on how to use the model in TensorFlow.js. I suppose the .pb is a frozen GraphDef? There is a converter tool (https://github.com/tensorflow/tfjs-converter), but I don't know what to pass for "--output_node_names". Exporting directly for TensorFlow.js would of course be even better.


Document details

⚠ Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.

Pri2 assigned-to-author cognitive-servicesvc past-90-days product-question triaged

Most helpful comment

@anrothMSFT is there any progress on this issue?

All 17 comments

@kasperkamperman
Thanks for the feedback! We are currently investigating and will update you shortly.

@kasperkamperman
I have assigned the issue to the content author to investigate further and update the document as appropriate.

This is a good question. I don't actually think anyone on the team has tried to get this to play nice with TensorFlow.js yet.

I'm curious if there is any progress already?

I think there is no need to convert the model into a frozen GraphDef, because we can use the .pb model directly with Python.

@ksujanpaul I don't want to use it with Python but with TensorFlow.js (which is JavaScript, so it's not related to TensorFlow for Python).

From my research I understand that there are different kinds of .pb files, but I can't find which type Azure Custom Vision exports.

I found the tfjs-converter. It converts a TensorFlow SavedModel (is the *.pb file from Azure a SavedModel?) or a Keras model to a web-friendly format. However, I need to fill in "output_node_names" (how do I get these?). I'm also not 100% sure that my .pb file for Android is the same thing as a "tf_saved_model".

I hope someone has a tip or a starting point.

With TensorBoard I could view the graph; is the 'output_node_names' simply 'import'?
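One way to find candidate output node names (an editor's sketch, not from the thread): load the frozen graph and list the nodes whose outputs are never consumed by any other node. This assumes `tensorflow` is installed; `candidate_output_nodes` and `load_graph_def` are hypothetical helper names.

```python
import tensorflow as tf  # tf.compat.v1 keeps this working on TF 1.x and 2.x

def candidate_output_nodes(graph_def):
    """Return names of nodes whose output is never consumed by another node.

    These are the likely values for --output_node_names.
    """
    consumed = set()
    for node in graph_def.node:
        for inp in node.input:
            # Inputs look like 'name', 'name:0', or '^name' (control deps).
            consumed.add(inp.lstrip('^').split(':')[0])
    return [n.name for n in graph_def.node if n.name not in consumed]

def load_graph_def(pb_path):
    """Parse a frozen .pb file into a GraphDef proto."""
    graph_def = tf.compat.v1.GraphDef()
    with open(pb_path, 'rb') as f:
        graph_def.ParseFromString(f.read())
    return graph_def
```

According to the answer later in this thread, the Custom Vision export's output node is named `model_outputs`, so that is what a run over the exported .pb should surface.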

[screenshot: model graph viewed in TensorBoard]

@anrothMSFT is there any progress on this issue?

@anrothMSFT any updates on this?

Thank you for taking the time to share your product and documentation feedback with us. Your input is valued because it helps us create the right documentation for our customers. Due to the volume of issues in our queue, we are closing open issues older than 90 days. We hope to continue hearing from you. Thank you.


Thanks for bumping this. It generated a hallway discussion this morning about whether to add TF.js as an export flavor (we added TF Lite last week, although the release notes are lagging). Something for us to keep discussing.

@harishkrishnav on our team has been playing around with tf.js and might be able to provide some hints in the interim...

This is a bit tricky as-is, and we might soon release a sample showing how to load Custom Vision models in the browser using TensorFlow.js. These are the steps I took to get an exported TensorFlow model working:

  1. Replace PadV2 operations with Pad. This Python function should do it; input_filepath is the path to the .pb model file and output_filepath is the full path of the updated .pb file that will be created.
import tensorflow as tf

def ReplacePadV2(input_filepath, output_filepath):
    # Load the frozen graph. On TensorFlow 2.x, use tf.compat.v1.GraphDef() instead.
    graph_def = tf.GraphDef()
    with open(input_filepath, 'rb') as f:
        graph_def.ParseFromString(f.read())

    # Rewrite every PadV2 op as Pad, dropping the extra
    # constant_values input that PadV2 has and Pad lacks.
    for node in graph_def.node:
        if node.op == 'PadV2':
            node.op = 'Pad'
            del node.input[-1]
            print("Replaced PadV2 node: {}".format(node.name))

    # Write the modified graph back out.
    with open(output_filepath, 'wb') as f:
        f.write(graph_def.SerializeToString())
  2. Install tensorflowjs 0.8.6 or earlier. Converting frozen models is deprecated in later versions.
  3. When calling the converter, set --input_format to tf_frozen_model and set output_node_names to model_outputs. This is the command I used:
tensorflowjs_converter --input_format=tf_frozen_model --output_json=true --output_node_names='model_outputs' --saved_model_tags=serve  path\to\modified\model.pb  folder\to\save\converted\output
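As a quick sanity check that the PadV2 rewrite in step 1 does what it should, here is a self-contained round trip on a tiny synthetic graph (an editor's sketch: the rewrite logic is repeated from ReplacePadV2 above so it runs on its own, and `tensorflow` is assumed to be installed):

```python
import tensorflow as tf

def replace_padv2(graph_def):
    """Same rewrite as ReplacePadV2 above, on an in-memory GraphDef."""
    for node in graph_def.node:
        if node.op == 'PadV2':
            node.op = 'Pad'
            del node.input[-1]  # drop the constant_values input Pad lacks
    return graph_def

# Build a minimal GraphDef containing one PadV2 node.
gd = tf.compat.v1.GraphDef()
pad = gd.node.add()
pad.name = 'pad'
pad.op = 'PadV2'
pad.input.extend(['x', 'paddings', 'constant_values'])

replace_padv2(gd)
print(gd.node[0].op)           # Pad
print(list(gd.node[0].input))  # ['x', 'paddings']
```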

Ideally, tf.loadGraphModel('path/to/converted/model.json') should now work (tested for tfjs 1.0.0 and above).

@harishkrishnav It worked! Thank you so much!

@harishkrishnav Fantastic - do you want to take the SO answer?
