I hope someone can give a hint on how to use the model in TensorFlow.js. I suppose the .pb is a frozen GraphDef? There is a converter tool (https://github.com/tensorflow/tfjs-converter), but I'm missing the "--output_node_names". Exporting directly for TensorFlow.js would be even better, of course.
@kasperkamperman
Thanks for the feedback! We are currently investigating and will update you shortly.
@kasperkamperman
I have assigned the issue to the content author to investigate further and update the document as appropriate.
This is a good question. I don't think anyone on the team has actually tried to get this to play nicely with TensorFlow.js yet.
I'm curious whether there's been any progress yet?
I think there is no need to convert the model into a frozen GraphDef, because we can use the .pb model directly with Python.
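For reference, here is a minimal TF 1.x sketch of that approach. The tensor names `Placeholder:0` and `model_outputs:0`, the 224×224 input size, and the `model.pb` path are assumptions based on this thread, not guaranteed by the export; check your own graph for the real names and shape.

```python
import numpy as np
import tensorflow as tf  # TF 1.x

# Load the exported frozen graph (path is a placeholder).
graph_def = tf.GraphDef()
with open('model.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())

with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name='')

# Run a dummy image through the assumed input/output tensors.
with tf.Session(graph=graph) as sess:
    image = np.zeros((1, 224, 224, 3), dtype=np.float32)
    predictions = sess.run('model_outputs:0',
                           feed_dict={'Placeholder:0': image})
    print(predictions)
```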
@ksujanpaul I don't want to use it with Python but with TensorFlow.js (which is JavaScript, so it's not related to TensorFlow for Python).
From my research I also understand that there are different kinds of .pb files, but I can't find which kind Azure Custom Vision exports.
I found the tfjs converter. It converts a TensorFlow SavedModel (is the *.pb file from Azure a SavedModel?) or a Keras model to a web-friendly format. However, I need to fill in "output_node_names" (how do I get these?). I'm also not 100% sure that my .pb file for Android is the same as a "tf_saved_model".
I hope someone has a tip or a starting point.
With TensorBoard I could view the graph; is the 'output_node_names' simply 'import'?
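In case it helps anyone with the same question: `import` is most likely just the namespace added when the graph was loaded for visualization (`tf.import_graph_def` defaults to `name='import'`), not an actual output node. One way to find candidate output node names is to print the ops in the frozen graph; a sketch assuming TF 1.x, with `model.pb` as a placeholder path:

```python
import tensorflow as tf  # TF 1.x

graph_def = tf.GraphDef()
with open('model.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())

# Print every node; names near the end of the list that feed nothing else
# (e.g. 'model_outputs' in Custom Vision exports) are likely output nodes.
for node in graph_def.node:
    print(node.op, node.name)
```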
@anrothMSFT is there any progress on this issue?
@anrothMSFT any updates on this?
Thank you for taking the time to share your product and documentation feedback with us. Your input is valued because it helps us create the right documentation for our customers. Due to the volume of issues in our queue, we are closing open issues older than 90 days. We hope to continue hearing from you. Thank you.
Document details
⚠_Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking._
- ID: 916de6f5-ef41-c82b-5918-0c6149326aab
- Version Independent ID: c7a4bb67-957d-f0a3-4a9f-e71035923f58
- Content: Export your model to mobile
- Content Source: articles/cognitive-services/Custom-Vision-Service/export-your-model.md
- Service: cognitive-services
- GitHub Login: @anrothMSFT
- Microsoft Alias: anroth
It's a closed issue, but I'm wondering this as well. Did you manage to get it working somehow? I tried the Stack Overflow link by jtlz2, to no avail.
Thanks for bumping this. It has generated a hallway discussion this morning about whether to add TF.js export as an export flavor (we added TF Lite last week, although release notes are lagging). Something for us to keep discussing.
@harishkrishnav on our team has been playing around with tf.js and might be able to provide some hints in the interim...
This is a bit tricky as-is, and we might soon release a sample showing how to load Custom Vision models in the browser using TensorFlow.js. In the meantime, these are the steps I took to get an exported TensorFlow model working:
First, replace the `PadV2` ops in the graph with `Pad`, since the converter does not handle `PadV2`. In the script below, `input_filepath` is the path to the .pb model file and `output_filepath` is the full path of the updated .pb file that will be created.

```python
import tensorflow as tf

def ReplacePadV2(input_filepath, output_filepath):
    graph_def = tf.GraphDef()
    with open(input_filepath, 'rb') as f:
        graph_def.ParseFromString(f.read())

    for node in graph_def.node:
        if node.op == 'PadV2':
            node.op = 'Pad'
            # PadV2 takes an extra constant_values input that Pad doesn't accept.
            del node.input[-1]
            print("Replaced PadV2 node: {}".format(node.name))

    with open(output_filepath, 'wb') as f:
        f.write(graph_def.SerializeToString())
```
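To apply it, a call like the following works (paths are placeholders for your own files):

```python
# Hypothetical paths; write the patched graph out as a new file.
ReplacePadV2('model.pb', 'model_modified.pb')
```

Then point the converter at the modified file in the next step.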
Next, run the converter with `--input_format` as `tf_frozen_model` and set `output_node_names` as `model_outputs`. This is the command I used:

```
tensorflowjs_converter --input_format=tf_frozen_model --output_json=true --output_node_names='model_outputs' --saved_model_tags=serve path\to\modified\model.pb folder\to\save\converted\output
```

Ideally, `tf.loadGraphModel('path/to/converted/model.json')` should now work (tested with tfjs 1.0.0 and above).
@harishkrishnav It worked! Thank you so much!
@harishkrishnav Fantastic - do you want to take the SO answer?