Hi, I'm really impressed by your work.
I want to replace the ResNet-based backbone network with my WR-Inception network, which is faster.
Could you advise me on how to import my network?
P.S.: I already have WR-Inception models that were trained using tf-slim.
Interesting! I'm curious to see if that would be faster or more accurate. The main function you'll need to replace is resnet_graph() in model.py. This function builds ResNet101 (or ResNet50) and returns the last layer of each stage, [C1, C2, C3, C4, C5].
The cleanest solution is to implement your WR-Inception network in Keras, then write a script that loads your trained weights into NumPy arrays and assigns them to the corresponding Keras layers. Then train the model as usual. I don't know whether you're using tf-slim's built-in ResNet implementation, but it may be worth noting that it differs slightly from the original ResNet paper.
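A minimal sketch of that weight-porting step, assuming your checkpoint variables can be matched to Keras layers by name (the `assign_slim_weights` helper and the tiny stand-in model are hypothetical; in practice you would read the arrays from your checkpoint, e.g. with `tf.train.load_checkpoint(...).get_tensor(name)`, rather than create them by hand):

```python
import numpy as np
import tensorflow as tf

def assign_slim_weights(keras_model, slim_weights):
    """Copy trained weights into matching Keras layers.

    slim_weights: dict mapping Keras layer name -> [kernel, bias]
    NumPy arrays, e.g. extracted from a tf-slim checkpoint.
    """
    for layer in keras_model.layers:
        if layer.name in slim_weights:
            layer.set_weights(slim_weights[layer.name])

# Tiny stand-in model; your Keras WR-Inception graph goes here instead.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8, 8, 3)),
    tf.keras.layers.Conv2D(4, 3, name="conv1"),
])

# Simulated checkpoint arrays, with the shapes Keras expects
# (kernel: H x W x in_channels x out_channels, bias: out_channels):
kernel = np.ones((3, 3, 3, 4), dtype=np.float32)
bias = np.zeros((4,), dtype=np.float32)
assign_slim_weights(model, {"conv1": [kernel, bias]})
```

One caveat: tf-slim and Keras store conv kernels in the same HWIO layout, but batch-norm variables are split differently across frameworks, so check each layer's `get_weights()` ordering before assigning.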
Alternatively, if you prefer to avoid re-writing your model in Keras, note that resnet_graph() returns Keras layers, not TF tensors. You might be able to simply wrap your tf-slim tensors in Lambda layers; I use Lambda layers in many places in the code. If that doesn't fit your needs, the alternative is to build custom Keras layers that encapsulate your model; there are examples of those in model.py as well.
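The Lambda-wrapping idea looks roughly like this sketch, where `wr_inception_block` is a hypothetical stand-in for one stage of your slim-built graph (any plain TF ops work the same way):

```python
import tensorflow as tf

def wr_inception_block(x):
    # Stand-in for one stage of your tf-slim network: any code that
    # takes a TF tensor and returns a TF tensor can go here.
    return tf.nn.relu(tf.reduce_mean(x, axis=-1, keepdims=True))

# Wrapping the raw TF ops in a Lambda layer makes them usable inside
# a Keras graph, which is what resnet_graph() expects to produce.
inputs = tf.keras.Input(shape=(32, 32, 3))
c1 = tf.keras.layers.Lambda(wr_inception_block, name="C1")(inputs)
model = tf.keras.Model(inputs, c1)
```

You would return one such wrapped output per stage to stand in for [C1, C2, C3, C4, C5]; just make sure each stage downsamples by the stride the FPN code expects.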
Hope this helps. I'll keep this issue open for any follow-up questions. Please let me know how your experiment works out.