Better visualization of params, size, etc. using torchsummaryX
Currently in pytorch-lightning, the model summary is generated from the layers listed in the __build_model function. However, the layers defined there are not necessarily used in the forward pass, and may also be used more than once, which can make the model summary incorrect.
In my opinion, torchsummaryX could be integrated into pytorch-lightning.
torchsummaryX is an improved visualization tool built on torchsummary. It visualizes kernel size, output shape, params, and Mult-Adds. It generates the model information by actually running the forward function, which makes the reported information more accurate. It also shows somewhat more information than the ModelSummary function currently used in pytorch-lightning.
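The key point above is that a summary built by running an actual forward pass reflects what the model really does: a layer that is defined but never called does not appear, and a shared layer called twice appears twice. A minimal pure-Python sketch of that idea (no torch dependency; all names here are illustrative, not the real torchsummaryX API):

```python
# Illustrative sketch: build a summary table from the layers that are
# actually executed during a forward pass, rather than from the layers
# that happen to be defined. 'Layer' is a toy stand-in for nn.Module
# that just transforms a (batch, features) shape tuple.

class Layer:
    def __init__(self, name, out_features):
        self.name = name
        self.out_features = out_features

    def __call__(self, shape):
        # Pretend to be a fully-connected layer: keep batch dim,
        # replace the feature dim.
        return (shape[0], self.out_features)

def summarize(forward_layers, input_shape):
    """Record (layer name, output shape) for every layer call made."""
    rows = []
    shape = input_shape
    for layer in forward_layers:
        shape = layer(shape)
        rows.append((layer.name, shape))
    return rows

# A shared layer used twice shows up twice in the summary; a layer that
# is defined but never placed in the forward pass never appears at all.
shared = Layer("fc_shared", 16)
unused = Layer("fc_unused", 99)          # defined, but not in the forward pass
forward_pass = [Layer("fc1", 32), shared, shared]

print(summarize(forward_pass, (1, 8)))
# [('fc1', (1, 32)), ('fc_shared', (1, 16)), ('fc_shared', (1, 16))]
```

In PyTorch itself this is typically done with forward hooks registered on each submodule, which is how trace-based summary tools capture real input/output shapes.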
cc: @PyTorchLightning/core-contributors
I have a PR in the works, #1773, that does input/output shapes properly (probably very similar to torchsummary). With the LayerSummary class I added, it should be easy to extend it further to display weight shapes etc. without us needing to add a dependency on the torchsummaryX library. I suggest we go this path.
See also #1556
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
@awaelchli so is this done? :]
Depends on how far we want to go. Currently our model summary shows num params, layer type, and input/output shapes.
@IncubatorShokuhou mentions that torchsummaryX also shows kernel size and Mult-Adds.
If we don't want these, we can close the issue.
Was there not enough interest in this to continue? I'm interested in this being implemented.
@dav-ell sure, do you want to take it over?
@dav-ell sure go ahead. Please add me as reviewer when you send the PR. cheers!
Adding a reminder for myself (or anyone who tackles this issue first): this change should be made here. We probably need to do the total-values calculation before calling the function here, adding a new parameter to the function. The number of parameters for each layer is already stored in self.param_nums, but a num_trainable_parameters property would be useful in the LayerSummary class. It could be implemented the same way as num_parameters, but additionally checking that p.requires_grad is True. I couldn't find a test for this feature, though. Maybe I missed something?
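A rough sketch of the num_trainable_parameters idea described above (this is a simplified stand-in, not Lightning's actual LayerSummary; the real class wraps an nn.Module, and Param here mimics a torch parameter with .numel() and .requires_grad):

```python
# Hedged sketch of the proposed num_trainable_parameters property.
from dataclasses import dataclass

@dataclass
class Param:
    """Toy stand-in for torch.nn.Parameter."""
    size: int
    requires_grad: bool = True

    def numel(self):
        return self.size

class LayerSummary:
    def __init__(self, params):
        # Stand-in for iterating over module.parameters().
        self._params = params

    @property
    def num_parameters(self):
        return sum(p.numel() for p in self._params)

    @property
    def num_trainable_parameters(self):
        # Same as num_parameters, but only counting parameters
        # where requires_grad is True.
        return sum(p.numel() for p in self._params if p.requires_grad)

layer = LayerSummary([Param(10), Param(5, requires_grad=False)])
print(layer.num_parameters, layer.num_trainable_parameters)  # 15 10
```

The total across the model would then just sum these per-layer values before rendering the summary table.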