Would be great to have out-of-the-box support for distributions, similar to functionality provided by TensorFlow Probability and PyTorch Distributions. My current use case is for Reinforcement Learning algorithms that learn stochastic action policies (i.e. learn parameters of a distribution from which actions are sampled), and I update these parameters using the likelihood.
MXNet would ideally have methods on each type of distribution for calculating:

- probability density / mass (pdf / pmf)
- log probability of a data sample under the distribution
- entropy
- KL divergence between two distributions
And would support a variety of distributions including:

- Normal (univariate and multivariate)
- Uniform
- Exponential
- Gamma
- Poisson
- Negative Binomial
- Dirichlet
MXFusion is a related project but doesn't have the functionality mentioned above. And it would be ideal to have this as a submodule of the MXNet package.
@mxnet-label-bot [Feature Request, Gluon, Operators]
Some of them are here - https://mxnet.incubator.apache.org/api/python/ndarray/random.html?highlight=mxnet.ndarray.random#random-distribution-generator
But yes, we do need a probability submodule within MXNet.
Cheers @anirudhacharya, but the functionality I'm referring to is not covered by those references.
I'm talking about functionality beyond just sampling from distributions: calculating the probability density, the log probability of a data sample under a given distribution, the entropy of a distribution, etc. And as far as I'm aware, the KL loss in MXNet only works with samples rather than computing the analytical KL divergence between distributions, so it is also insufficient for certain use cases.
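To illustrate the difference: for many distribution pairs the KL divergence has a closed form that needs no samples at all. Here is a minimal sketch for two univariate Gaussians, using NumPy as a stand-in since none of this exists as an MXNet submodule yet:

```python
import numpy as np

def kl_normal(mu1, sigma1, mu2, sigma2):
    """Closed-form KL( N(mu1, sigma1^2) || N(mu2, sigma2^2) )."""
    return (np.log(sigma2 / sigma1)
            + (sigma1 ** 2 + (mu1 - mu2) ** 2) / (2 * sigma2 ** 2)
            - 0.5)

# KL of a distribution with itself is exactly zero -- no Monte Carlo noise.
print(kl_normal(0.0, 1.0, 0.0, 1.0))  # 0.0
print(kl_normal(1.0, 1.0, 0.0, 1.0))  # 0.5
```

A sample-based KL estimate would only approach these values in expectation; the analytic form is exact and differentiable in the distribution parameters.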
You can't backpropagate gradients through samples, which is why it's important to have such formulas (e.g. log probability) implemented.
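For the RL use case above, the score-function (REINFORCE) trick is the standard workaround: the gradient of the expected reward is the expectation of reward times the gradient of the log probability. A rough NumPy sketch, with a hypothetical reward function chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_log_normal_mu(a, mu, sigma):
    # d/dmu log N(a | mu, sigma^2) = (a - mu) / sigma^2
    return (a - mu) / sigma ** 2

mu, sigma = 0.0, 1.0
actions = rng.normal(mu, sigma, size=100_000)
rewards = -(actions - 2.0) ** 2  # hypothetical reward, highest at a = 2

# Score-function (REINFORCE) estimate of d E[reward] / d mu:
# E[ reward * d/dmu log pi(a | mu, sigma) ]
grad_mu = np.mean(rewards * grad_log_normal_mu(actions, mu, sigma))
print(grad_mu)
```

The analytic value here is -2*(mu - 2) = 4, and the Monte Carlo estimate lands close to it. The key point is that this only works if the log-pdf (and its gradient) is available as a differentiable operation, which is exactly what's missing.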
I can see a single case of a probability being returned, by mxnet.ndarray.random.multinomial, but this is only for the sampled data point, not for an arbitrary data point, which is what's required.
+1 on this feature
We also have a need for this as part of our project. I have a local version for computing PDF and LOG_PDF, including the forward/backward pass (i.e. gradients for all parameters and samples), for the following distributions: uniform, normal, exponential, gamma, Poisson, negative binomial, and Dirichlet. All are coded as C++ operators and work on CPU and GPU. But it would take some more effort to make them clean enough to commit; I'll have to see when I find the time. Naturally, we could extend this to support CDFs etc.
Regarding more complex distributions like multivariate Gaussians, all the necessary basic functionality already exists as part of the linalg namespace in MXNet. I have plugged it together in Python with a couple of lines, and in addition built quite a bit around it that allows constructing more complex things (Gaussian mixtures etc.).
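To sketch what "a couple of lines on top of linalg" looks like: the multivariate Gaussian log-pdf reduces to a Cholesky factorization plus a triangular solve. This example uses NumPy equivalents of those linalg ops (the MXNet operator names may differ), so take it as an illustration of the math rather than the actual implementation:

```python
import numpy as np

def mvn_log_pdf(x, mu, cov):
    """log N(x | mu, cov) via Cholesky: cov = L @ L.T."""
    d = x.shape[-1]
    L = np.linalg.cholesky(cov)        # Cholesky factor (cf. a potrf-style op)
    z = np.linalg.solve(L, x - mu)     # triangular solve (cf. a trsm-style op)
    log_det = 2.0 * np.sum(np.log(np.diag(L)))
    return -0.5 * (d * np.log(2.0 * np.pi) + log_det + z @ z)

x = np.zeros(2)
mu = np.zeros(2)
cov = np.eye(2)
print(mvn_log_pdf(x, mu, cov))  # -log(2*pi) ~= -1.8379
```

Working with the Cholesky factor keeps everything numerically stable and differentiable, which is what makes the linalg namespace a good foundation for this.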
We are using all of this in a specific project and will likely not have the time in the near future to polish it on our own to the point where we can contribute it back. But if other interested parties are willing to join the effort, we can collaborate.
@asmushetzel this is awesome stuff. Really looking forward to seeing the contribution in the future. Do you have a distribution like https://www.tensorflow.org/api_docs/python/tf/initializers/truncated_normal ?
We are talking with the MXFusion people. They don't have the PDFs mentioned here coded and are not planning to do so.
Concerning truncated_normal above: I think this request is primarily about a sampler (though we should provide PDF/PMF for all samplers that we support in MXNet anyway). Building such a sampler should be little work when put into the framework of the already existing ones (normal, uniform, gamma, etc.). I will see whether I can get some resources.
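As a reference point for anyone picking this up: TF's truncated_normal discards draws more than two standard deviations from the mean and redraws them. A naive rejection-sampling sketch in NumPy (a real operator would do this in C++ alongside the existing samplers):

```python
import numpy as np

def truncated_normal(shape, mean=0.0, std=1.0, rng=None):
    """Sample N(mean, std^2) truncated to [mean - 2*std, mean + 2*std]
    by rejection, mirroring tf.initializers.truncated_normal semantics."""
    rng = rng if rng is not None else np.random.default_rng()
    out = np.empty(int(np.prod(shape)))
    filled = 0
    while filled < out.size:
        draw = rng.normal(mean, std, size=out.size - filled)
        keep = draw[np.abs(draw - mean) <= 2.0 * std]  # reject the tails
        out[filled:filled + keep.size] = keep
        filled += keep.size
    return out.reshape(shape)

s = truncated_normal((1000,), rng=np.random.default_rng(0))
print(s.min() >= -2.0 and s.max() <= 2.0)  # True
```

Since ~95% of draws fall within two standard deviations, rejection only resamples about 5% of values per pass, so this converges in a couple of iterations.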
PR #14579 will bring in log-pdf/pdf of almost all distributions mentioned above.
For technical reasons, PR #14579 has been moved to a new PR, #14617.