I am working on a port of this project: https://github.com/jaredleekatzman/DeepSurv to use Keras instead of Lasagne.
My question centers on how best to integrate a new custom loss function (Cox proportional hazards regression, for example) into the Keras paradigm. The existing Theano expression looks like this:
```python
import theano.tensor as T

def negative_log_likelihood(ytime, ystatus):
    # Fragment from the DeepSurv Theano code: i, self.theta, and
    # self.exp_theta come from the surrounding class. Only uncensored
    # samples (ystatus == 1) contribute; the risk set for sample i is
    # every sample whose survival time exceeds ytime[i].
    LL_i = T.switch(T.eq(ystatus[i], 1),
                    self.theta - T.log(T.sum(self.exp_theta * T.gt(ytime, ytime[i]))),
                    0)
```
Any pointers to get started would be of great assistance and sincerely appreciated!
You probably want to do something like:

```python
model.compile(optimizer, negative_log_likelihood, ...)
```
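For example, here is a minimal sketch of what that loss might look like written against the Keras backend. It assumes, as DeepSurv itself does, that each batch is pre-sorted by survival time in descending order, that the event indicator `ystatus` is fed in as `y_true`, and that the model's single output is the log-hazard `theta`; treat it as a starting point rather than a drop-in implementation:

```python
import keras.backend as K

def negative_log_likelihood(y_true, y_pred):
    # y_true: event indicator ystatus (1 = event observed, 0 = censored)
    # y_pred: the network's log-hazard output theta, shape (batch_size, 1)
    # With the batch sorted by survival time in descending order, the risk
    # set of sample i is exactly samples 0..i, so a cumulative sum of the
    # hazard ratios gives the risk-set denominators.
    hazard_ratio = K.exp(y_pred)
    log_risk = K.log(K.cumsum(hazard_ratio, axis=0))
    uncensored_likelihood = y_pred - log_risk             # per-sample partial log-likelihood
    censored_likelihood = uncensored_likelihood * y_true  # censored samples contribute 0
    return -K.sum(censored_likelihood)
```

Because the cumulative sum couples samples within a batch, you would also need to pass `shuffle=False` to `fit` so the sort order is preserved.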
This doesn't work if you want to save and reload a model.
Interesting. What would be the best way to approach this to still use the standard model persistence options in Keras?
It could be fixed easily by allowing the `load_model` function to take kwargs that match the `model.compile` arguments and override those from the deserialized model.
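In the meantime, one workaround is to skip deserializing the training configuration entirely and recompile by hand. A sketch, assuming a Keras version whose `load_model` accepts `compile=False`:

```python
from keras.models import load_model

# Load only the architecture and weights, skipping the saved training
# config, then re-attach the custom loss by compiling again.
model = load_model(model_path, compile=False)
model.compile(optimizer='adam', loss=negative_log_likelihood)
```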
To reload a model saved with a custom loss function, try:
```python
load_model(model_path, custom_objects={"negative_log_likelihood": negative_log_likelihood})
```
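The `custom_objects` dictionary maps the name recorded in the saved model's config back to the actual Python function, so deserialization can resolve the custom loss.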
I'll certainly investigate the custom objects approach and get back to you all when I get the chance.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs, but feel free to re-open it if needed.
This is also a problem for many other machine learning tasks where the loss function cannot be written as a sum of separable parts, each depending on only one data sample.