Error backpropagation in Keras

Created on 30 Apr 2016 · 8 comments · Source: keras-team/keras

Hi,

I'm new to Keras and currently evaluating the library for my research purposes. Could someone clarify whether Keras supports error backpropagation to train a neural network built from Dense layers? I could not find any references to it in the documentation. I'm working on a multi-class classification problem and the objective function is 'categorical_crossentropy'.


All 8 comments

Please refer to the keras/examples folder; there are lots of good examples there.
For example, the mnist example shows how to build a model for a multi-class classification problem.
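For what it's worth, the core of that example boils down to something like the following sketch (the layer sizes and toy data here are made up, not copied from mnist_mlp.py): a stack of Dense layers compiled with categorical_crossentropy, where calling fit() is what actually trains the weights.

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# Toy data: 1000 samples, 20 features, 4 classes (one-hot targets).
x_train = np.random.random((1000, 20))
y_train = np.eye(4)[np.random.randint(0, 4, size=1000)]

model = Sequential()
model.add(Dense(64, input_dim=20, activation='relu'))
model.add(Dense(4, activation='softmax'))

# compile() attaches the loss and an optimizer; fit() then runs
# gradient descent, i.e. backpropagation, over the training data.
model.compile(loss='categorical_crossentropy', optimizer='sgd',
              metrics=['accuracy'])
model.fit(x_train, y_train)
```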

It's not clear from the example whether the model builds a feedforward NN or uses error backpropagation. That's what I would like to have clarified.

I think that your lack of knowledge of the subject makes you misunderstand the difference between the terms "feed-forward neural network" and "error backpropagation".

Feed-forward NNs are simply a type of NN that propagates the signal in one direction (forward), so to speak; examples are multi-layer perceptrons, convolutional networks, and so on. Error backpropagation, on the other hand, is a general term for an approach to training neural networks.

I think he is aware of the difference between FFNN and BP; when he says "or" he actually means "and whether". To answer the question: my understanding is that Keras is built on top of Theano, which employs automatic differentiation for whatever model you set up (including FFNNs, which would be anything where you stack non-recurrent layers on top of each other). BP is nothing more than an application of the chain rule, which lets you build up derivatives w.r.t. all weights in the network recursively and so saves time compared to computing them naively. Whatever Theano does ultimately translates into something akin to BP. But why worry about that? The whole point of Theano is that you no longer have to think about this part of your problem; just treat it as a given, working technology, the same way you treat BLAS.
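To make that concrete, here is a hedged sketch (the shapes and the learning rate are my own choices, not anything from Keras) of what Theano is doing under the hood: you write only the forward pass, and T.grad derives the backward pass by applying the chain rule through the graph.

```python
import numpy as np
import theano
import theano.tensor as T

# Forward pass of a single Dense-style layer with softmax output.
x = T.matrix('x')                      # batch of inputs
t = T.matrix('t')                      # one-hot targets
w = theano.shared(np.zeros((20, 4)), name='w')
b = theano.shared(np.zeros(4), name='b')

p = T.nnet.softmax(T.dot(x, w) + b)
loss = T.nnet.categorical_crossentropy(p, t).mean()

# T.grad walks the graph applying the chain rule -- this is
# backpropagation, derived automatically rather than by hand.
gw, gb = T.grad(loss, [w, b])

# One SGD step: move each parameter against its gradient.
train = theano.function([x, t], loss,
                        updates=[(w, w - 0.1 * gw),
                                 (b, b - 0.1 * gb)])
```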

@BaldwinTheThird that helps, thanks.

So, how do we make sure a Dense layer uses backpropagation and changes its weights accordingly? I didn't see any backpropagation + FFNN example.

The fundamental idea is that Keras uses Theano/TensorFlow's automatic gradient computation. If you want to understand how Keras uses BP to compute gradients and update weights (for any kind of model, including FFNNs), please refer to its code.
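If you want to convince yourself without reading the code, one quick sanity check (a sketch; the shapes here are arbitrary) is to train on a single batch and confirm that the Dense layer's weights actually moved:

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(4, input_dim=20, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='sgd')

before = model.get_weights()[0].copy()
x = np.random.random((32, 20))
y = np.eye(4)[np.random.randint(0, 4, size=32)]
model.train_on_batch(x, y)             # one forward + backward pass
after = model.get_weights()[0]

# A nonzero difference means gradients were computed and applied.
print(np.abs(after - before).max())
```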

This issue has been automatically marked as stale because it has not had recent activity. It will be closed after 30 days if no further activity occurs, but feel free to re-open a closed issue if needed.
