Thursday, April 23, 2015

Cross entropy, squared error, and their derivatives

The cost function comparing targets to outputs can be expressed as cross entropy or as squared error. The output itself is produced by some activation function, typically logistic (sigmoid) or softmax. Depending on the combination of error function and output function, different derivatives are used for the gradient computation. The following article gives a very good explanation along with the needed formulas:
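As a rough illustration of why the combination matters, here is a small Python sketch (my own, not taken from the referenced article) comparing the gradient of the cost with respect to the pre-activation z for a single logistic output unit under both error functions; the variable names and sample values are purely illustrative.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

z = np.array([-2.0, 0.5, 3.0])      # illustrative pre-activations of the output unit
target = np.array([0.0, 1.0, 1.0])  # illustrative target values
y = sigmoid(z)                      # logistic output

# Squared error: E = 0.5 * (y - t)^2
# dE/dz = (y - t) * y * (1 - y)   -- the sigmoid derivative stays in the gradient
grad_squared_error = (y - target) * y * (1.0 - y)

# Cross entropy: E = -(t*log(y) + (1-t)*log(1-y))
# dE/dz = y - t                   -- the sigmoid derivative cancels out
grad_cross_entropy = y - target

print(grad_squared_error)
print(grad_cross_entropy)

With cross entropy the logistic derivative cancels, leaving the simple output-minus-target gradient; squared error keeps the extra y(1 - y) factor, which shrinks the gradient when the unit saturates. The same output-minus-target form also appears for cross entropy combined with softmax outputs.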
