
Elegant Representation of Forward and Back Propagation in Neural Networks

Sometimes a diagram gives you an 'aha' moment. Here is one representing forward propagation and back propagation in a neural network:

A brief explanation is:

  • Using the input variables x and y, the forward pass (left half of the figure) computes the output z as a function of x and y, i.e. z = f(x, y).
  • The right half of the figure shows the backward pass.
  • Receiving dL/dz (the derivative of the total loss L with respect to the output z), we can calculate the gradients of the loss with respect to x and y by applying the chain rule: dL/dx = dL/dz · dz/dx and dL/dy = dL/dz · dz/dy, as shown in the figure.
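The steps above can be sketched in a few lines of code. This is a minimal illustration, not the figure's exact function: the source does not specify f, so a single multiply gate f(x, y) = x * y is assumed here as a common teaching example.

```python
# A single "multiply gate", assuming f(x, y) = x * y
# (the actual f in the figure is not specified in the article).

def forward(x, y):
    # Forward pass: compute the output z = f(x, y)
    return x * y

def backward(x, y, dL_dz):
    # Backward pass: given dL/dz flowing in from the loss,
    # apply the chain rule to each input:
    #   dL/dx = dL/dz * dz/dx = dL/dz * y
    #   dL/dy = dL/dz * dz/dy = dL/dz * x
    dL_dx = dL_dz * y
    dL_dy = dL_dz * x
    return dL_dx, dL_dy

x, y = 3.0, -4.0
z = forward(x, y)                      # z = -12.0
dL_dx, dL_dy = backward(x, y, dL_dz=2.0)   # dL/dx = -8.0, dL/dy = 6.0
```

Note that each gate only needs its local inputs (x, y) and the incoming gradient dL/dz; chaining such gates together is exactly how backpropagation works through a deeper network.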

I give a more detailed explanation in the full article.

Read full article here




© 2021 TechTarget, Inc.
