

Backpropagation: How to roll a ball down a hill

This talk introduces the concepts behind artificial neural networks. We walk through the move from perceptrons to sigmoid neurons, as well as the idea behind feedforward neural networks. Gradient descent and stochastic gradient descent are used to explain the need for the backpropagation algorithm, which lets us quickly calculate the gradient of the cost function we use to measure how accurate our neural net's predictions are.
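The "ball rolling down a hill" picture can be made concrete with a minimal sketch (not from the talk itself): a single sigmoid neuron trained by gradient descent to model the OR function. The gradient of the quadratic cost is computed analytically, which is the one-layer special case of backpropagation; the learning rate `eta`, the OR dataset, and the iteration count are all illustrative choices.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy dataset: the OR function (linearly separable, so one neuron suffices).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 1.0])

rng = np.random.default_rng(0)
w = rng.normal(size=2)  # weights
b = 0.0                 # bias
eta = 2.0               # learning rate: how big a step we take downhill

for _ in range(5000):
    a = sigmoid(X @ w + b)           # neuron's predictions
    # dC/dz for the quadratic cost C = mean (a - y)^2 / 2,
    # using sigmoid'(z) = a * (1 - a) -- the chain rule at work.
    delta = (a - y) * a * (1 - a)
    # Gradient descent update: move parameters against the gradient.
    w -= eta * (X.T @ delta) / len(y)
    b -= eta * delta.mean()

preds = sigmoid(X @ w + b)
```

In a real network the same chain-rule step is applied layer by layer, backwards from the output; that propagation of `delta` through the layers is what gives backpropagation its name.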

Project Members: Chris Manahan
