You (don't) know backpropagation
This webinar aims to shed some light on the internals of the most fundamental phase of neural network training: backpropagation. Almost everyone knows that, in practice, we compare the network's output with the expected result and adjust the weights based on the error. But fewer know exactly what happens under the hood. We'll talk about computational graphs, what a gradient is, and how the architecture of the network shapes the flow of gradients.
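As a taste of what the webinar covers, here is a minimal sketch of backpropagation on a tiny computational graph with a single weight. All names (`forward`, `backward`, the toy values) are illustrative, not part of any library; the point is that each node contributes a local gradient, and the chain rule multiplies them as the gradient flows backward.

```python
def forward(w, x, y):
    """Forward pass through a two-node graph: linear node, then squared-error loss."""
    y_hat = w * x              # linear node
    loss = (y_hat - y) ** 2    # squared-error node
    return y_hat, loss

def backward(w, x, y):
    """Backward pass: multiply local gradients along the graph (chain rule)."""
    y_hat, _ = forward(w, x, y)
    dloss_dyhat = 2 * (y_hat - y)   # local gradient of the loss node
    dyhat_dw = x                    # local gradient of the linear node
    return dloss_dyhat * dyhat_dw   # gradient of the loss w.r.t. the weight

# One gradient-descent step with toy values
w, x, y, lr = 0.5, 2.0, 3.0, 0.1
grad = backward(w, x, y)            # -8.0 for these values
w = w - lr * grad                   # weight moves toward reducing the loss
```

Running one such step here reduces the loss from 4.0 to 0.16; stacking many nodes (layers) just means more local gradients multiplied along the path, which is exactly why architecture changes how gradients flow.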