You (don't) know backpropagation

by Alberto Massidda

Topic: AI/Machine Learning
Language: English
Description

This webinar aims to shed some light on the internals of the most fundamental phase of neural network training: backpropagation. Almost everybody knows that, in practice, we compare the network's output with the expected result and adjust the weights according to the error. But fewer know exactly what happens under the hood. We'll talk about computational graphs, what a gradient is, and how the architecture of the network changes the flow of gradients.
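To make the topic concrete, here is a minimal sketch (not the speaker's material) of what backpropagation does on a tiny computational graph: a forward pass builds intermediate values, then a backward pass applies the chain rule node by node, in reverse order. The variable names and the toy loss are illustrative assumptions.

```python
# Tiny computational graph: loss = (w*x + b - t)^2
x, t = 2.0, 1.0          # input and target (arbitrary toy values)
w, b = 0.5, 0.1          # trainable parameters

# forward pass: each line is one node in the graph
y = w * x + b            # linear node
e = y - t                # error node
loss = e * e             # squared-error node

# backward pass: propagate d(loss)/d(node) in reverse order (chain rule)
dloss = 1.0              # d(loss)/d(loss)
de = 2 * e * dloss       # loss = e^2       -> d(loss)/de = 2e
dy = de                  # e = y - t        -> de/dy = 1
dw = dy * x              # y = w*x + b      -> dy/dw = x
db = dy                  #                     dy/db = 1

# sanity check against a numerical gradient (finite differences)
eps = 1e-6
num_dw = (((w + eps) * x + b - t) ** 2
          - ((w - eps) * x + b - t) ** 2) / (2 * eps)
print(dw, num_dw)        # the analytic and numerical gradients should agree
```

An autograd engine automates exactly this bookkeeping: it records the graph during the forward pass and walks it backwards, which is why the network's architecture dictates how gradients flow.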

Alberto Massidda
Production Engineer, Meta

Computer engineer since 2008, specialized in mission-critical, high-traffic, highly available Linux architectures and infrastructures (before the cloud era), with relevant experience in the development and management of web services. Infrastructure Lead, SRE, AI researcher, university Teaching Assistant, and open-source developer, he has worked at Translated, N26, and Meta, among others. Alberto has a varied range of experience, from DevOps to machine learning, and from corporate banking to the ever-changing startup world.