You (don't) know backpropagation

by Alberto Massidda

Topic: AI/Machine Learning
Language: English
Description

This webinar aims to shed some light on the internals of the most fundamental phase of neural network training: backpropagation. Almost everybody knows that, in practice, we compare the network's output with the expected result and adjust the weights according to the error. But fewer know exactly what happens under the hood. We'll talk about computational graphs, what a gradient is, and how the architecture of the network changes the flow of gradients.
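To make this concrete, below is a minimal, self-contained Python sketch of backpropagation on a tiny computational graph: a single neuron z = w*x + b trained with a squared-error loss. The model, variable names, and numbers are illustrative assumptions, not material from the webinar; the point is that the backward pass is nothing more than the chain rule applied node by node.

# Minimal backpropagation sketch on a tiny computational graph.
# Model: z = w*x + b, loss = (z - y)^2. Names and values are illustrative.

x, y = 2.0, 5.0          # input and target
w, b = 1.5, 0.5          # trainable parameters

# Forward pass: evaluate the graph node by node.
z    = w * x + b         # linear node
diff = z - y             # error node
loss = diff ** 2         # squared-error node

# Backward pass: the chain rule, flowing gradients from the loss
# back to the leaves. Each node multiplies the incoming gradient
# by its own local derivative.
dloss_ddiff = 2 * diff               # d(diff^2)/d(diff) = 2*diff
dloss_dz    = dloss_ddiff * 1.0      # d(z - y)/dz = 1
dloss_dw    = dloss_dz * x           # dz/dw = x
dloss_db    = dloss_dz * 1.0         # dz/db = 1

# Gradient step: nudge each parameter against its gradient.
lr = 0.1
w -= lr * dloss_dw
b -= lr * dloss_db

print(f"loss={loss:.3f}  dL/dw={dloss_dw:.3f}  dL/db={dloss_db:.3f}")

With these particular numbers a single step drives the loss from 2.25 to 0: the gradient with respect to each parameter tells us exactly how much, and in which direction, that parameter contributed to the error.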

Alberto Massidda
End-to-end Data Scientist, Sourcesense

Computer engineer with 11 years of experience, specialized in mission-critical, high-traffic, highly available Linux architectures and infrastructures (from before the cloud era), with relevant experience in developing and managing web services. He has served as Infrastructure Lead at 4 companies (Translated, N26, Wanderio, Klar) and participated in 2 multimillion-euro EU-funded NLP research projects (MateCAT, ModernMT). Alberto has a varied background that ranges from DevOps to machine learning, and from corporate banking to the fast-moving startup world.