Tutorial on Deep Learning


Zaikun Xu from the Università della Svizzera Italiana

In this video from the 2016 HPC Advisory Council Switzerland Conference, Zaikun Xu from the Università della Svizzera Italiana presents: Tutorial Part I: Deep Learning.

“In the past decade, deep learning, as a life-changing technology, has achieved huge success on various tasks, including image recognition, speech recognition, machine translation, and more. Pioneered by several research groups, led by Geoffrey Hinton (U Toronto), Yoshua Bengio (U Montreal), Yann LeCun (NYU), and Juergen Schmidhuber (IDSIA, Switzerland), deep learning is a renaissance of neural networks in the Big Data era.

A neural network is a learning algorithm consisting of an input layer, hidden layers, and an output layer, where each circle represents a neuron and each arrow connection is associated with a weight. The network learns from the difference between the output of the output layer and the ground truth: it calculates the gradients of this discrepancy with respect to the weights and adjusts the weights accordingly. Ideally, it finds weights that map input X to target y with error as low as possible.”
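To make the learning loop described above concrete, here is a minimal sketch (not taken from the talk) of a one-hidden-layer network trained by gradient descent on a toy XOR problem. The data, layer sizes, learning rate, and use of NumPy with sigmoid activations and a mean-squared-error loss are all illustrative assumptions, not details from the tutorial:

```python
# Minimal sketch: forward pass, error vs. ground truth, gradients w.r.t. the
# weights, and a gradient-descent weight update. All settings are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: XOR, a small mapping from input X to target y.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 4 neurons; each arrow in the diagram is one weight entry.
W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)
lr = 1.0  # learning rate

for step in range(5000):
    # Forward pass: input layer -> hidden layer -> output layer.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Discrepancy between the network output and the ground truth (MSE).
    loss = np.mean((out - y) ** 2)

    # Backward pass: gradients of the loss w.r.t. each weight (chain rule).
    d_out = 2 * (out - y) / len(X) * out * (1 - out)
    dW2 = h.T @ d_out
    db2 = d_out.sum(axis=0)
    d_h = d_out @ W2.T * h * (1 - h)
    dW1 = X.T @ d_h
    db1 = d_h.sum(axis=0)

    # Adjust the weights in the direction that lowers the error.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("final loss:", loss)
print("predictions:", out.round(2).ravel())
```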

“Deep neural nets are neural networks with many layers. Indeed, the winning solution of the 2015 ImageNet competition features a very deep neural network with 152 layers.”
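"Many layers" simply means the forward pass chains one layer's output into the next many times over. The sketch below illustrates only that chaining; the depth, width, ReLU activation, and weight scaling are arbitrary choices for illustration and are not details of the 152-layer ImageNet model:

```python
# Minimal sketch of depth: a deeper network is just more chained layer steps.
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

depth, width = 20, 64            # e.g. 20 hidden layers of 64 neurons each
layers = [rng.normal(scale=np.sqrt(2.0 / width), size=(width, width))
          for _ in range(depth)]

x = rng.normal(size=(1, width))  # a single input vector
for W in layers:                 # a deeper network means more of these steps
    x = relu(x @ W)
print(x.shape)
```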

In this video, Zaikun Xu from the Università della Svizzera Italiana presents: Tutorial Part 2: Deep Learning.

See more talks in the Swiss Conference Video Gallery

Sign up for our insideHPC Newsletter