Please write about a deep learning expert in your README.md.
He or she can be a professor (e.g., Yann LeCun), a Ph.D. student (e.g., Joseph Chet Redmon), a hacker (e.g., Flood Sung), a researcher (e.g., John Schulman), an engineer (e.g., Soumith Chintala), an entrepreneur (e.g., Matthew Zeiler), etc.
To avoid duplicate write-ups, please record the person's name in
https://docs.google.com/spreadsheets/d/153XruMO7DPONzBTkxh8ZoYSto1E_2zO021vs0prWZ_Q/edit?usp=sharing
First come, first served!
Sepp Hochreiter (born Josef Hochreiter in 1967) is a German computer scientist.
Since 2006 he has been head of the Institute of Bioinformatics at the Johannes Kepler University of Linz.
He previously held positions at the Technical University of Berlin, the University of Colorado at Boulder, and the Technical University of Munich.
Sepp Hochreiter has made numerous contributions in the fields of machine learning and bioinformatics.
He launched the Bioinformatics Working Group at the Austrian Computer Society.
He is a founding board member of several bioinformatics start-up companies.
He was program chair of the conference Bioinformatics Research and Development.
He is a conference chair of the conference Critical Assessment of Massive Data Analysis.
He is editor, program committee member, and reviewer for international journals and conferences.
Sepp Hochreiter developed the long short-term memory (LSTM) for which the first results were reported in his diploma thesis in 1991.
LSTM overcomes the tendency of recurrent neural networks (RNNs) and deep networks to forget information over time or across layers (the vanishing and exploding gradient problem).
An LSTM learns from training sequences how to process new sequences, producing either a single output or an output sequence.
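To make the mechanism concrete, here is a minimal sketch of one step of a standard LSTM cell in NumPy. The variable names and weight layout (a single stacked matrix `W` holding the four gates) are illustrative choices, not taken from Hochreiter's original formulation; the key point is the additive cell-state update `c = f * c_prev + i * g`, which lets gradients flow across many time steps without vanishing.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step.

    x: input vector, h_prev/c_prev: previous hidden and cell state,
    W: stacked gate weights of shape (4*H, D+H), b: bias of shape (4*H,).
    """
    H = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    f = sigmoid(z[0:H])        # forget gate: what to keep from c_prev
    i = sigmoid(z[H:2*H])      # input gate: how much new content to write
    o = sigmoid(z[2*H:3*H])    # output gate: what to expose as h
    g = np.tanh(z[3*H:4*H])    # candidate cell content
    c = f * c_prev + i * g     # additive update -> stable gradient flow
    h = o * np.tanh(c)         # new hidden state
    return h, c

# toy usage: hidden size 3, input size 2
rng = np.random.default_rng(0)
W = rng.standard_normal((12, 5)) * 0.1
b = np.zeros(12)
h, c = lstm_step(rng.standard_normal(2), np.zeros(3), np.zeros(3), W, b)
```
Unrolling `lstm_step` over a sequence (feeding each `h, c` into the next call) gives the full recurrent network.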
LSTM networks have solved numerous tasks in automatic music composition, machine translation, speech recognition, reinforcement learning, and robotics.
LSTM networks are used in Google Voice transcription, Google voice search, and Google's Allo as the core technology for voice searches and commands in the Google App (on Android and iOS), and for dictation on Android devices. Apple has also used LSTM since iOS 10 in its "QuickType" function.
Sepp Hochreiter's group introduced "exponential linear units" (ELUs) which speed up learning in deep neural networks and lead to higher classification accuracies. Like rectified linear units (ReLUs), leaky ReLUs (LReLUs), and parametrized ReLUs (PReLUs), ELUs alleviate the vanishing gradient problem via the identity for positive values. However, ELUs have improved learning characteristics compared to ReLUs, due to negative values which push mean unit activations closer to zero. Mean shifts toward zero speed up learning by bringing the normal gradient closer to the unit natural gradient because of a reduced bias shift effect.