Deep Learning Seminar  /  13 February 2020

An Introduction to Natural Language Processing (Part II)

Abstract:

[available in English only]

Continuing from last week's lecture, in this talk I will focus on the task of machine translation.
I will start with the original encoder-decoder architectures, highlighting in particular their weaknesses. This will motivate the attention mechanism and the construction of transformer networks. Lastly, I will present the BERT architecture and the state-of-the-art Reformer networks.