State-of-the-art deep learning for AI: Transformers
Transformers are the Swiss Army knife of deep learning. They revolutionized Natural Language Processing (NLP), giving rise to models like BERT and GPT-3, which outperformed previous models on a host of tasks. Imagine an AI writing an essay, answering questions, or even composing poetry: Transformers made that possible. And they are not limited to NLP; they are now applied to computer vision and other data types as well, making them a remarkably versatile tool in the AI toolbox.
Practical information:
Registration
- Prerequisites: basic knowledge of NLP
- Price: determined upon registration
In this course, we give you a clear understanding of how language models have developed and of the essential role neural networks play in this task. Starting from simple feedforward and recurrent neural network frameworks, we trace the developments over time and dive deeper into more complex architectures such as encoder-decoder structures and Transformers, the state-of-the-art technique underlying many of today's AI applications. Topics such as positional encoding, word embeddings, and multi-head attention mechanisms will be covered during this course.
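To give a flavour of the material, the multi-head attention mechanism mentioned above is built from scaled dot-product attention, and word order is injected via positional encodings. A minimal NumPy sketch of both ideas (illustrative only, not course code; the array shapes and toy data are assumptions):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # Numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output is a weighted sum of the values

def sinusoidal_positional_encoding(seq_len, d_model):
    """Fixed sine/cosine positional encodings, one row per position."""
    pos = np.arange(seq_len)[:, None]
    i = np.arange(d_model)[None, :]
    angles = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    # Even dimensions get sin, odd dimensions get cos
    return np.where(i % 2 == 0, np.sin(angles), np.cos(angles))

# Toy example: 4 tokens with model dimension 8
x = np.random.default_rng(0).normal(size=(4, 8))
x = x + sinusoidal_positional_encoding(4, 8)  # inject word-order information
out = scaled_dot_product_attention(x, x, x)   # self-attention: Q = K = V = x
print(out.shape)  # (4, 8)
```

A multi-head version simply runs several such attentions in parallel on learned linear projections of the input and concatenates the results.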
Instructor/speaker
Martial Luyts
I am currently active as a postdoctoral associate in statistics at LSTAT, and as FLAMES coordinator and trainer at KU Leuven.
I also gained professional experience working as a data scientist/data analyst for Keyrus NV, a global leader in consulting and solutions integration for business intelligence and performance management. There, I was involved in various large data science and BI projects, including text mining, web scraping, and data warehousing.