Faktion Academy
Transformers and their friends in NLP
This course teaches state-of-the-art techniques for building Natural Language Processing (NLP) models.
Course outline
General concepts
- Tokenisation, parsing, …
- Embeddings and transfer learning
- Labelling data correctly
Deep Learning Architectures
- LSTM
- Contextualised word embeddings with the ELMo model
- Attention mechanisms
- Transformer architecture
- BERT model
- BERT’s derivatives and when to use them
Case studies and exercises
- E-mail automation
- Document information extraction
- NLP for search
Course level
Expert
Prerequisites
- Python programming at the intermediate level
- Knowledge of basic TensorFlow concepts (we use the Keras API)
- Knowledge of basic Machine Learning concepts such as data splitting, classification, overfitting, probabilities, …
Course teachers
Aleksandra Vercauteren, PhD
Head of NLP
Kaja Zupanc, PhD
Senior NLP Engineer
Course fee
EUR 4.500 excl. VAT
Included in course package
- Course material
- Drinks, snacks and lunch
- Cloud servers for use during training
- 4 hours of support and Q&A, available up to 6 months after the course