Natural Language Processing acceleration: foundations and applications
Two complementary courses on Deep Learning and NLP
Twice a year, winter and summer, in beautiful San Sebastian, Basque Country
Next edition: July 2023, fully online (including tutored labs)
Deep Learning neural network models have been successfully applied to natural language processing and are now radically changing how we interact with machines (Siri, Alexa, machine translation, GPT-4 or Bing Chat, to name a few). Large Language Models are at the core of these developments and are also being used to crack "languages" in other disciplines, ranging from programming languages (Copilot) to proteins (AlphaFold) and gene sequences (GenSLM).
You can take either one or both courses:
Deep Learning for NLP
(July 10th to 14th, 20 hours, 5 afternoons. 12th edition):
This course introduces in detail the machinery that makes Deep Learning work for NLP, including the latest transformers and large language models such as GPT, BERT and T5. It also covers the use of prompts for zero-shot and few-shot learning, instruction learning and human feedback. The winter edition additionally covers multimodal text-image models such as DALL-E. The course combines theoretical classes with practical hands-on sessions: attendees will come to understand the internal workings of the models and implement them in TensorFlow in the tutored labs. The summer version includes higher-level labs using Keras, while the winter version has attendees program the inner workings themselves. The aim is for attendees to acquire the ability to understand, modify and apply current and future Deep Learning models to NLP and other areas.
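To give a flavor of the "inner workings" the labs dig into: the central operation of a transformer is scaled dot-product self-attention. The following is a minimal NumPy sketch of that operation (an illustrative toy, not the course's actual lab code, which uses TensorFlow/Keras):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V, the core transformer operation."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # similarity of each query to each key
    scores -= scores.max(axis=-1, keepdims=True)  # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the key positions
    return weights @ V, weights

# Toy example: 3 token positions, hidden dimension 4.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out, attn = scaled_dot_product_attention(X, X, X)  # self-attention: Q = K = V
print(out.shape)          # (3, 4): one contextualized vector per token
print(attn.sum(axis=-1))  # each attention row sums to 1
```

In a real transformer, Q, K and V are separate learned projections of the input, and several such attention "heads" run in parallel; the labs build up from this primitive.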
Introduction to LT Applications
(July 17th to 21st, 20 hours, 5 afternoons. 5th edition):
This course introduces the most commonly used techniques for building applications based on Language Technology. Attendees will learn to apply techniques such as document classification and sequence labeling, as well as vector-based word representations (embeddings) and pretrained language models, to core applications such as Opinion Mining, Named Entity Recognition, Fake News Detection and Question Answering. The course has a practical focus, with attendees learning to use readily available LT toolkits (spaCy, Flair, Hugging Face Transformers). The aim is for attendees to acquire the autonomy required to solve practical problems by applying and developing Language Technology applications.
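As a taste of the document-classification technique mentioned above, here is a self-contained Naive Bayes sentiment classifier in plain Python, trained on an invented four-sentence corpus (the course labs use the toolkits listed above rather than this toy sketch):

```python
from collections import Counter, defaultdict
import math

# Tiny invented training corpus: (text, label) pairs.
train = [
    ("the film was wonderful and moving", "pos"),
    ("a truly great and enjoyable movie", "pos"),
    ("boring plot and terrible acting", "neg"),
    ("an awful waste of time", "neg"),
]

# Count word occurrences per label (multinomial Naive Bayes).
word_counts = defaultdict(Counter)
label_counts = Counter()
for text, label in train:
    label_counts[label] += 1
    word_counts[label].update(text.split())

vocab = {w for counts in word_counts.values() for w in counts}

def classify(text):
    """Pick the label maximizing log P(label) + sum log P(word|label),
    with add-one smoothing over the vocabulary."""
    scores = {}
    for label in label_counts:
        total = sum(word_counts[label].values())
        score = math.log(label_counts[label] / sum(label_counts.values()))
        for w in text.split():
            score += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
        scores[label] = score
    return max(scores, key=scores.get)

print(classify("a wonderful movie"))    # pos
print(classify("terrible and boring"))  # neg
```

Modern toolkit-based pipelines replace the hand-built counts with pretrained embeddings or fine-tuned language models, but the classification framing is the same.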
Registration must be completed separately for each course; see the respective websites. Note that you only need to pay the insurance fee for one of the courses.