
Learner Reviews & Feedback for Generative Pre-trained Transformers (GPT) by University of Glasgow

4.5 stars · 29 ratings

About the Course

Large language models such as GPT-3.5, which powers ChatGPT, are changing how humans interact with computers and how computers can process text. This course will introduce the fundamental ideas of natural language processing and language modelling that underpin these large language models. We will explore the basics of how language models work and the specifics of how newer neural-based approaches are built. We will examine the key innovations that have enabled Transformer-based large language models to become dominant in solving various language tasks. Finally, we will consider the challenges in applying these large language models to various problems, including the ethical problems involved in their construction and use. Through hands-on labs, we will learn about the building blocks of Transformers and apply them to generate new text. These Python exercises step you through the process of applying a smaller language model and understanding how it can be evaluated and applied to various problems. Regular practice quizzes will help reinforce the knowledge and prepare you for the graded assessments.
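The labs themselves are not reproduced on this page, but as a rough illustration of the kind of exercise the description mentions (applying a smaller language model and evaluating it), here is a minimal sketch using the Hugging Face transformers library with the small GPT-2 model. The model choice and the code are assumptions for illustration, not the course's actual lab materials.

    # Minimal sketch: generate text with a small causal language model and
    # evaluate it with perplexity. Assumes `pip install transformers torch`.
    # GPT-2 is an assumed stand-in, not necessarily the model used in the labs.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")
    model.eval()

    # Apply the model: sample a continuation of a prompt.
    prompt = "The Transformer architecture"
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(
        **inputs,
        max_new_tokens=30,
        do_sample=True,
        top_k=50,
        pad_token_id=tokenizer.eos_token_id,
    )
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))

    # Evaluate the model: perplexity of a piece of text, i.e. the exponential
    # of the average cross-entropy loss over its tokens.
    text = "Language models assign probabilities to sequences of words."
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        loss = model(**enc, labels=enc["input_ids"]).loss
    print(f"perplexity: {torch.exp(loss).item():.2f}")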

Top reviews

RH • Jan 20, 2024

I liked the course; it was informative, with a few coding assignments. The coding assignments could be a bit more in-depth.

CP • Feb 28, 2024

Great overview of GPT, with some labs and very recent information. Prior deep learning training is recommended.


1 - 9 of 9 Reviews for Generative Pre-trained Transformers (GPT)

By Joris C • Dec 21, 2023

Good: tough tests.
Bad:
- course materials are not available as PDF
- nothing on algorithmic efficiency, scaling laws, or emergence
Ugly:
- Week 1: it isn't stated that the course focuses on causal language models; the 2nd exercise lacks a method signature; smoothing is not explained
- Week 2: the limitation of perplexity to causal language models is not explained (cf. https://huggingface.co/docs/transformers/perplexity)
- Week 3: poor sound quality; a free article (https://arstechnica.com/tech-policy/2023/06/lawyers-have-real-bad-day-in-court-after-citing-fake-cases-made-up-by-chatgpt/) could be used instead of the NYT one; RLHF follows fine-tuning

By Anders L C • Oct 5, 2023

A really good course, especially the first two weeks. The third week is also okay, but completely different and not very technical. The course is a bit rough around the edges: a few quiz questions are ambiguously formulated and the Python notebooks sometimes have typos. Still, a good course to learn about language models and to get some hands-on experience.

By Milad A • Nov 10, 2023

I want to gain hands-on experience with GPT rather than learn the things mentioned in Module 3. The whole course is a waste of time! I'd rather take other courses!

By Daniel M R • May 14, 2024

Good course overall. The video explanations could be a little more in-depth. I also think they could add more labs rather than dedicating the entire last week to non-technical questions, or perhaps add one more week.

By Roel H • Jan 21, 2024

I liked the course; it was informative, with a few coding assignments. The coding assignments could be a bit more in-depth.

By christophe p • Feb 29, 2024

Great overview of GPT, with some labs and very recent information. Prior deep learning training is recommended.

By Sabri D • Dec 3, 2023

Excellent!!!

By marco s • Mar 9, 2024

Get into more detail on Transformer models.

By John D • Oct 28, 2024

Lots of gaps and code mistakes, and little to no discussion.