JH
Oct 4, 2020
Could the instructors maybe make a video explaining the ungraded labs? That would be useful. Other students also find it difficult to understand both of the LSH attention layer ungraded labs. Thanks.
LL
Jun 22, 2021
This course is brilliant; it covers SOTA models such as the Transformer and BERT. It would be better to have a capstone project. And the entire projects can be downloaded easily.
By George L
•Apr 11, 2021
Younes is a bad teacher. He may have good technical chops, but teaching is a different skill altogether. Overall, the NLP Specialization's design is much, much worse than that of the DL Specialization. On one hand, you are taught a lot of material that is deep but covered only cursorily; on the other hand, the exercises are either too difficult for you to get any clue, or, most of the time, actually so simple that you only need to enter simple parameters, so you cannot really learn anything! I really don't know why so many people are giving 5-star ratings!
By Rishabh S
•Sep 18, 2021
The course is very research-oriented and not very useful for data science practitioners. No time was spent on explaining how transformers can be used for NLP tasks with a small domain- or company-specific corpus through transfer learning. I'm not planning to develop the next blockbuster NN architecture for NLP, so the intricate details of how the Transformer and Reformer work seemed like overkill. Lastly, using Trax instead of more production-ready frameworks like TensorFlow also made it feel very research-focused.
By JL B
•Nov 8, 2020
Maybe it's my fault, but at some point in these courses I got lost in the logic and the whys of the network constructions. I managed the assignments because, for some of them, to pass you only need to know how to copy and paste.
But I recognize the great value of the material; I think I'll need to revisit it and spend more time on the optional readings.
Still, overall a great specialization. Thanks to everyone involved in these courses!
By Zephyr F
•Sep 27, 2022
The deep-learning framework Trax used in this course only adds unnecessary difficulty to finishing the assignments. I don't understand why they did not use more common frameworks such as PyTorch and TensorFlow. It also seems that the instructor only read from the script while presenting the slides. For example, there was an obvious error on the transformer decoder slide, and the instructor did not correct it.
By Israel T
•Oct 7, 2020
Very educational! I learned a lot about the different NLP models. However, it seems like weeks 3 and 4 were rushed. Also, some of the items (e.g. what each layer does and why we need that layer) were not properly explained. Other than that, this is a good course for getting a general overview of some of the state-of-the-art NLP models.
By Mark L
•Oct 2, 2021
(1) Please consider switching from Trax to TensorFlow. (2) It would be helpful to go over the concepts of Transformers in more detail, particularly some explanation of why Q, K and V are called what they are. (3) Not a problem of the course itself, but it would be helpful if the Trax documentation were more complete.
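For readers puzzled by the same point about Q, K and V: here is a minimal NumPy sketch of scaled dot-product attention (illustrative only, not taken from the course's Trax assignments). Each "query" row scores every "key" row, and the softmaxed scores weight the "value" rows, which is where the names come from.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Queries score every key; a softmax over the scores yields weights
    that mix the value rows into the output."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # (n_q, n_k) similarities
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the keys
    return weights @ V                               # weighted sum of values

Q = np.random.randn(2, 4)  # queries: what each position is looking for
K = np.random.randn(3, 4)  # keys: what each position can be matched on
V = np.random.randn(3, 4)  # values: the content that actually gets retrieved
print(scaled_dot_product_attention(Q, K, V).shape)   # (2, 4)
```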
By Felix M
•Apr 11, 2021
The classes originally taught by Andrew were, for me, much better. Many of the explanations in this course were not very clear and were superficial, as I see it.
By Damian S
•Feb 24, 2022
Course content is fantastic, but the assignments are ridiculous -- they test how well you can read directions, not how well you understand the content.
By Haoyu R
•Oct 2, 2020
Not detailed enough. The quality of the course is very good at the start but decreases as the topics go deeper.
By Kévin S
•Feb 15, 2022
Looks like a commercial ad for Trax. I don't know if I will be able to re-implement this in another framework.
By Chenjie Y
•Nov 18, 2020
I think the last course is a bit rushed... Many concepts are not natural and cannot be explained in one or two sentences. Compared to the previous courses in the specialisation, which really explain concepts and intuitions in detail, this last course is a bit too rough. I would rather spend another month studying the material across two courses instead of staying up late reading papers and blogs to understand what was not explained clearly in the course. Also, I see that Trax is a good library, but I think it is not yet mature, and I really wish all the assignments had TensorFlow versions so students could choose.
By Darren
•Feb 7, 2022
The general content is good, but there are so many inconsistencies and missing pieces of information in the material. Terms are poorly defined and used inconsistently. A lot of information about "why" certain things are the way they are in the programming assignments is missing -- you just "do it" without understanding it. Also, the instructors have abandoned the course forums. There are lots of questions about content in the discussion forums, but none of the content creators are helping answer them. We're just left to fend for ourselves. Not worth the money. Just watch the videos.
By Hùng N T
•Feb 26, 2024
Everything was good except that this course uses Trax. This framework has not had any new releases since 2021, and I cannot manage to train deep learning models using Trax on my GPU; it is not even possible in Colab. Trax is also very buggy, and it does not have a large community to help. Recommendation for learners: take the course after it is fully rebooted to TensorFlow, unless you want to take other courses to get useful/working code for NLP.
By Lucky S
•Feb 24, 2022
This course is the weakest of the specialization.
Courses 1-3 were very strong and solid, but Course 4 feels very rushed. The curriculum is very hard to follow, let alone understand. The labs weren't commented enough to give us a proper explanation (especially week 4). There are a lot of concepts that aren't explained at the length they should be.
By Dimitry I
•Apr 17, 2021
Material coverage is very superficial. Do not expect to fully understand or be able to work with Attention models after doing this course.
Sadly, these types of courses and their fake near 5-star reviews are destroying Coursera.
By David M
•Feb 22, 2021
Unfortunately, the classes are given at a very primitive level without explaining what exactly attention models do. The programming exercises were not explained well, either.
By Yuri C
•Jan 6, 2021
The last course in the NLP Specialization is intense! Already in the first week the learner is put through their tensor algebra baptism, and it goes even deeper while building the locality-sensitive hashing inner workings. I am very grateful to the team for having put so much effort into teaching us how attention works and how to improve on it by building the Reformer model. The opportunity to get this material from some of the developers of the model is priceless! Thank you for that! Surely, in everyday NLP one mostly uses the layers provided by Trax directly. But understanding the drawbacks and the ideas behind these models is indeed the unique selling proposition of this whole course. The provided infographics are deeply helpful for understanding what goes on with the tensors inside the models, and the instructors do their best to introduce those ideas throughout the course. I was also *very* impressed to see how up-to-date all the material of this latest course is! Some of the papers about the models were posted on arXiv only 1-2 years ago. That is very hard to beat in any massive open online course! Thank you very much for providing this to the community at such an accessible price. I will be eagerly waiting for a continuation of this specialization as Advanced NLP!
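For context on the "locality-sensitive hashing inner workings" this review mentions, here is a rough NumPy sketch of the angular-LSH bucketing idea used in the Reformer paper; it is not the course's Trax implementation, and the function name and shapes are made up for illustration.

```python
import numpy as np

def lsh_bucket(x, n_buckets, seed=0):
    """Angular LSH in the style of the Reformer paper: project vectors onto
    random directions and take the argmax over [xR ; -xR] as the bucket id,
    so vectors pointing in similar directions tend to share a bucket."""
    rng = np.random.default_rng(seed)
    R = rng.standard_normal((x.shape[-1], n_buckets // 2))
    rotated = x @ R                                    # (n, n_buckets // 2)
    return np.argmax(np.concatenate([rotated, -rotated], axis=-1), axis=-1)

# Attention is then computed only among positions that land in the same
# bucket, instead of over all pairs of positions.
queries = np.random.randn(8, 16)
print(lsh_bucket(queries, n_buckets=8))                # 8 bucket ids in [0, 8)
```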
By SNEHOTOSH K B
•Nov 21, 2020
The course is a very comprehensive one and covers all the state-of-the-art techniques used in NLP. It's quite an advanced-level course, and good Python coding skills are a must.
By D B
•Jan 25, 2023
I learned a lot from this course, and the ungraded and graded problems are relevant to understanding and knowing how to build a Transformer or a Reformer from scratch.
By satish b
•Jan 1, 2021
One of the best courses I have ever taken. The course provides in-depth learning of Transformers from the creators of the Transformer.
By Jonathan M
•Nov 16, 2020
The course was wonderful, full of up-to-date content and explained in a really good way. Good work!
By Akash M
•Sep 26, 2020
Outstanding course. The course was rigorous.
By Sarkis K
•Apr 17, 2023
The courses have really enlightened me on NLP; I had no idea about the techniques. I'll give it 4 stars because the course instructors lecture in a monotone, as if reading from a teleprompter with a fake synthetic voice. It sometimes gives me a headache, and I end up muting the videos and just reading the subtitles (which a lot of the time don't make sense and are paced too fast, so I have to freeze the screen, open 2 other windows, and read the lower caption text). I have been doing many courses on this platform, and even though the instructors are at the top of their fields, the way they deliver the courses is just "sometimes" (not always) painful. I am sure this is not how they teach their own classes, especially at Stanford. Even though the course is $50 per month, I think it wouldn't cost the instructors much to show some authentic enthusiasm.
By Simon P
•Dec 6, 2020
The course could have been expanded into an entire specialization. There's a little too much information, and the first two assignments are disproportionately long and hard compared with the last two. It is cutting-edge material though, and well worth it.
Slight annoyance at the script reading, which means the videos lack a natural flow and you end up with nonsense sentences like "now we multiply double-u sub en superscript dee by kay sub eye superscript jay to get vee sub eye". Variables such as X_i should be referred to by what they actually represent, not their algebraic representation, because this is not how the brain processes them when they are read from a page.
By Dave J
•May 3, 2021
The content is interesting and current, citing some 2020 papers. I was disappointed by the amount of lecture material - around 40-45 minutes per week in weeks 1-3 and only 20 minutes in week 4, plus two Heroes of NLP interviews. The lectures have the feel of reading from a script rather than engaging with the learner. They're not bad but there's room for improvement. Explanations are usually adequate but some areas could have been explained more clearly.
The programming assignments worked smoothly in my experience, though they're not particularly challenging: largely "painting by numbers".