
Learner Reviews & Feedback for Natural Language Processing with Classification and Vector Spaces by DeepLearning.AI

4.6 stars · 4,441 ratings

About the Course

In Course 1 of the Natural Language Processing Specialization, you will: a) perform sentiment analysis of tweets using logistic regression and then naïve Bayes, b) use vector space models to discover relationships between words, and use PCA to reduce the dimensionality of the vector space and visualize those relationships, and c) write a simple English-to-French translation algorithm using pre-computed word embeddings and locality-sensitive hashing to relate words via approximate k-nearest-neighbor search.

By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot!

This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and the co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper....
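As a rough, hypothetical sketch of the translation idea in point (c) above (invented toy data, and a NumPy least-squares solution standing in for the course's gradient-descent approach), translation can be framed as learning a linear map between pre-computed embeddings and then doing a nearest-neighbour lookup:

```python
import numpy as np

# Hypothetical toy data: each row is a pre-computed word embedding, and row i
# of X (English) is a known translation of row i of Y (French).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 300))   # English word embeddings
Y = rng.normal(size=(1000, 300))   # French embeddings of their translations

# Learn a linear map R that minimises ||XR - Y||_F (least-squares solution;
# the course itself works with the Frobenius-norm objective directly).
R, *_ = np.linalg.lstsq(X, Y, rcond=None)

def translate(eng_vec, french_vocab):
    """Map an English vector into French space and return the index of the
    nearest French word by cosine similarity (exact, not LSH-approximate)."""
    v = eng_vec @ R
    sims = (french_vocab @ v) / (
        np.linalg.norm(french_vocab, axis=1) * np.linalg.norm(v) + 1e-9
    )
    return int(np.argmax(sims))

print(translate(X[0], Y))   # index of the predicted French translation of word 0
```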

Top reviews

SJ

Jul 17, 2020

One of the best introductions to the fundamentals of NLP. It's not just deep learning; the fundamentals are really important for knowing how things evolved over time. Literally the best NLP introduction ever.

MN

May 24, 2021

Great course.

Very few courses implement algorithms like KNN, logistic regression, and Naive Bayes right from scratch, and it also gives you a thorough understanding of NumPy and Matplotlib.


576 - 600 of 876 Reviews for Natural Language Processing with Classification and Vector Spaces

By Zoizou A

Oct 25, 2020

amazing

By Muhammad A B

Oct 1, 2020

perfect

By Mohamed S

Sep 8, 2020

PERFECT

By beomseok l

Jan 8, 2024

Great!

By WLSC

Mar 1, 2023

great!

By Thành H Đ T

Oct 14, 2021

thanks

By Prateek S P

Jan 17, 2021

thanks

By Jeff D

Nov 8, 2020

Thanks

By Rafael C F d A

Sep 28, 2020

Great!

By Kamlesh C

Aug 30, 2020

Thanks

By Qamar A

Aug 5, 2020

Cool!!

By ilham k

Aug 16, 2023

Good

By Mahesh

Apr 17, 2023

fghrt

By Hemchand C

Mar 11, 2023

.....

By B21DCCN436 N Q H

Feb 14, 2023

Great

By Prins K

Jul 28, 2021

Great

By 克軒廖

Feb 5, 2021

Nice!

By Efstathios C

Jul 16, 2024

Good

By 刘世壮

Dec 4, 2021

good

By GANNA H

Aug 4, 2021

good

By Ranjeet K

Mar 14, 2023

no

By Abhinav S

May 2, 2022

bk

By Dave J

Jan 1, 2021

Having previously completed the Deep Learning Specialization, I came to this course with the intention of completing the whole NLP specialization, rather than because I was especially interested in the content of this first course from that specialization.

The Deep Learning Specialization sets a high standard of teaching quality, and I have to say this course is not quite up to the same standard. It's pretty good, but not as good. The instructors are very knowledgeable; they make the effort to explain each topic clearly and do a pretty good job of that.

What I felt could be improved is the context for where each topic fits into the broader picture of both the theory and the current practice of NLP. I was often left wondering: why are we spending time on this particular topic? Is this technique used in current practice, or is it just of didactic or historical interest? Great teachers always have the broader context in mind and make sure that students see how everything fits into the bigger picture and why it is worth studying.

Although techniques were clearly explained, I felt that the underlying concepts were sometimes less well explained. An example is vector representations of words: we were shown the use of vector arithmetic to find analogies, but without much explanation of how this is possible. To me, this was the wrong way around: it makes more sense to first build an understanding of the representations, then introduce the remarkable result that these representations allow finding analogies.
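To make the analogy point concrete, here is a tiny illustrative sketch with invented 2-D vectors (real embeddings are learned from large corpora and have hundreds of dimensions; these numbers are made up purely for illustration):

```python
import numpy as np

# Invented toy embeddings for illustration only.
vec = {
    "king":  np.array([0.80, 0.90]),
    "man":   np.array([0.70, 0.20]),
    "woman": np.array([0.20, 0.25]),
    "queen": np.array([0.30, 0.95]),
}

# "king" - "man" + "woman" should land near "queen".
target = vec["king"] - vec["man"] + vec["woman"]

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

best = max(vec, key=lambda w: cosine(vec[w], target))
print(best)   # -> "queen" with these made-up numbers
```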

In this course, sentences are represented as a "bag of words". This is processing natural language the way a food processor processes food: chopping it up into a word soup. Since one of the most fundamental aspects of language is its structure, this might seem a hopeless approach. However, it gives surprisingly good results for some simple tasks such as classifying tweets as having positive or negative sentiment. If you've done course 5 of the Deep Learning Specialization (Sequence Models), this will feel like a step backwards. There's no deep learning in this course. But I signed up for the course knowing that, so I can't criticise it on that basis. I'm taking the view that this course lays the foundations for more advanced and current topics in the subsequent courses in the specialization, and I look forward to getting onto those.
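For readers new to the idea, here is a minimal, hypothetical sketch of bag-of-words sentiment classification on made-up tweets; scikit-learn is used as a stand-in, whereas the course itself implements the features and logistic regression from scratch:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Made-up toy tweets for illustration only.
tweets = ["I love this movie", "great fun happy day",
          "I hate this", "terrible awful sad day"]
labels = [1, 1, 0, 0]   # 1 = positive, 0 = negative

# Bag of words: each tweet becomes a vector of word counts, ignoring order.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(tweets)

clf = LogisticRegression().fit(X, labels)
print(clf.predict(vectorizer.transform(["what a great happy movie"])))  # [1]
```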

The labs and assignments generally work smoothly. There are a few inconsistencies, and a couple of the hints were a bit misleading, but overall they were fine. It's a bit paint-by-numbers, though: you fill in bits of code within functions rather than working out for yourself how to structure the code.

By Kaiquan M

Jan 22, 2022

This "Natural Language Processing with Classification and Vector Spaces" course covers: - Logistic regression - Laplacian smoothing, log likelihood, naive bayes models to predict sentiment of tweets - Euclidean distance, cosine similarity between word vectors to understand relationship between sets of text, and Principal Component Analysis - Language translation using rotation matrices, k-nearest neighbours and locality sensitive hashing The course has weekly lecture videos and has a summary reading after almost every video, which was especially helpful when trying to understand the concepts discussed in a video as a whole. There are also shorter labs to familiarise you with NLP concepts before the weekly graded programming assignment. Be sure to walk through and understand how the functions in utils_%.py accompanying each lab work. Similarly, walk through the functions in utils_%.py and how unit test cases are prepared in unittest.py accompanying each assignment. A good part of this course has been that the course team periodically releases new versions of the labs and assignments containing fixes or new approaches. Therefore bugs discovered by users in your assignment 3 months ago could already be fixed by the time you work on your assignment. The downside to the course is that the discussion forums were not actively monitored. Therefore there are some questions I have on certain concepts which were not answered by the time I completed the course.