Natural Language Processing in Python (NEW for 2025!)
Develop essential programming & development skills with expert instruction and practical examples.
About This Course
This is a practical, hands-on course designed to give you a comprehensive overview of all the essential concepts for modern Natural Language Processing (NLP) in Python. We'll start by reviewing the history and evolution of NLP over the past 70 years, including the most popular architecture at the moment, Transformers. We'll also walk through the initial text preprocessing steps required for modeling, where you'll learn how to clean and normalize data with pandas and spaCy, then vectorize that data into a Document-Term Matrix using both word counts and TF-IDF scores.
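To make the preprocessing and vectorization steps concrete, here is a minimal sketch using pandas and scikit-learn. The toy sentences are invented for illustration, and the course's own cleaning pipeline (including the spaCy steps) may look different; spaCy is left out here so the snippet runs without downloading a language model.

```python
# A minimal sketch of cleaning text with pandas and building a Document-Term Matrix
# from both word counts and TF-IDF scores. The sample texts are made up, and the
# course's actual preprocessing (e.g. spaCy lemmatization) may differ.
import pandas as pd
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

# Toy corpus in a pandas DataFrame
df = pd.DataFrame({"text": [
    "The movie was absolutely wonderful!",
    "What a terrible, boring film...",
    "An okay movie, not great but not awful.",
]})

# Basic cleaning/normalization with pandas: lowercase and strip punctuation
df["clean"] = (df["text"]
               .str.lower()
               .str.replace(r"[^a-z\s]", "", regex=True)
               .str.strip())

# Document-Term Matrix from raw word counts
count_vec = CountVectorizer(stop_words="english")
dtm_counts = count_vec.fit_transform(df["clean"])

# Document-Term Matrix from TF-IDF scores
tfidf_vec = TfidfVectorizer(stop_words="english")
dtm_tfidf = tfidf_vec.fit_transform(df["clean"])

print(pd.DataFrame(dtm_counts.toarray(), columns=count_vec.get_feature_names_out()))
print(pd.DataFrame(dtm_tfidf.toarray().round(2), columns=tfidf_vec.get_feature_names_out()))
```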
After that, the course is split into two parts:
- The first half covers traditional machine learning techniques
- The second half covers modern deep learning and LLM (large language model) approaches

For the traditional NLP applications, we'll begin with Sentiment Analysis to determine the positivity or negativity of text using the VADER library. Then we'll cover Text Classification on labeled data with Naïve Bayes, as well as Topic Modeling on unlabeled data using Non-Negative Matrix Factorization, all with the scikit-learn library. Once you have a solid understanding of the foundational NLP concepts, we'll move on to the second half of the course on modern NLP techniques, which covers the major advancements in NLP and the shift in the data science mindset over the past decade.
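For a feel of what these three traditional applications look like in code, here is a condensed sketch using the vaderSentiment and scikit-learn packages. The tiny corpora, labels, and parameter choices are made up for illustration and are not the course's own datasets.

```python
# A condensed sketch of the three traditional-NLP applications named above:
# VADER sentiment scoring, Naive Bayes text classification, and NMF topic modeling.
# All data below is invented; the course's datasets and settings will differ.
from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.decomposition import NMF

# 1) Sentiment Analysis with VADER (rule-based, no training required)
analyzer = SentimentIntensityAnalyzer()
print(analyzer.polarity_scores("This course is fantastic!"))  # compound > 0 suggests positive

# 2) Text Classification on labeled data with Naive Bayes
train_texts = ["free prize click now", "win money fast",
               "meeting at noon", "project update attached"]
train_labels = ["spam", "spam", "ham", "ham"]
count_vec = CountVectorizer()
X_train = count_vec.fit_transform(train_texts)
clf = MultinomialNB().fit(X_train, train_labels)
print(clf.predict(count_vec.transform(["claim your free money"])))  # likely ['spam']

# 3) Topic Modeling on unlabeled data with Non-Negative Matrix Factorization
docs = ["dogs and cats are pets", "cats chase mice",
        "stocks and bonds are investments", "bond markets fell today"]
tfidf_vec = TfidfVectorizer(stop_words="english")
X = tfidf_vec.fit_transform(docs)
nmf = NMF(n_components=2, random_state=42).fit(X)
terms = tfidf_vec.get_feature_names_out()
for i, topic in enumerate(nmf.components_):
    top_terms = [terms[j] for j in topic.argsort()[-3:][::-1]]
    print(f"Topic {i}: {top_terms}")
```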
We'll start with the basic building blocks of modern NLP techniques: neural networks. You'll learn how neural networks are trained, become familiar with key terms like layers, nodes, weights, and activation functions, and then get introduced to popular deep learning architectures and their practical applications.
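To ground that vocabulary, here is a toy forward pass through a one-hidden-layer network written with NumPy. The layer sizes and random weights are arbitrary assumptions for illustration; a real network learns its weights during training rather than having them fixed by hand.

```python
# A toy forward pass through a one-hidden-layer neural network, just to ground
# the terms above: layers, nodes, weights, and activation functions.
import numpy as np

def relu(z):                        # activation function for the hidden layer
    return np.maximum(0, z)

def sigmoid(z):                     # activation function for the output layer
    return 1 / (1 + np.exp(-z))

rng = np.random.default_rng(0)

x = rng.random(4)                   # input layer: 4 features (4 nodes)
W1 = rng.random((3, 4))             # weights connecting input -> hidden layer (3 nodes)
b1 = np.zeros(3)
W2 = rng.random((1, 3))             # weights connecting hidden -> output layer (1 node)
b2 = np.zeros(1)

hidden = relu(W1 @ x + b1)          # hidden layer output
output = sigmoid(W2 @ hidden + b2)  # prediction squashed between 0 and 1
print(output)
```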
After that, we'll talk about Transformers, the architecture behind popular LLMs like ChatGPT, Gemini, and Claude. We'll cover how the main layers work and what they do, including embeddings, attention, and feedforward neural networks. We'll also review the differences between encoder-only, decoder-only, and encoder-decoder models, and the types of LLMs that fall into each category. Last but not least, we're going to apply what we've learned with Python.
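As a minimal sketch of what applying a pretrained Transformer from Python can look like, here is an example using the Hugging Face transformers package. The course description above does not name a specific library, so this choice is an assumption; the first run downloads model weights and requires a backend such as PyTorch.

```python
# A minimal sketch using the Hugging Face `transformers` package (an assumption;
# the course may use different tooling). It loads a default encoder-only
# classifier and the small decoder-only GPT-2 model.
from transformers import pipeline

# Encoder-only style model used for classification
classifier = pipeline("sentiment-analysis")
print(classifier("Transformers have reshaped modern NLP."))

# Decoder-only model used for text generation
generator = pipeline("text-generation", model="gpt2")
print(generator("Natural Language Processing is", max_new_tokens=20)[0]["generated_text"])
```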