Natural Language Processing (Part 14) - Probability and Bayes’ Rule

Coursesteach
4 min read · Oct 15, 2023


📚Chapter 3: Sentiment Analysis (Naive Bayes)

Probability and Bayes’ Rule

Outline

  • Introduction
  • Probabilities
  • Probability of the Intersection

Probability is fundamental to many applications in NLP. You’ll see how you can use it to help classify whether a tweet is positive or negative. Let’s get started.

1- Introduction

To start, we will first review what probabilities and conditional probabilities are, how they operate, and how they can be expressed mathematically. Then I’ll go over how to derive Bayes’ rule from the definition of conditional probability. Bayes’ rule is applied in many different fields, ranging from medicine to education, and is used extensively in NLP. Once you understand the theory behind Bayes’ rule, you can use it to perform sentiment analysis on tweets.

Imagine you have an extensive corpus of tweets, each of which is labeled as either positive or negative sentiment, but not both. Within that corpus, the word happy sometimes appears in tweets labeled positive and sometimes in tweets labeled negative.

2- Probabilities

Let’s explore how to reason about this situation using probabilities.

One way to think about probabilities is by counting how frequently events occur. Suppose you define event A as a tweet being labeled positive. Then the probability of event A, written P(A), is calculated as the ratio of the count of positive tweets in the corpus to the total number of tweets in the corpus.

In this example, that number comes out to 13 over 20, or 0.65. You could also express this value as a percentage: 65 percent positive. It’s worth noting that the complementary probability, which is the probability of a tweet expressing a negative sentiment, is just equal to one minus the probability of a positive sentiment.

Note that for this to be true, all tweets must be categorized as either positive or negative, but not both. Let’s define event B in a similar way by counting tweets containing the word happy. In this case, the total number of tweets containing the word happy, denoted N_happy, is 4.
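As a quick sanity check of this arithmetic, here is a minimal Python sketch using the counts quoted above (the variable names are just illustrative):

```python
# Counts taken from the example corpus described above
n_total = 20     # total number of tweets in the corpus
n_positive = 13  # tweets labeled positive
n_happy = 4      # tweets containing the word "happy"

p_positive = n_positive / n_total  # P(A) = 13/20 = 0.65
p_negative = 1 - p_positive        # complement: 1 - 0.65 = 0.35
p_happy = n_happy / n_total        # 4/20 = 0.20

# -> 0.65 0.35 0.2 (up to floating-point rounding)
print(p_positive, p_negative, p_happy)
```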

3- Probability of the Intersection

Here’s another way of looking at it. Take a look at the section of the diagram where tweets are labeled positive and also contain the word happy. In the context of this diagram, the probability that a tweet is labeled positive and also contains the word happy is just the ratio of the area of the intersection divided by the area of the entire corpus.

In other words, if there were 20 tweets in the corpus, and three of them were labeled positive and also contained the word happy, then the associated probability is 3 divided by 20, or 0.15. You now know how to calculate the probability of an intersection: the probability that a tweet contains the word happy and is also labeled positive. In the next tutorial, we will talk about Naive Bayes.
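To make the counting concrete, here is a small sketch that computes an intersection probability from a toy list of (tweet, label) pairs. The example tweets and labels are made up for illustration, so the resulting number differs from the 3/20 example in the text, but the counting logic is the same:

```python
# A tiny, made-up labeled corpus, just for illustration
corpus = [
    ("I am happy because I am learning NLP", "positive"),
    ("I am happy today", "positive"),
    ("I am sad because I am not learning NLP", "negative"),
    ("that movie made me sad not happy", "negative"),
]

n_total = len(corpus)

# Count tweets that are labeled positive AND contain the word "happy"
n_intersection = sum(
    1
    for text, label in corpus
    if label == "positive" and "happy" in text.lower().split()
)

# P(positive ∩ "happy") = size of the intersection / size of the corpus
p_intersection = n_intersection / n_total
print(p_intersection)  # 2 of 4 tweets -> 0.5 for this toy corpus
```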

Please follow Coursesteach to see the latest updates on this story.

If you want to learn more about these topics: Python, Machine Learning, Data Science, Statistics for Machine Learning, Linear Algebra for Machine Learning, Computer Vision, and Research,

then log in and enroll in Coursesteach to get fantastic content in the data field.

Stay tuned for our upcoming articles where we will explore specific topics related to NLP in more detail!

Remember, learning is a continuous process. So keep learning and keep creating and sharing with others!💻✌️

Note: if you are an NLP expert and have good suggestions for improving this blog, please share them in the comments and contribute.

If you want more updates about NLP and would like to contribute, then follow and enroll in the following:

👉Course: Natural Language Processing (NLP)

👉📚GitHub Repository

👉 📝Notebook

Do you want to get into data science and AI and need help figuring out how? I can offer you research supervision and long-term career mentoring.
Skype: themushtaq48, Email: mushtaqmsit@gmail.com

Contribution: We would love your help in making the Coursesteach community even better! If you want to contribute to some courses, or if you have any suggestions for improving any Coursesteach content, feel free to contact us and follow.

Together, let’s make this the best AI learning Community! 🚀

👉WhatsApp

👉 Facebook

👉Github

👉LinkedIn

👉Youtube

👉Twitter

References

1- Natural Language Processing with Classification and Vector Spaces
