Natural Language Processing (Part 19) - Log Likelihood, Part 2

Coursesteach
3 min read · Nov 26, 2023


📚Chapter 3: Sentiment Analysis (Naive Bayes)

We will continue from the previous tutorial and show you how to do inference. Given your lambda dictionary, the task is pretty straightforward: you have already done most of the work needed to arrive at the log likelihood, so let's wrap up. You can calculate the log likelihood of the tweet as the sum of the lambdas of each word in the tweet. For the tweet "I am happy because I am learning", the words I, am, and because each add 0, the word happy adds 2.2, and the word learning adds 1.1. The sum is 3.3, and this value is greater than 0.
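For illustration, here is a minimal Python sketch of this sum. The lambda values are the illustrative ones from the example above (happy: 2.2, learning: 1.1, everything else 0), and the whitespace tokenizer and the name lambda_dict are assumptions for this sketch, not the course's actual code.

```python
# A minimal sketch of the log-likelihood sum, using the illustrative
# lambda values from the example (all other words are neutral).
lambda_dict = {
    "i": 0.0,
    "am": 0.0,
    "happy": 2.2,
    "because": 0.0,
    "learning": 1.1,
}

tweet = "I am happy because I am learning"

# Sum the lambda of every word in the tweet; words missing from the
# dictionary are treated as neutral (lambda = 0).
log_likelihood = sum(lambda_dict.get(word, 0.0) for word in tweet.lower().split())
print(log_likelihood)  # 0 + 0 + 2.2 + 0 + 0 + 0 + 1.1 = 3.3
```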

Remember how previously you saw that the tweet was positive if the product of the ratios was bigger than 1; since the log of 1 is 0, the threshold in log space is 0. A positive value indicates that the tweet is positive, and a value less than 0 indicates that the tweet is negative. The log likelihood for this tweet is 3.3, and since 3.3 is bigger than 0, the tweet is positive. Notice that this score is based entirely on the words happy and learning, both of which carry positive sentiment. All the other words were neutral and didn't contribute to the score. See how much influence the power words have.
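As a follow-up sketch under the same assumptions (reusing the hypothetical lambda_dict from above), the decision rule with the threshold of 0 could look like this:

```python
def predict_sentiment(tweet, lambda_dict):
    """Return 'positive' or 'negative' based on the tweet's log likelihood."""
    score = sum(lambda_dict.get(word, 0.0) for word in tweet.lower().split())
    # The threshold is 0 (the log of 1): above 0 -> positive, otherwise negative.
    return "positive" if score > 0 else "negative"

print(predict_sentiment("I am happy because I am learning", lambda_dict))  # positive
```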

Summary

Let’s do a quick recap. You used your new skills to predict the sentiment of a tweet by summing the lambdas of each word that appears in the tweet. This score is called the log likelihood. For the log likelihood, the decision threshold is 0 instead of 1: positive tweets have a log likelihood above 0, and negative tweets have a log likelihood below 0. Well done! Now that you have a good understanding of how to compute the log likelihood and how to do inference, the next tutorial will show you how to train a Naive Bayes model.

Please follow and 👏 clap for Coursesteach to see the latest updates on this story.

If you want to learn more about these topics: Python, Machine Learning, Data Science, Statistics for Machine Learning, Linear Algebra for Machine Learning, Computer Vision, and Research.

Then log in and enroll in Coursesteach to get fantastic content in the data field.

Stay tuned for our upcoming articles where we will explore specific topics related to NLP in more detail!

Remember, learning is a continuous process. So keep learning and keep creating and sharing with others!💻✌️

Note: If you are an NLP expert and have good suggestions to improve this blog, please share them in the comments and contribute.

If you want more updates about NLP and want to contribute, then follow and enroll in the following:

👉Course: Natural Language Processing (NLP)

👉📚GitHub Repository

👉 📝Notebook

Do you want to get into data science and AI and need help figuring out how? I can offer you research supervision and long-term career mentoring.
Skype: themushtaq48, email:mushtaqmsit@gmail.com

Contribution: We would love your help in making the Coursesteach community even better! If you want to contribute to some courses, or if you have any suggestions for improvement in any Coursesteach content, feel free to contact us and follow.

Together, let’s make this the best AI learning Community! 🚀

👉WhatsApp

👉 Facebook

👉Github

👉LinkedIn

👉Youtube

👉Twitter

