Machine Learning
In-depth Understanding of Neural Networks and How They Work
Today, using neural network architectures, we can achieve complex tasks such as object recognition in images, automatic speech recognition (ASR), machine translation, image captioning, video classification, and many more that are challenging for machines to perform. Machines can handle such complex tasks because artificial neural networks are among the most powerful learning models in the field of machine learning; neural networks can achieve arguably every task that the human brain can perform…
Demystifying NLP: Exploring Lexical, Syntactic, and Semantic Processing for Powerful Natural Language Understanding
In the post Unlocking the Power of Natural Language Processing: An Introduction and Pipeline Framework for Solving NLP Tasks, we explored the high-level techniques and pipeline setup typically used to solve NLP tasks. This post extends that idea and covers advanced techniques we can apply to textual data. The process of text analytics involves three stages, as given below: Lexical processing: in this stage, we do basic text preprocessing and text cleaning…
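As a minimal sketch of the kind of basic cleaning the lexical stage involves (assuming NLTK is installed and its tokenizer and stop-word resources are available; the sample sentence is purely illustrative):

```python
# Lexical-processing sketch: lowercasing, tokenization, stop-word removal.
# Assumes NLTK is installed and the 'punkt' and 'stopwords' resources can be downloaded.
import nltk
from nltk.corpus import stopwords
from nltk.tokenize import word_tokenize

nltk.download("punkt", quiet=True)
nltk.download("stopwords", quiet=True)

text = "The QA system parses the question before answering it."
tokens = word_tokenize(text.lower())                  # tokenize the lowercased text
stop_words = set(stopwords.words("english"))
cleaned = [t for t in tokens if t.isalpha() and t not in stop_words]
print(cleaned)   # ['qa', 'system', 'parses', 'question', 'answering']
```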
PMF, PDF, and CDF and Their Implementation in Python
In our earlier post A complete guide to the Probability Distribution, we developed a deep understanding of the different types of discrete and continuous probability distributions. To calculate the probability of a discrete random variable taking a particular value within its range, the Probability Mass Function (PMF) is used. Next, let's move forward and understand how the PMF, PDF, and CDF let us compute probabilities for a random variable, whether it is discrete or continuous…
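As a small, hedged illustration of what the post walks through, SciPy exposes these functions directly; the binomial and normal parameters below are arbitrary examples, not values from the post:

```python
# PMF for a discrete distribution and PDF/CDF for a continuous one using SciPy.
from scipy import stats

# Discrete: Binomial(n=10, p=0.5)
print(stats.binom.pmf(3, n=10, p=0.5))     # P(X = 3)
print(stats.binom.cdf(3, n=10, p=0.5))     # P(X <= 3)

# Continuous: standard normal
print(stats.norm.pdf(0, loc=0, scale=1))   # f(0), a density, not a probability
print(stats.norm.cdf(0, loc=0, scale=1))   # P(X <= 0) = 0.5
```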
Building Blocks of Semantic Processing: Interpreting the Meaning of Text
In this post, we will understand how the field of NLP attempts to make machines understand language the way we humans do, and the different techniques that make this challenging task possible. This area is quite research-heavy, and a lot of new advancements are happening in the field. Let us start with the terms and concepts that are fundamental to interpreting the meaning of text the way we humans do. Terms and Concepts: To make computers understand the…
Mastering Named Entity Recognition: Unveiling Techniques for Accurate Entity Extraction in NLP
In the expansive realm of Natural Language Processing (NLP), Information Extraction (IE) is a sophisticated technique that processes vast amounts of textual data to pinpoint and extract specific information, transforming the narrative into a structured format that machines can comprehend. It helps identify entities, relationships, and events, enabling machines to distill meaningful knowledge from the linguistic richness of human expression. In this post, we will build an information…
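As a quick, hedged sketch of entity extraction (one common building block of an IE pipeline), using spaCy's small English model; the sentence and model choice are illustrative, not taken from the post:

```python
# Named Entity Recognition sketch with spaCy's small English model.
# Assumes: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Apple acquired a London-based startup for $1 billion in 2023.")

for ent in doc.ents:
    # Each entity carries its surface text and a predicted label such as ORG, GPE, MONEY, DATE.
    print(ent.text, ent.label_)
```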
Decoding Language Structure: Exploring Constituency Parsing and Dependency Parsing in NLP
Parsing is one of the key tasks in NLP; it simply means breaking down a given sentence into its 'grammatical constituents'. Parsing is an important step in many applications and helps us better understand the linguistic structure of sentences. Let's understand parsing through an example. Suppose we ask a Question Answering (QA) system, such as Amazon's Alexa or Apple's Siri, the following question: "Who won the Cricket World Cup in 2015?" The QA system can respond meaningfully only…
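To make the idea concrete, here is a minimal dependency-parse sketch with spaCy (assuming the en_core_web_sm model is installed; the question mirrors the example above and the output format is my own):

```python
# Dependency parsing sketch: inspect each token's syntactic relation and head.
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Who won the Cricket World Cup in 2015?")

for token in doc:
    # token.dep_ is the dependency relation; token.head is the word it attaches to.
    print(f"{token.text:10} --{token.dep_:>6}--> {token.head.text}")
```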
Demystifying Part-of-Speech (POS) Tagging Techniques for Accurate Language Analysis
Part-of-speech (POS) tagging is a natural language processing (NLP) technique that involves assigning specific grammatical labels (such as noun, verb, adjective, adverb, etc.) to individual words within a sentence. This procedure helps decipher word meanings, comprehend word relationships, and facilitate a variety of linguistic and computational studies of textual data. It also offers insights into the syntactic structure of the text. In this post, we will cover various techniques that…
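A minimal POS-tagging sketch using NLTK's default tagger, one of several approaches a post like this typically compares; the sentence is illustrative and resource names may vary slightly by NLTK version:

```python
# POS tagging sketch with NLTK's default (averaged perceptron) tagger.
# Assumes the 'punkt' and 'averaged_perceptron_tagger' resources can be downloaded.
import nltk

nltk.download("punkt", quiet=True)
nltk.download("averaged_perceptron_tagger", quiet=True)

tokens = nltk.word_tokenize("The quick brown fox jumps over the lazy dog.")
print(nltk.pos_tag(tokens))
# prints (token, tag) pairs, e.g. ('fox', 'NN'), ('jumps', 'VBZ'), ('lazy', 'JJ')
```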
Unleashing the Power of Advanced Lexical Processing: Exploring Phonetic Hashing, Minimum Edit Distance, and PMI Score
In the post Demystifying NLP: Exploring Lexical, Syntactic, and Semantic Processing for Powerful Natural Language Understanding, under the "Lexical Processing" section, we explored some of the basic techniques for lexical processing, including Word Frequencies, Stop Words Removal, Tokenization, Bag-of-Words Representation, Stemming, Lemmatization, and TF-IDF Representation. If you are not sure about the above-mentioned techniques or want to revise the topics, I would highly encourage you to…
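As a hedged taste of two of the advanced techniques named in the title: NLTK's edit_distance for minimum edit distance and the jellyfish library's Soundex for phonetic hashing (the word pairs are illustrative examples, not the post's own):

```python
# Minimum edit distance with NLTK and phonetic hashing (Soundex) with jellyfish.
import nltk
import jellyfish

# Levenshtein distance: minimum insertions, deletions, and substitutions.
print(nltk.edit_distance("acquire", "aquire"))          # 1

# Phonetic hashing: spelling variants that sound alike map to the same code.
print(jellyfish.soundex("Smith"), jellyfish.soundex("Smyth"))   # S530 S530
```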
Diving Deep into Topic Modeling: Understanding and Applying NLP's Powerful Tool
In the vast sea of digital information, making sense of unstructured text data has become a paramount challenge. In this blog post, we will embark on a journey to unravel the mysteries of Topic Modeling, delving deep into its applications, inner workings, and the transformative impact it can have on understanding, organizing, and extracting meaning from large volumes of text. Topic Modeling stands as an invaluable asset in the realm of data science, especially for deciphering the underlying…
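As a hedged, toy-scale sketch of what topic modeling looks like in code, here is gensim's LDA run on a few made-up token lists; real corpora need far more data and preprocessing, and the topic count and parameters below are arbitrary:

```python
# Toy topic-modeling sketch with gensim's LDA on a tiny, made-up corpus.
from gensim import corpora, models

docs = [
    ["neural", "network", "training", "loss", "gradient"],
    ["gradient", "descent", "learning", "rate", "loss"],
    ["cricket", "world", "cup", "match", "team"],
    ["team", "won", "match", "cricket", "score"],
]

dictionary = corpora.Dictionary(docs)                  # map tokens to integer ids
corpus = [dictionary.doc2bow(doc) for doc in docs]     # bag-of-words vectors

lda = models.LdaModel(corpus, num_topics=2, id2word=dictionary,
                      passes=10, random_state=42)
for topic_id, words in lda.print_topics(num_words=4):
    print(topic_id, words)                             # top words per inferred topic
```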