# Naive Bayes Classifier: An Example with Numerical Features

To understand what a Naive Bayes classifier is and how it works on data samples with categorical features, you can have a look at this article.

In this article, you will learn how a Naive Bayes classifier works on data samples with numerical features.

We are going to use the following data samples from the Iris flowers dataset. Here X1 represents the sepal length, X2 the sepal width, X3 the petal length and X4 the petal width.

Using these data samples, we are going to predict the class of a sample given that X1 = 5.4, X2 = 3.9, X3 = 1.7 and X4 = 0.4 using a Naive Bayes classifier. For that we have to calculate the posterior probability of each class. But as we discussed in the previous article, since the posterior probabilities are all conditioned on the same set of feature values, we can compare them using only the likelihood and prior probabilities (refer to the equations below).
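In symbols, by Bayes' theorem the posterior for a class C with the naive (conditional-independence) assumption can be written as follows; since the evidence term in the denominator is the same for every class, comparing the numerators is enough:

```latex
P(C \mid X_1, X_2, X_3, X_4)
  = \frac{P(C)\,\prod_{i=1}^{4} P(X_i \mid C)}{P(X_1, X_2, X_3, X_4)}
  \;\propto\; P(C)\,\prod_{i=1}^{4} P(X_i \mid C)
```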

From the given data we can easily calculate the prior probability of each class, i.e. the fraction of training samples belonging to that class:
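A minimal sketch of the prior calculation; the class labels below are illustrative placeholders for the article's table of samples, which is assumed to contain equal numbers of Setosa and Versicolor rows:

```python
from collections import Counter

# Hypothetical class labels for the training samples (placeholders for
# the article's actual table of Iris samples).
labels = ["Setosa"] * 5 + ["Versicolor"] * 5

counts = Counter(labels)
total = len(labels)

# Prior of a class = (samples in that class) / (total samples)
priors = {cls: n / total for cls, n in counts.items()}
print(priors)  # both classes occur equally often here, so each prior is 0.5
```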

Now, to calculate the likelihood, we are going to use the Gaussian probability distribution. The Gaussian probability density function is given by,
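That is, P(x | C) = (1 / (σ√(2π))) · exp(−(x − μ)² / (2σ²)), where μ and σ are the mean and standard deviation of the feature within class C. The density is straightforward to implement directly; a minimal sketch:

```python
import math

def gaussian_pdf(x, mean, std):
    """Gaussian density: (1 / (std * sqrt(2*pi))) * exp(-(x - mean)^2 / (2 * std^2))."""
    return math.exp(-((x - mean) ** 2) / (2 * std ** 2)) / (std * math.sqrt(2 * math.pi))

# Sanity check: the standard normal density at its mean is 1 / sqrt(2*pi) ≈ 0.3989
print(gaussian_pdf(0.0, 0.0, 1.0))
```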

Here we have to separate the data based on the classes.

After separating, we have to find the mean and standard deviation of each feature within each class. For example,
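A sketch of this step using the standard library; the sepal-length values below are illustrative placeholders, not the article's actual Setosa rows:

```python
import statistics

# Hypothetical Setosa sepal-length (X1) values from the separated data.
setosa_x1 = [5.1, 4.9, 4.7, 4.6, 5.0]

mean_x1 = statistics.mean(setosa_x1)
std_x1 = statistics.stdev(setosa_x1)  # sample standard deviation (divides by n - 1)
print(mean_x1, std_x1)
```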

Likewise we have to find the mean and standard deviation of other features of each class.

From these values, we calculate the likelihood of X1 = 5.4, X2 = 3.9, X3 = 1.7 and X4 = 0.4 using the Gaussian probability density function. The likelihoods are,
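This step can be sketched as follows; the per-feature (mean, std) statistics are hypothetical placeholders for the values computed from the article's table:

```python
import math

def gaussian_pdf(x, mean, std):
    # Gaussian density of x for a feature with the given class statistics.
    return math.exp(-((x - mean) ** 2) / (2 * std ** 2)) / (std * math.sqrt(2 * math.pi))

# Hypothetical (mean, std) per feature for the Setosa class.
setosa_stats = {"X1": (5.0, 0.35), "X2": (3.4, 0.38), "X3": (1.5, 0.17), "X4": (0.2, 0.11)}
sample = {"X1": 5.4, "X2": 3.9, "X3": 1.7, "X4": 0.4}

# One likelihood per feature: P(Xi = xi | Setosa)
likelihoods = {f: gaussian_pdf(sample[f], m, s) for f, (m, s) in setosa_stats.items()}
print(likelihoods)
```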

After calculating the prior and likelihood probabilities, we have to calculate the posterior probability of each class.
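Putting the pieces together, a minimal sketch of the comparison; all per-class statistics below are hypothetical placeholders for the values derived from the article's training samples:

```python
import math

def gaussian_pdf(x, mean, std):
    return math.exp(-((x - mean) ** 2) / (2 * std ** 2)) / (std * math.sqrt(2 * math.pi))

# Hypothetical (mean, std) per feature for each class.
stats = {
    "Setosa":     {"X1": (5.0, 0.35), "X2": (3.4, 0.38), "X3": (1.5, 0.17), "X4": (0.2, 0.11)},
    "Versicolor": {"X1": (5.9, 0.52), "X2": (2.8, 0.31), "X3": (4.3, 0.47), "X4": (1.3, 0.20)},
}
priors = {"Setosa": 0.5, "Versicolor": 0.5}
sample = {"X1": 5.4, "X2": 3.9, "X3": 1.7, "X4": 0.4}

# Unnormalised posterior: prior times the product of the feature likelihoods.
scores = {}
for cls, feats in stats.items():
    score = priors[cls]
    for f, (m, s) in feats.items():
        score *= gaussian_pdf(sample[f], m, s)
    scores[cls] = score

prediction = max(scores, key=scores.get)
print(prediction)
```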

The posterior probability P(Setosa | X1 = 5.4, X2 = 3.9, X3 = 1.7, X4 = 0.4) is,

The posterior probability P(Versicolor | X1 = 5.4, X2 = 3.9, X3 = 1.7, X4 = 0.4) is,

By comparing the two posterior probabilities, we can see that P(Setosa | X1 = 5.4, X2 = 3.9, X3 = 1.7, X4 = 0.4) is higher than P(Versicolor | X1 = 5.4, X2 = 3.9, X3 = 1.7, X4 = 0.4).

Therefore, using the Naive Bayes classifier, we predict that the class of the sample is Setosa given that X1 = 5.4, X2 = 3.9, X3 = 1.7 and X4 = 0.4. Likewise, we can predict the class of a sample for any given set of feature values (X1, X2, X3 and X4).

## Related articles

### How does 'A Content Based Recommender System' work?

This article mainly focuses on the theory behind Content Based Recommender Systems and explains it using an Example.

### Visualizing the Output Images of the Convolutional Layers of a CNN

Have you ever wondered what the output images of the convolutional layers of a Convolutional Neural Network (CNN) look like? If so, this article is for you.

### Implementation of ANN for MNIST Handwritten Digits Classification

This article explains, step by step, the basics of how to build an ANN model for classifying the MNIST dataset.