Naive Bayes
01 Jan 2016

## Bayes’ Classifier

A simple rule to improve our prediction based on newly gathered evidence.

## Simple Classifier

Here we have one discrete feature, with the data classified into 2 classes depending on the value of the feature. To predict the class for a future case given a value of X, we can convert the counts to probabilities and then calculate P(class | X), which is proportional to P(X | class) * P(class).
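The counting rule above can be sketched as follows. The data and class labels here are made up for illustration; the point is that the unnormalized posterior for each class is just P(X | class) * P(class) computed from counts.

```python
from collections import Counter

# Toy labelled data: (feature value, class) pairs (assumed for illustration).
data = [(0, "A"), (0, "A"), (1, "A"), (1, "B"), (1, "B"), (0, "B"), (1, "B")]

class_counts = Counter(c for _, c in data)  # occurrences of each class
n = len(data)

def posterior_scores(x):
    """Unnormalized P(class | X=x) for each class: P(X=x | class) * P(class)."""
    scores = {}
    for c, n_c in class_counts.items():
        n_xc = sum(1 for xi, ci in data if ci == c and xi == x)
        p_x_given_c = n_xc / n_c   # P(X=x | class) from counts
        p_c = n_c / n              # P(class), the prior
        scores[c] = p_x_given_c * p_c
    return scores

print(posterior_scores(1))  # class with the largest score is the prediction
```

Normalizing the scores by their sum would give the actual posterior probabilities, but for classification only the argmax matters.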

## Bayes’ Rule

We can now generalize the simplified classifier to many values of the feature X and any number of classes.
## Classifier for continuous X
For continuous X we first calculate the mean and standard deviation of the feature from the given data, then assume a model for X; for example, a Gaussian model.
For each class we calculate the mean and standard deviation to arrive at a probability distribution for that class. Using these we can calculate P(X | class) and then the joint probabilities P(X | class) * P(class).
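A minimal sketch of this, assuming a Gaussian model and some made-up per-class samples: fit a mean and standard deviation per class, use the normal density as P(X | class), and multiply by the class prior.

```python
import math

def gaussian_pdf(x, mean, std):
    """Density of N(mean, std^2) at x, used as P(X = x | class) under the model."""
    return math.exp(-((x - mean) ** 2) / (2 * std ** 2)) / (std * math.sqrt(2 * math.pi))

# Training values for one continuous feature, per class (assumed data).
samples = {"A": [1.0, 1.2, 0.8, 1.1], "B": [3.0, 2.8, 3.2, 3.1]}

def fit(values):
    """Mean and standard deviation of a list of values."""
    m = sum(values) / len(values)
    var = sum((v - m) ** 2 for v in values) / len(values)
    return m, math.sqrt(var)

params = {c: fit(v) for c, v in samples.items()}
total = sum(len(v) for v in samples.values())
priors = {c: len(v) / total for c, v in samples.items()}

def joint(x, c):
    """Unnormalized posterior: P(X=x | class) * P(class)."""
    mean, std = params[c]
    return gaussian_pdf(x, mean, std) * priors[c]
```

A query near 1.0 should then score far higher for class "A" than for class "B", since its density under A's fitted Gaussian dominates.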
## Implementation
Always split the data into training and testing sets. Testing on the training data would give results far better than the actual ones.
Implementation in sklearn: GaussianNB.
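A short sketch with scikit-learn's `GaussianNB`, including the train/test split mentioned above so the score is measured on data the model has not seen. The iris dataset and the 70/30 split here are just example choices.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

# Load a toy dataset and hold out 30% for testing.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

# Fit per-class Gaussians on the training data only.
clf = GaussianNB()
clf.fit(X_train, y_train)

# Accuracy on held-out data; scoring on X_train instead would be misleadingly high.
accuracy = clf.score(X_test, y_test)
print(accuracy)
```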