Naive Bayesian Classifier and Django

Sat 15 Nov 2008 03:47 PM

I am currently taking an introductory course on artificial intelligence, and I find it really interesting. I have learned more about search algorithms than I could have ever imagined. Recently we were instructed to create an artificial intelligence agent to do any task, as long as it uses some element from our class. Thus News Pet was created.

News Pet is a trainable (hence the name) news fetcher that will hopefully be like Pandora for news. The scope of this project is pretty limited (mostly a proof of concept at this point), but I will offer it up as food for thought. People often subscribe to lots of RSS feeds, but they do not want to read every item that comes from those feeds, or they want to categorize the items in some way. News Pet categorizes or trashes all the articles that come through your reader.

We will be using Naive Bayesian Classification, which I will briefly discuss later for anyone who is unfamiliar with it. Our project has three important parts: Classification, Initial Training, and Feedback (or Subsequent Learning).

Bayesian Network and Classification

Bayesian networks are directed acyclic graphs that use probability, specifically Bayes' Theorem, to calculate the likelihood of events based on partial evidence. An edge from node B to node A represents P(A|B), the probability distribution of A given B. So let's give an example. You have a nuclear plant with a core. We have the temperature of the core (T), the reading from the temperature gauge, or the perceived temperature (G), the probability that the gauge is faulty (F_g), the probability that the alarm goes off (A), and the probability that the alarm is faulty (F_a). The alarm is more likely to go off if the perceived temperature is high, and the gauge is more likely to be faulty if the temperature is high. So let's look at the graph.

![Bayesian network for the nuclear plant example](/media/img/BayesNetwork.jpg)
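To make the structure concrete, here is a minimal Python sketch of that network. The probability numbers are invented purely for illustration, and the enumeration just sums the joint probability over every assignment of the other variables to get the chance the alarm goes off.

```python
# A minimal sketch of the nuclear-plant Bayesian network above.
# All probability numbers are made up for illustration only.
from itertools import product

P_T = {True: 0.1, False: 0.9}             # core temperature is high
P_Fg_given_T = {True: 0.2, False: 0.05}   # gauge faulty, given T
P_Fa = {True: 0.02, False: 0.98}          # alarm faulty

def p_G(g, t, fg):
    """P(gauge reads high | core temp, gauge faulty)."""
    p_high = 0.5 if fg else (0.95 if t else 0.05)
    return p_high if g else 1.0 - p_high

def p_A(a, g, fa):
    """P(alarm sounds | gauge reading, alarm faulty)."""
    p_sound = 0.1 if fa else (0.9 if g else 0.01)
    return p_sound if a else 1.0 - p_sound

def p_alarm():
    """P(A=True), summed over every assignment of the other variables."""
    total = 0.0
    for t, fg, g, fa in product([True, False], repeat=4):
        joint = (P_T[t]
                 * (P_Fg_given_T[t] if fg else 1 - P_Fg_given_T[t])
                 * p_G(g, t, fg)
                 * P_Fa[fa]
                 * p_A(True, g, fa))
        total += joint
    return total

print(p_alarm())
```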

Naive Bayesian Classifiers (NBCs) use a very simple Bayesian network to classify documents, and they are often used for spam filtering. An NBC is trained by giving it a batch of documents and telling it whether each document should be accepted. The NBC builds its network from those examples. When you then give it a new document, it uses that document as evidence and returns a confidence value for whether the document should be accepted.
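For anyone curious what such a classifier looks like in code, here is a rough word-counting sketch in Python with Laplace smoothing. It is only illustrative; our project's actual classifier will be the Java NBC mentioned below, and the class and method names here are my own.

```python
# A rough sketch of a naive Bayes document classifier: train() takes a
# document plus an accept/reject label, and confidence() returns
# P(accept | document) from word counts with Laplace smoothing.
import math
from collections import defaultdict

class NaiveBayesClassifier:
    def __init__(self):
        self.word_counts = {True: defaultdict(int), False: defaultdict(int)}
        self.doc_counts = {True: 0, False: 0}
        self.vocab = set()

    def train(self, document, accepted):
        self.doc_counts[accepted] += 1
        for word in document.lower().split():
            self.word_counts[accepted][word] += 1
            self.vocab.add(word)

    def _log_score(self, document, label):
        total_docs = sum(self.doc_counts.values())
        log_p = math.log(self.doc_counts[label] / total_docs)
        total_words = sum(self.word_counts[label].values())
        for word in document.lower().split():
            # Laplace smoothing so unseen words don't zero out the product.
            count = self.word_counts[label][word] + 1
            log_p += math.log(count / (total_words + len(self.vocab)))
        return log_p

    def confidence(self, document):
        """P(accept | document); assumes both classes have training data."""
        log_yes = self._log_score(document, True)
        log_no = self._log_score(document, False)
        # Normalize the two log scores back into a probability.
        return 1.0 / (1.0 + math.exp(log_no - log_yes))
```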

Classification

Our approach to classification is nothing special. There is a list of feeds and a list of categories. Users create the categories, and each category has an NBC associated with it. How the NBCs are trained is covered under Initial Training and Feedback. Each item from the feeds will be tested against each category, and if the confidence is greater than some constant it will be added to that category. The default constant will be decided after some experimentation, and it will also be customizable. If an item fails to be added to any category, it will be added to the special category, Trash.
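Here is roughly what that routing step looks like; the `classifier`, `items`, and `text` attributes and the threshold value are placeholders, not the project's real interface.

```python
# Sketch of the routing step: score every new feed item against each
# category's classifier; anything that clears no threshold lands in Trash.
CONFIDENCE_THRESHOLD = 0.75  # placeholder; the real default is undecided

def classify_item(item, categories, trash):
    placed = False
    for category in categories:
        if category.classifier.confidence(item.text) > CONFIDENCE_THRESHOLD:
            category.items.append(item)
            placed = True
    if not placed:
        trash.items.append(item)
```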

Initial Training

Because each category has its own Bayesian classifier, it needs to be trained initially. We have several ideas for this right now: choose from some pre-trained categories (e.g. Business, Programming), submit a batch of files to train the classifier, or use a single word or phrase. The first two are fairly straightforward but not extremely helpful; that said, it is impossible to train a Bayesian classifier with just one word. Because Google is a reliable source for finding documents based on single words, we are going to use Google search to retrieve a number of documents to train the classifier. We will be doing experimentation to decide which approach to use; the final solution will probably let you choose which kind of training you would like.
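A rough outline of the single-phrase bootstrap might look like this. `search_for_documents` is a hypothetical helper standing in for whatever Google retrieval we end up using, and the negative examples drawn from unrelated background documents are my own assumption, since a classifier trained on only one class has nothing to discriminate against.

```python
# Sketch of bootstrapping a category's classifier from a single phrase.
# search_for_documents() is hypothetical and assumed to return the plain
# text of the top search results for the phrase.
def bootstrap_classifier(classifier, phrase, background_docs, num_results=20):
    # Documents retrieved for the phrase become positive examples...
    for text in search_for_documents(phrase, num_results):
        classifier.train(text, accepted=True)
    # ...and unrelated background documents become negative examples,
    # so the classifier has something to contrast the category against.
    for text in background_docs:
        classifier.train(text, accepted=False)
```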

Feedback

One important part of this feed reader is that it's trainable. For every item you will have the choice to give it a thumbs up, give it a thumbs down, or say that it belongs to another category. Behind the scenes, thumbs up and thumbs down become additional training cases for that category's NBC. When you move an item to a new category, it becomes a training case for both categories' NBCs. This means that the NBCs will always be changing and improving.
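In code, the feedback might map onto training calls like these; the method and attribute names are placeholders for whatever interface the real NBC exposes.

```python
# Sketch of how user feedback could translate into training cases.
def thumbs_up(item, category):
    category.classifier.train(item.text, accepted=True)

def thumbs_down(item, category):
    category.classifier.train(item.text, accepted=False)

def move_to_category(item, old_category, new_category):
    # Moving an item trains both classifiers: a negative example for the
    # category it left and a positive example for the one it joined.
    old_category.classifier.train(item.text, accepted=False)
    new_category.classifier.train(item.text, accepted=True)
```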

Django

Many of you may be skipping right to this section to see what Django has to do with this. I have convinced my group members that a Django-powered website would be the best choice for the user interface. The Bayesian classification will be done in the background by Java. While this may be disappointing, it was for a couple of reasons: my project members do not know Python, and we couldn't find a good NBC library for Python. If anyone knows of any, I would be interested in hearing about them, so leave a comment.
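For the curious, a first guess at the Django models behind the interface might look something like this. The fields are placeholders rather than the project's actual schema, and the classifiers themselves would live on the Java side.

```python
# A guess at the Django models for the UI; field names are placeholders.
from django.db import models

class Feed(models.Model):
    title = models.CharField(max_length=200)
    url = models.URLField()

class Category(models.Model):
    name = models.CharField(max_length=100)
    # Per-category confidence threshold; the NBC itself lives outside Django.
    threshold = models.FloatField(default=0.75)

class Item(models.Model):
    feed = models.ForeignKey(Feed, on_delete=models.CASCADE)
    category = models.ForeignKey(Category, null=True, blank=True,
                                 on_delete=models.SET_NULL)
    title = models.CharField(max_length=300)
    content = models.TextField()
```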