Information gain calculation using scikit-learn

I use scikit-learn to classify text. I want to calculate the information gain of each attribute with respect to the class in the (sparse) document-term matrix. Information gain is defined as H(class) - H(class | attribute), where H is the entropy.

Using Weka, this can be accomplished with InfoGainAttributeEval. But I did not find this measure in scikit-learn.

However, it was suggested that the above formula for information gain is the same measure as mutual information. This also matches the definition on Wikipedia.
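To see the equivalence concretely, here is a minimal sketch (with made-up toy arrays) that computes H(class) - H(class | attribute) directly and compares the result against scikit-learn's mutual_info_score, which computes mutual information in nats:

    import numpy as np
    from sklearn.metrics import mutual_info_score

    # Toy data (made up for illustration): a binary attribute and a binary class.
    attribute = np.array([0, 0, 1, 1, 1, 0, 1, 0])
    label = np.array([0, 0, 1, 1, 0, 0, 1, 1])

    def entropy(values):
        # Shannon entropy (in nats) of a discrete sample.
        _, counts = np.unique(values, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log(p))

    # H(class | attribute) = sum over x of p(attribute = x) * H(class | attribute = x)
    h_cond = sum(
        (attribute == x).mean() * entropy(label[attribute == x])
        for x in np.unique(attribute)
    )

    info_gain = entropy(label) - h_cond
    print(info_gain)                            # ~0.1308
    print(mutual_info_score(label, attribute))  # same value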

Is it possible to use a specific setting for mutual information in scikit-learn to perform this task?

1 answer

You can use scikit-learn's mutual_info_classif; here is an example:

    from sklearn.datasets import fetch_20newsgroups
    from sklearn.feature_selection import mutual_info_classif
    from sklearn.feature_extraction.text import CountVectorizer

    categories = ['talk.religion.misc', 'comp.graphics', 'sci.space']
    newsgroups_train = fetch_20newsgroups(subset='train', categories=categories)
    X, Y = newsgroups_train.data, newsgroups_train.target

    # Bag-of-words counts; these are discrete features, hence
    # discrete_features=True below.
    cv = CountVectorizer(max_df=0.95, min_df=2,
                         max_features=10000, stop_words='english')
    X_vec = cv.fit_transform(X)

    # Mutual information between each term and the class labels.
    # On scikit-learn < 1.0, use cv.get_feature_names() instead.
    res = dict(zip(cv.get_feature_names_out(),
                   mutual_info_classif(X_vec, Y, discrete_features=True)))
    print(res)

This will output a dictionary with each attribute (i.e. each term in the vocabulary) as a key and its information gain as the value.

Here is an example of the output:

    {'bible': 0.072327479595571439,
     'christ': 0.057293733680219089,
     'christian': 0.12862867565281702,
     'christians': 0.068511328611810071,
     'file': 0.048056478042481157,
     'god': 0.12252523919766867,
     'gov': 0.053547274485785577,
     'graphics': 0.13044709565039875,
     'jesus': 0.09245436105573257,
     'launch': 0.059882179387444862,
     'moon': 0.064977781072557236,
     'morality': 0.050235104394123153,
     'nasa': 0.11146392824624819,
     'orbit': 0.087254803670582998,
     'people': 0.068118370234354936,
     'prb': 0.049176995204404481,
     'religion': 0.067695617096125316,
     'shuttle': 0.053440976618359261,
     'space': 0.20115901737978983,
     'thanks': 0.060202010019767334}
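If the goal is feature selection rather than just inspecting the scores, the same measure plugs into scikit-learn's SelectKBest. A sketch reusing X_vec, Y, and cv from the example above (k=20 is an arbitrary choice for illustration):

    from functools import partial
    from sklearn.feature_selection import SelectKBest, mutual_info_classif

    # Keep the k attributes with the highest mutual information with the class.
    selector = SelectKBest(
        partial(mutual_info_classif, discrete_features=True), k=20
    )
    X_selected = selector.fit_transform(X_vec, Y)

    # Names of the retained attributes.
    kept = selector.get_support(indices=True)
    print(cv.get_feature_names_out()[kept])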

Source: https://habr.com/ru/post/1272636/
