Academic Journals Database
Disseminating quality controlled scientific knowledge

On Multiplicative Entropy and Information gain in Large Data Sets

Author(s): Udayan Ghose | C. S. Rai | Yogesh Singh

Journal: International Journal of Engineering Science and Technology
ISSN 0975-5462

Volume: 2
Issue: 3
Start page: 187
Date: 2010

Keywords: Entropy | Uncertainty | Probability | Data Mining | Mutual Information

ABSTRACT
Information theory is one of the most widely used branches of applied probability theory. When probability is used to describe the state of a system, it implies that the state has some uncertainty. Not all probability distributions are created equal: some indicate more uncertainty than others. We can therefore define a mathematical entity that takes a probability distribution as input and returns a measure of its uncertainty. It has been observed that the mutual information between two variables is the reduction in uncertainty of one variable due to knowledge of the other. In this paper a new approach is taken to examine the multiplicative nature of entropy and conditional entropy. On this basis, information gain is then calculated using a large data set. The data set considered consists of the scanned OMR application forms of candidates applying to engineering courses at a University. Simulation has been performed on this data, and information gain is calculated using some predefined parameters.
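The paper's multiplicative entropy variant is not detailed in the abstract, but the standard Shannon definitions it builds on can be sketched briefly. The snippet below is a minimal illustration, not the authors' method: it computes the entropy of a label distribution and the information gain (reduction in uncertainty) from partitioning the data, as one would when evaluating an attribute in a data-mining task. All function and variable names here are illustrative assumptions.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy H(X) = -sum_x p(x) * log2 p(x) of a label sequence."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(labels, groups):
    """Information gain of a split: H(labels) minus the size-weighted
    average entropy of the resulting groups (the conditional entropy)."""
    n = len(labels)
    return entropy(labels) - sum(len(g) / n * entropy(g) for g in groups)

# Toy example: six records split perfectly by some binary attribute.
labels = ['pass', 'pass', 'pass', 'fail', 'fail', 'fail']
groups = [['pass', 'pass', 'pass'], ['fail', 'fail', 'fail']]
print(information_gain(labels, groups))  # 1.0: the split removes all uncertainty
```

A uniform two-class distribution has entropy 1 bit; a perfect split leaves pure groups with zero entropy, so the gain equals the full 1 bit, mirroring the interpretation of mutual information given in the abstract.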