Article code | Journal code | Publication year | English article | Full-text version |
---|---|---|---|---|
455766 | 695545 | 2013 | 16-page PDF | Free download |

Mutual Information (MI) has been used extensively as a measure of similarity or dependence between random variables (or parameters) in different signal and image processing applications. However, MI estimation techniques are known to exhibit a large bias and a high Mean Squared Error (MSE), and can be computationally very costly. To overcome these drawbacks, we propose a novel, fast, low-MSE histogram-based technique for estimating entropy and mutual information. By minimizing the MSE, the estimator avoids the error-accumulation problem of traditional methods. We derive an expression for the optimal number of histogram bins for MI estimation, for both continuous and discrete random variables. Experimental results on a speech recognition problem and a computer-aided diagnosis problem show that the proposed approach estimates the optimal number of selected features and yields improved classification results compared with existing approaches.
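The paper's closed-form rule for the MSE-minimizing bin count is not reproduced in this abstract, so the sketch below only illustrates the general histogram-based (plug-in) MI estimator that such a rule would feed into; `optimal_bins` is a hypothetical placeholder using the common square-root choice, not the authors' derived expression.

```python
import numpy as np

def optimal_bins(n):
    # Hypothetical stand-in for the authors' MSE-minimizing bin rule;
    # the square-root choice is used here only as a placeholder.
    return max(2, int(np.sqrt(n)))

def mutual_information(x, y, bins=None):
    """Plug-in (histogram-based) estimate of I(X; Y) in bits."""
    n = len(x)
    if bins is None:
        bins = optimal_bins(n)
    # Joint histogram -> joint probability mass function.
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / n
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of X
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of Y
    # Sum only over cells with nonzero mass to avoid log(0).
    nz = p_xy > 0
    return float(np.sum(p_xy[nz] * np.log2(p_xy[nz] / (p_x @ p_y)[nz])))

# Sanity check on correlated Gaussians: analytic MI is
# -0.5 * log2(1 - rho^2) ≈ 0.74 bits for rho = 0.8. Plug-in estimates
# are biased upward, which is the bias the paper aims to reduce.
rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = 0.8 * x + 0.6 * rng.normal(size=5000)
print(mutual_information(x, y))
```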
Highlights
► Robust estimation of entropy and mutual information from histograms is a challenging task.
► We derive a new approach for estimating the optimal number of histogram bins by minimizing the MSE.
► The proposed approach is useful in optimal feature selection and pattern recognition problems.
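As an illustration of the feature-selection use case named in the last highlight, the snippet below ranks features by their estimated MI with the class label, reusing the hypothetical `mutual_information` helper from the sketch above; the discretization of the labels and the ranking criterion are assumptions for illustration, not the paper's procedure.

```python
def rank_features_by_mi(X, labels):
    # Score each column of X (n_samples x n_features) by its estimated
    # MI with the class labels; return column indices, highest MI first.
    scores = [mutual_information(X[:, j], labels) for j in range(X.shape[1])]
    return sorted(range(X.shape[1]), key=lambda j: scores[j], reverse=True)

# Toy usage: two informative features and one pure-noise feature.
rng = np.random.default_rng(1)
labels = rng.integers(0, 2, size=2000).astype(float)
X = np.column_stack([
    labels + 0.5 * rng.normal(size=2000),   # informative
    rng.normal(size=2000),                  # pure noise
    -labels + 0.8 * rng.normal(size=2000),  # informative
])
print(rank_features_by_mi(X, labels))  # informative columns should rank first
```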
Journal: Computers & Electrical Engineering - Volume 39, Issue 3, April 2013, Pages 918–933