FEAST provides implementations of common mutual-information-based filter
feature selection algorithms, along with an implementation of RELIEF. All
functions expect discrete inputs (except RELIEF, which does not depend
on MIToolbox), and they return the indices of the selected features. These
implementations were developed to support our research into the similarities
between these algorithms, and our results are presented in the following paper:
Conditional Likelihood Maximisation: A Unifying Framework for Information Theoretic Feature Selection
G. Brown, A. Pocock, M. Lujan, M.-J. Zhao
Journal of Machine Learning Research, vol. 13, pages 27-66 (2012)

If you're interested in the datasets we used for that paper, you can download them here (28MB).
The implemented selection criteria are: mim, mrmr, mifs, cmim, jmi, disr, cife, icap, condred, cmi, relief, fcbf, betagamma
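Since every criterion except relief expects discrete data, continuous features should be discretised before being passed to feast. A minimal sketch using equal-width binning into 10 levels per feature (the binning scheme and the names X, numBins and discreteData are illustrative, not part of FEAST):

numBins = 10;
mins = min(X,[],1);                 % per-feature minimum (1 x numFeatures)
ranges = max(X,[],1) - mins;        % per-feature range
ranges(ranges == 0) = 1;            % avoid dividing by zero for constant features
% map each feature into the integer levels 1..numBins
discreteData = floor(bsxfun(@rdivide, bsxfun(@minus, X, mins), ranges) * (numBins - 1)) + 1;

The resulting matrix can then be used in place of data in the examples below.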
>> size(data)
ans =
   569    30 %% denoting 569 examples, and 30 features
>> selectedIndices = feast('jmi',5,data,labels) %% selecting the top 5 features using the jmi algorithm
selectedIndices =
28
21
8
27
23
>> selectedIndices = feast('mrmr',10,data,labels) %% selecting the top 10 features using the mrmr algorithm
selectedIndices =
28
24
22
8
27
21
29
4
7
25
>> selectedIndices = feast('mifs',5,data,labels,0.7) %% selecting the top 5 features using the mifs algorithm with beta = 0.7
selectedIndices =
28
24
22
20
29
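The returned values are column indices into the data matrix, so the selected subset can be extracted directly. A brief sketch in the same style (reducedData is an illustrative name):

>> reducedData = data(:,selectedIndices); %% keep only the 5 selected feature columns
>> size(reducedData)
ans =
   569     5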
The library is written in ANSI C for compatibility with the MATLAB MEX compiler,
except for MIM, FCBF and RELIEF, which are written in MATLAB/Octave script.
If you wish to use MIM in a C program you can use the BetaGamma function with
Beta = 0 and Gamma = 0, as this is equivalent to MIM (though slower than the script implementation).
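The same equivalence can be checked from the MATLAB interface. A sketch, assuming the betagamma criterion takes beta and gamma as trailing arguments in the same way mifs takes beta above; up to tie-breaking, both calls should select the same features:

>> feast('mim',5,data,labels)
>> feast('betagamma',5,data,labels,0.0,0.0) %% beta = 0, gamma = 0 reduces BetaGamma to MIM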
MIToolbox is required to compile these algorithms, and these implementations
supersede the example implementations given in that package (they behave more
robustly when given unexpected inputs).