Academic Journal
Authors: Somol, Petr; Novovičová, Jana; Pudil, Pavel
Subject terms: keyword:feature selection, keyword:branch & bound, keyword:sequential search, keyword:mixture model, msc:62G05, msc:62H30, msc:65C60, msc:68T10
File description: application/pdf
Relation: mr:MR2376333; zbl:Zbl 1134.62041

References:
[1] Das S.: Filters, wrappers and a boosting-based hybrid for feature selection. In: Proc. 18th Internat. Conference on Machine Learning, 2001, pp. 74–81
[2] Dash M., Choi K., Scheuermann P., Liu H.: Feature selection for clustering – a filter solution. In: Proc. Second Internat. Conference on Data Mining, 2002, pp. 15–122
[3] Devijver P. A., Kittler J.: Pattern Recognition: A Statistical Approach. Prentice-Hall, Englewood Cliffs, NJ 1982. Zbl 0542.68071, MR 0692767
[4] Ferri F. J., Pudil P., Hatef M., Kittler J.: Comparative study of techniques for large-scale feature selection. In: Pattern Recognition in Practice IV (E. S. Gelsema and L. N. Kanal, eds.), Elsevier Science B.V., 1994, pp. 403–413
[5] Fukunaga K.: Introduction to Statistical Pattern Recognition. Academic Press, New York 1990. Zbl 0711.62052, MR 1075415
[6] Graham M. W., Miller D. J.: Unsupervised learning of parsimonious mixtures on large spaces with integrated feature and component selection. IEEE Trans. Signal Process. 54 (2006), 4, 1289–1303
[7] Hodr R., Nikl J., Řeháková B., Veselý A., Zvárová J.: Possibilities of a prognostic assessment quoad vitam in low birth weight newborns. Acta Facult. Med. Univ. Brunensis 58 (1977), 345–358
[8] Chen X.: An improved branch and bound algorithm for feature selection. Pattern Recognition Lett. 24 (2003), 12, 1925–1933
[9] Jain A. K., Zongker D.: Feature selection: Evaluation, application and small sample performance. IEEE Trans. Pattern Anal. Mach. Intell. 19 (1997), 2, 153–158
[10] Jain A. K., Duin R. P. W., Mao J.: Statistical pattern recognition: A review. IEEE Trans. Pattern Anal. Mach. Intell. 22 (2000), 2, 4–37
[11] Kohavi R., John G. H.: Wrappers for feature subset selection. Artificial Intelligence 97 (1997), 1–2, 273–324. Zbl 0904.68143
[12] Kudo M., Sklansky J.: Comparison of algorithms that select features for pattern classifiers. Pattern Recognition 33 (2000), 1, 25–41
[13] Law M. H., Figueiredo M. A. T., Jain A. K.: Simultaneous feature selection and clustering using mixture models. IEEE Trans. Pattern Anal. Mach. Intell. 26 (2004), 1154–1166
[14] Liu H., Yu L.: Toward integrating feature selection algorithms for classification and clustering. IEEE Trans. Knowledge Data Engrg. 17 (2005), 491–502
[15] Mayer H. A., Somol P., Huber R., Pudil P.: Improving statistical measures of feature subsets by conventional and evolutionary approaches. In: Proc. 3rd IAPR Internat. Workshop on Statistical Techniques in Pattern Recognition, Alicante 2000, pp. 77–81. Zbl 0996.68593
[16] McKenzie P., Alder M.: Initializing the EM Algorithm for Use in Gaussian Mixture Modelling. University of Western Australia, 1994
[17] McLachlan G. J.: Discriminant Analysis and Statistical Pattern Recognition. Wiley, New York 1992. Zbl 1108.62317, MR 1190469
[18] McLachlan G. J., Peel D.: Finite Mixture Models. Wiley, New York 2000. Zbl 0963.62061, MR 1789474
[19] Murphy P. M., Aha D. W.: UCI Repository of Machine Learning Databases [ftp.ics.uci.edu]. University of California, Department of Information and Computer Science, Irvine 1994
[20] Narendra P. M., Fukunaga K.: A branch and bound algorithm for feature subset selection. IEEE Trans. Computers 26 (1977), 917–922
[21] Novovičová J., Pudil P., Kittler J.: Divergence based feature selection for multimodal class densities. IEEE Trans. Pattern Anal. Mach. Intell. 18 (1996), 2, 218–223
[22] Novovičová J., Pudil P.: Feature selection and classification by modified model with latent structure. In: Dealing With Complexity: Neural Network Approach, Springer-Verlag, Berlin 1997, pp. 126–140
[23] Pudil P., Novovičová J., Kittler J.: Floating search methods in feature selection. Pattern Recognition Lett. 15 (1994), 11, 1119–1125
[24] Pudil P., Novovičová J., Kittler J.: Feature selection based on approximation of class densities by finite mixtures of special type. Pattern Recognition 28 (1995), 1389–1398
[25] Pudil P., Novovičová J., Kittler J.: Simultaneous learning of decision rules and important attributes for classification problems in image analysis. Image Vision Computing 12 (1994), 193–198
[26] Sardo L., Kittler J.: Model complexity validation for PDF estimation using Gaussian mixtures. In: Proc. 14th Internat. Conference on Pattern Recognition, Vol. 2, 1998, pp. 195–197
[27] Sebban M., Nock R.: A hybrid filter/wrapper approach of feature selection using information theory. Pattern Recognition 35 (2002), 835–846. Zbl 0997.68115
[28] Siedlecki W., Sklansky J.: On automatic feature selection. Internat. J. Pattern Recognition Artif. Intell. 2 (1988), 2, 197–220
[29] Somol P., Pudil P., Novovičová J., Paclík P.: Adaptive floating search methods in feature selection. Pattern Recognition Lett. 20 (1999), 11–13, 1157–1163
[30] Somol P., Pudil P.: Oscillating search algorithms for feature selection. In: Proc. 15th IAPR Internat. Conference on Pattern Recognition, 2000, pp. 406–409
[31] Somol P., Pudil P.: Feature Selection Toolbox. Pattern Recognition 35 (2002), 12, 2749–2759. Zbl 1029.68606
[32] Somol P., Pudil P., Kittler J.: Fast branch & bound algorithms for optimal feature selection. IEEE Trans. Pattern Anal. Mach. Intell. 26 (2004), 7, 900–912
[33] Somol P., Pudil P., Grim J.: On prediction mechanisms in fast branch & bound algorithms. In: Lecture Notes in Computer Science 3138, Springer-Verlag, Berlin 2004, pp. 716–724. Zbl 1104.68694
[34] Somol P., Novovičová J., Pudil P.: Flexible-hybrid sequential floating search in statistical feature selection. In: Lecture Notes in Computer Science 4109, Springer-Verlag, Berlin 2006, pp. 632–639
[35] Theodoridis S., Koutroumbas K.: Pattern Recognition. Second edition. Academic Press, New York 2003. Zbl 1093.68103
[36] Wang Z., Yang J., Li G.: An improved branch & bound algorithm in feature selection. In: Lecture Notes in Computer Science 2639, Springer, Berlin 2003, pp. 549–556. Zbl 1026.68591
[37] Webb A.: Statistical Pattern Recognition. Second edition. Wiley, New York 2002. Zbl 1237.68006, MR 2191640
[38] Yu B., Yuan B.: A more efficient branch and bound algorithm for feature selection. Pattern Recognition 26 (1993), 883–889
[39] Yu L., Liu H.: Feature selection for high-dimensional data: A fast correlation-based filter solution. In: Proc. 20th Internat. Conference on Machine Learning, 2003, pp. 856–863
[40] Benda J., Zvárová J.: Systém programů TIBIS. Ústav hematologie a krevní transfuze, Praha 1975 (in Czech)
[41] Zvárová J., Perez A., Nikl J., Jiroušek R.: Data reduction in computer-aided medical decision-making. In: MEDINFO 83 (J. H. van Bemmel, M. J. Ball, and O. Wigertz, eds.), North Holland, Amsterdam 1983, pp. 450–453
[42] Zvárová J., Studený M.: Information theoretical approach to constitution and reduction of medical data. Internat. J. Medical Informatics 45 (1997), 1–2, 65–74