
WSEAS Transactions on Systems and Control

Print ISSN: 1991-8763
E-ISSN: 2224-2856

Volume 13, 2018


ILA-3: An Inductive Learning Algorithm with a New Feature Selection Approach

AUTHORS: Saleh M. Abu-Soud, Sufyan Almajali


ABSTRACT: ILA is an inductive learning algorithm that has established itself as a powerful algorithm in the inductive learning community. It generates fewer and simpler rules than comparable algorithms, with 100% accuracy; i.e., the generated rules cover all examples in the dataset. However, it becomes inefficient on large datasets. In this paper, a new generation of ILA, called ILA-3, is developed and coupled with a new feature selection algorithm. This approach takes into account the way ILA works and excludes the most irrelevant features from the dataset under consideration while ILA is running, which yields smaller datasets. Experiments show that significant efficiency improvements (reaching 30% on average) are gained with ILA-3 over the original ILA while keeping accuracy at acceptable levels.

KEYWORDS: ILA, ILA-3, Feature selection, Inductive learning, Irrelevant features.
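The abstract describes pruning irrelevant features from the dataset so that rule induction operates on fewer columns. As a rough illustration only (not the authors' actual ILA-3 procedure, which is defined in the paper itself), a filter-style relevance check using information gain can be sketched as follows; the feature names and threshold are hypothetical:

```python
from collections import Counter
import math

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, feature):
    """Reduction in label entropy obtained by splitting on one feature."""
    base, n, remainder = entropy(labels), len(labels), 0.0
    for value in set(r[feature] for r in rows):
        subset = [lab for r, lab in zip(rows, labels) if r[feature] == value]
        remainder += len(subset) / n * entropy(subset)
    return base - remainder

def drop_irrelevant(rows, labels, threshold=1e-9):
    """Keep only features whose information gain exceeds the threshold,
    returning the reduced dataset and the list of kept feature names."""
    kept = [f for f in rows[0] if information_gain(rows, labels, f) > threshold]
    return [{f: r[f] for f in kept} for r in rows], kept

# Toy dataset: "shape" fully determines the label, "color" is irrelevant.
rows = [
    {"shape": "round", "color": "red"},
    {"shape": "round", "color": "blue"},
    {"shape": "square", "color": "red"},
    {"shape": "square", "color": "blue"},
]
labels = ["yes", "yes", "no", "no"]
reduced, kept = drop_irrelevant(rows, labels)
```

On this toy data, "color" carries zero information gain and is removed, leaving the rule generator a strictly smaller table to scan on each pass; the paper's contribution is performing this kind of exclusion while ILA runs rather than as a one-off preprocessing step.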



WSEAS Transactions on Systems and Control, ISSN / E-ISSN: 1991-8763 / 2224-2856, Volume 13, 2018, Art. #21, pp. 171-185

Copyright © 2018 Author(s) retain the copyright of this article. This article is published under the terms of the Creative Commons Attribution License 4.0
