Hybrid Gradient Descent Grey Wolf Optimizer for Machine Learning Performance Enhancement

  • Sri Rossa Aisyah Puteri Baharie, Universitas Ahmad Dahlan
  • Sugiyarto Surono, Universitas Ahmad Dahlan
  • Aris Thobirin, Universitas Ahmad Dahlan
Keywords: Hybrid Gradient Descent Grey Wolf Optimizer, Hyperparameter Optimization, Diabetes Prediction, Machine Learning, Support Vector Machine (SVM)

Abstract

Advancements in machine learning have enabled the development of more accurate and efficient health prediction models. This study aims to improve diabetes prediction performance using a Support Vector Machine (SVM) model optimized with the Hybrid Gradient Descent Grey Wolf Optimizer (HGD-GWO) method. SVM is a robust machine learning algorithm for classification and regression, but its performance depends heavily on the choice of hyperparameters such as the regularization parameter (C), the kernel coefficient (γ), and the polynomial kernel degree (d). The HGD-GWO method combines gradient descent for local refinement with the Grey Wolf Optimizer for global exploration of the solution space. Using the Pima Indians Diabetes dataset, the workflow comprises normalization, hyperparameter optimization, train-test splitting, and performance evaluation with the accuracy, precision, recall, and F1-score metrics. At an 80%:20% data split, the optimized SVM achieved an accuracy of 81.17%, with precision, recall, and F1-score values of 75.00%, 57.45%, and 65.06%, respectively. These findings highlight the potential of HGD-GWO for enhancing predictive models, particularly for early diabetes detection.
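The workflow described in the abstract can be sketched in code. The snippet below is a minimal illustration only, not the authors' implementation: it uses scikit-learn's SVC, a synthetic stand-in for the Pima Indians Diabetes data, a plain Grey-Wolf-style search over (C, γ) for an RBF kernel, and it omits the gradient-descent refinement step of the hybrid method. All bounds, population sizes, and iteration counts are illustrative assumptions.

```python
# Illustrative sketch (assumptions throughout): scikit-learn SVC, a
# synthetic dataset standing in for Pima Indians Diabetes, and a
# simplified GWO loop without the paper's gradient-descent step.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score, f1_score
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=400, n_features=8, random_state=0)
X = MinMaxScaler().fit_transform(X)            # normalization step
X_tr, X_te, y_tr, y_te = train_test_split(     # 80%:20% split
    X, y, test_size=0.2, random_state=0)

def fitness(pos):
    """Validation accuracy of an SVM with hyperparameters pos = (C, gamma)."""
    model = SVC(C=pos[0], gamma=pos[1], kernel="rbf").fit(X_tr, y_tr)
    return accuracy_score(y_te, model.predict(X_te))

# Grey-Wolf-style search over (C, gamma) within illustrative bounds.
lo, hi = np.array([0.1, 1e-3]), np.array([100.0, 1.0])
wolves = rng.uniform(lo, hi, size=(6, 2))      # small wolf pack
n_iter = 10
for t in range(n_iter):
    a = 2 - 2 * t / n_iter                     # exploration factor decays 2 -> 0
    scores = np.array([fitness(w) for w in wolves])
    alpha, beta, delta = wolves[np.argsort(scores)[::-1][:3]]  # top-3 leaders
    for i in range(len(wolves)):
        new = np.zeros(2)
        for leader in (alpha, beta, delta):    # move toward each leader
            r1, r2 = rng.random(2), rng.random(2)
            A, C_vec = 2 * a * r1 - a, 2 * r2
            new += leader - A * np.abs(C_vec * leader - wolves[i])
        wolves[i] = np.clip(new / 3, lo, hi)   # average of the three pulls

best = max(wolves, key=fitness)
pred = SVC(C=best[0], gamma=best[1], kernel="rbf").fit(X_tr, y_tr).predict(X_te)
print("accuracy:", accuracy_score(y_te, pred))
print("f1:", f1_score(y_te, pred))
```

In the full hybrid method, each candidate would additionally be refined locally (a gradient-descent-style step on the fitness surface) before the next GWO update, which is what distinguishes HGD-GWO from plain GWO.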



Published
2025-02-16
How to Cite
Puteri Baharie, S. R. A., Surono, S., & Thobirin, A. (2025). Hybrid Gradient Descent Grey Wolf Optimizer for Machine Learning Performance Enhancement. Jurnal RESTI (Rekayasa Sistem Dan Teknologi Informasi), 9(1), 146–152. https://doi.org/10.29207/resti.v9i1.6203
Section
Information Technology Articles