Optimizing Neural Network Weights and Biases Using Particle Swarm Optimization for Classification Task

Authors

  • Made Agus Dwiputra, Universitas Mataram
  • I Gede Pasek Suta Wijaya, Universitas Mataram
  • I Gde Wirarama Wedashwara, Universitas Mataram

DOI:

https://doi.org/10.38043/tiers.v6i1.6839

Keywords:

Particle Swarm Optimization (PSO), Classification Machine, Weight, Biases, Neural Network

Abstract

The current digital era demands reliable automatic classification systems, especially for handling large and increasingly complex volumes of data. One attractive alternative to gradient-based training of artificial neural networks is the Particle Swarm Optimization (PSO) algorithm, which is recognized for its effective global search. Nevertheless, the performance of PSO for training artificial neural networks on complex or large-scale data remains uncertain. The primary purpose of this research is to create and assess a multilayer perceptron (MLP) classification engine whose weights and biases are generated by a PSO algorithm. The assessment was made on three different types of data: dummy data, the Iris dataset, and Sasak script images. For the dummy and Iris datasets, the model achieved 100% accuracy, demonstrating the effectiveness of the PSO-MLP approach on simpler data. The results differed significantly for the larger and more complex image dataset, however, where performance declined drastically. In the image classification test with 6 classes, the model with one hidden layer achieved 71% accuracy, while the model with two hidden layers reached only 56%. For 12-class classification, accuracy dropped to 35% and 25%, respectively, and for 18 classes, the models achieved only 27% and 7%. These results indicate that while PSO is effective in optimizing perceptron weights and biases for smaller and simpler datasets, its ability to handle large-scale image classification of increasing complexity remains limited. Therefore, further optimization strategies are needed to improve accuracy on more complex data.
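The paper's own implementation is not reproduced here, but the core idea of letting PSO search a perceptron's weight and bias space can be illustrated with a minimal sketch. The Python code below assumes NumPy and scikit-learn are available, uses the public Iris dataset as a stand-in for the paper's datasets, and picks an arbitrary hidden-layer size together with standard global-best PSO hyperparameters (inertia w, cognitive c1, social c2); none of these values are taken from the article.

# Minimal illustrative sketch (not the authors' code): training a one-hidden-layer
# MLP classifier by letting PSO search the flattened weight/bias vector.
# Dataset, layer sizes, and PSO hyperparameters below are assumptions.
import numpy as np
from sklearn.datasets import load_iris

rng = np.random.default_rng(0)
X, y = load_iris(return_X_y=True)
n_in, n_hidden, n_out = X.shape[1], 8, 3                      # assumed layer sizes
dim = n_in * n_hidden + n_hidden + n_hidden * n_out + n_out   # total weights + biases

def unpack(vec):
    # Split one flat particle position into W1, b1, W2, b2.
    i = 0
    W1 = vec[i:i + n_in * n_hidden].reshape(n_in, n_hidden); i += n_in * n_hidden
    b1 = vec[i:i + n_hidden]; i += n_hidden
    W2 = vec[i:i + n_hidden * n_out].reshape(n_hidden, n_out); i += n_hidden * n_out
    b2 = vec[i:]
    return W1, b1, W2, b2

def loss(vec):
    # Fitness = misclassification rate of the MLP encoded by this particle.
    W1, b1, W2, b2 = unpack(vec)
    h = np.tanh(X @ W1 + b1)                 # hidden layer activation
    logits = h @ W2 + b2
    return np.mean(np.argmax(logits, axis=1) != y)

# Standard global-best PSO with inertia weight (assumed settings).
n_particles, n_iter, w, c1, c2 = 30, 200, 0.7, 1.5, 1.5
pos = rng.uniform(-1, 1, (n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([loss(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos += vel
    vals = np.array([loss(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print("training error of best particle:", pbest_val.min())

Each particle encodes every weight and bias of the network as one flat vector, so the swarm's best position directly yields a trained classifier without any gradient computation; this is the general scheme the paper evaluates, independent of the specific datasets and settings assumed above.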




Published

2025-09-12

How to Cite

Dwiputra MA, Wijaya IGPS, Wedashwara IGW. Optimizing Neural Network Weights and Biases Using Particle Swarm Optimization for Classification Task. TIERS [Internet]. 2025 Sep. 12 [cited 2025 Sep. 15];6(1):112-28. Available from: https://journal.undiknas.ac.id/index.php/tiers/article/view/6839
