Big Data and Cognitive Computing, Volume 9, Issue 7, 01/07/2025

Laor Initialization: A New Weight Initialization Method for the Backpropagation of Deep Learning

Laor Boongasame, Jirapond Muangprathub, Karanrat Thammarak

Abstract

This paper presents Laor Initialization, an innovative weight initialization technique for deep neural networks that uses forward-pass error feedback in conjunction with k-means clustering to optimize the initial weights. In contrast to traditional methods, Laor adopts a data-driven approach that enhances the stability and efficiency of convergence. The method was assessed on several datasets, including a gold price time series, MNIST, and CIFAR-10, across CNN and LSTM architectures. The results indicate that Laor Initialization achieved the lowest K-fold cross-validation RMSE (0.00686), surpassing Xavier, He, and random initialization. Laor demonstrated a high convergence success rate (final RMSE = 0.00822) and the narrowest interquartile range (IQR), indicating superior stability. Gradient analysis confirmed Laor's robustness, with the lowest coefficients of variation (CV = 0.2230 for MNIST, 0.3448 for CIFAR-10, and 0.5997 for gold price) and zero vanishing layers in the CNNs. Laor achieved a 24% reduction in CPU training time on the gold price data and the fastest runtime on MNIST (340.69 s), while maintaining efficiency on CIFAR-10 (317.30 s). It performed optimally with a batch size of 32 and a learning rate between 0.001 and 0.01. These findings establish Laor as a robust alternative to conventional methods, suitable for moderately deep architectures. Future research should focus on dynamic variance scaling and adaptive clustering.
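The abstract compares Laor against the standard Xavier and He schemes and describes a data-driven pipeline that combines forward-pass error feedback with k-means clustering. The sketch below shows the two well-known baselines, followed by one hypothetical way such a clustering-based initializer could be structured. The paper's actual algorithm is not given in this record, so `kmeans_error_init`, its candidate count, its scoring, and its cluster-selection rule are illustrative assumptions only:

```python
import numpy as np

rng = np.random.default_rng(0)

def xavier_init(fan_in, fan_out):
    # Glorot/Xavier uniform: U(-limit, +limit), limit = sqrt(6/(fan_in+fan_out))
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

def he_init(fan_in, fan_out):
    # He normal: N(0, sqrt(2/fan_in)), intended for ReLU layers
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))

def kmeans_error_init(X, y, fan_in, fan_out, n_candidates=20, k=4):
    # HYPOTHETICAL sketch, not the published Laor algorithm:
    # 1) draw candidate weight matrices,
    # 2) score each by a single forward-pass error on (X, y),
    # 3) cluster the candidates with plain k-means (Lloyd's algorithm),
    # 4) return the centroid of the lowest-error cluster.
    cands = rng.normal(0.0, np.sqrt(1.0 / fan_in),
                       size=(n_candidates, fan_in * fan_out))
    errs = np.array([np.mean((X @ w.reshape(fan_in, fan_out) - y) ** 2)
                     for w in cands])
    centroids = cands[rng.choice(n_candidates, k, replace=False)].copy()
    for _ in range(10):
        dists = ((cands[:, None, :] - centroids[None]) ** 2).sum(axis=-1)
        labels = np.argmin(dists, axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = cands[labels == j].mean(axis=0)
    # pick the cluster whose members have the lowest mean forward-pass error
    best = min(range(k),
               key=lambda j: errs[labels == j].mean()
               if np.any(labels == j) else np.inf)
    return centroids[best].reshape(fan_in, fan_out)
```

For a linear layer with 8 inputs and 4 outputs, `kmeans_error_init(X, y, 8, 4)` returns an 8x4 matrix whose entries reflect the error structure of the data, whereas `xavier_init` and `he_init` depend only on the layer's fan-in/fan-out, which is the contrast the abstract draws between data-driven and purely shape-driven initialization.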

Document Type

Article

Source Type

Journal

Keywords

deep learning; k-means clustering; Laor Initialization; neural network; weight initialization

ASJC Subject Area

Computer Science: Artificial Intelligence; Computer Science: Information Systems; Computer Science: Computer Science Applications; Business, Management and Accounting: Management Information Systems

Funding Agency

King Mongkut's Institute of Technology Ladkrabang


Bibliography


Boongasame, L., Muangprathub, J., & Thammarak, K. (2025). Laor Initialization: A New Weight Initialization Method for the Backpropagation of Deep Learning. Big Data and Cognitive Computing, 9(7). https://doi.org/10.3390/bdcc9070181
