Optimization of a Backpropagation Artificial Neural Network with an Adaptive Learning Rate for Predicting Rice Harvest Yields

P Prihandoko(1*), Putrama Alkhairi(2)

(1) Universitas Gunadarma, Depok, Indonesia
(2) STIKOM Tunas Bangsa, Pematangsiantar, Indonesia
(*) Corresponding Author

Abstract


Artificial Neural Networks (ANN) with the Backpropagation algorithm have been widely applied across various domains, including data prediction tasks. However, one of the primary challenges in implementing Backpropagation is the selection of an optimal learning rate: a learning rate that is too high can lead to unstable convergence, while one that is too low can significantly slow down the training process. To address this issue, this study proposes an optimization of Backpropagation using an Adaptive Learning Rate through the implementation of the Adam optimizer. The objective of this research is to compare the performance of Standard Backpropagation and Backpropagation with the Adam optimizer in predicting rice harvest yields from rainfall, temperature, and humidity variables. The dataset consists of 100 synthetic samples generated from a normal distribution to resemble real-world data. The results show that the Adam optimizer improves the performance of the ANN model compared to Standard Backpropagation: model accuracy increased from 92.04% to 92.99%, while the loss, Mean Squared Error (MSE), and Root Mean Squared Error (RMSE) decreased significantly, indicating that the model optimized with Adam is more stable and yields lower prediction errors. Therefore, Adaptive Learning Rate optimization using the Adam optimizer is shown to be effective in enhancing both the accuracy and efficiency of ANN in data prediction tasks.
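The comparison described in the abstract can be sketched in a few dozen lines of numpy. The following is an illustrative reconstruction, not the authors' exact setup: the network sizes, learning rates, and the synthetic yield formula are assumptions. It trains the same small feed-forward network twice on 100 normally distributed samples of three weather features, once with plain fixed-rate gradient descent (standard Backpropagation) and once with Adam's per-parameter adaptive learning rate, then reports the final MSE of each.

```python
import numpy as np

rng = np.random.default_rng(0)

# 100 synthetic samples from a normal distribution, as in the paper.
# Columns stand in for standardized rainfall, temperature, humidity.
X = rng.normal(0.0, 1.0, size=(100, 3))
true_w = np.array([0.6, -0.4, 0.3])          # assumed yield signal
y = X @ true_w + 0.1 * rng.normal(size=100)  # noisy target

def init_params():
    r = np.random.default_rng(1)
    return {"W1": r.normal(0, 0.5, size=(3, 8)), "b1": np.zeros(8),
            "W2": r.normal(0, 0.5, size=(8, 1)), "b2": np.zeros(1)}

def forward(p, X):
    h = np.tanh(X @ p["W1"] + p["b1"])       # hidden layer
    return h, (h @ p["W2"] + p["b2"]).ravel()

def gradients(p, X, y):
    # Backpropagation of the MSE loss through the two layers.
    h, pred = forward(p, X)
    n = len(y)
    err = (pred - y)[:, None]
    gW2 = h.T @ err * (2 / n)
    gb2 = err.sum(0) * (2 / n)
    dh = err @ p["W2"].T * (1 - h ** 2)      # tanh derivative
    gW1 = X.T @ dh * (2 / n)
    gb1 = dh.sum(0) * (2 / n)
    return {"W1": gW1, "b1": gb1, "W2": gW2, "b2": gb2}

def mse(p):
    _, pred = forward(p, X)
    return float(np.mean((pred - y) ** 2))

# Standard Backpropagation: one fixed learning rate for every weight.
p_sgd = init_params()
for _ in range(500):
    g = gradients(p_sgd, X, y)
    for k in p_sgd:
        p_sgd[k] -= 0.01 * g[k]

# Adam: adapts the step per parameter via first/second moment estimates.
p_adam = init_params()
m = {k: np.zeros_like(w) for k, w in p_adam.items()}
v = {k: np.zeros_like(w) for k, w in p_adam.items()}
beta1, beta2, eps, lr = 0.9, 0.999, 1e-8, 0.01
for t in range(1, 501):
    g = gradients(p_adam, X, y)
    for k in p_adam:
        m[k] = beta1 * m[k] + (1 - beta1) * g[k]
        v[k] = beta2 * v[k] + (1 - beta2) * g[k] ** 2
        m_hat = m[k] / (1 - beta1 ** t)      # bias correction
        v_hat = v[k] / (1 - beta2 ** t)
        p_adam[k] -= lr * m_hat / (np.sqrt(v_hat) + eps)

print(f"MSE plain backprop: {mse(p_sgd):.4f}")
print(f"MSE Adam:           {mse(p_adam):.4f}")
```

With identical initial weights and the same nominal learning rate, Adam's effective step size grows where the running second-moment estimate is small, which is the mechanism behind the faster, more stable convergence the paper reports.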





DOI: http://dx.doi.org/10.30645/jurasik.v10i1.887

DOI (PDF): http://dx.doi.org/10.30645/jurasik.v10i1.887.g861




JURASIK (Jurnal Riset Sistem Informasi dan Teknik Informatika)
