Universiti Teknologi Malaysia Institutional Repository

Transformer in mRNA degradation prediction.

Yit, Tan Wen and Hassan, Rohayanti and Zakaria, Noor Hidayah and Kasim, Shahreen and Moi, Sim Hiew and Khairuddin, Alif Ridzuan and Amnur, Hidra (2023) Transformer in mRNA degradation prediction. International Journal on Informatics Visualization, 7 (2). pp. 588-599. ISSN 2549-9904


Official URL: http://dx.doi.org/10.30630/joiv.7.2.1165

Abstract

The unstable properties of mRNA and the advantages of mRNA vaccines have encouraged many experts worldwide to tackle the degradation problem. Machine learning models are widely applied in bioinformatics and healthcare to gain insights from biological data; thus, machine learning plays an important role in predicting the degradation rate of mRNA vaccine candidates. Stanford University held the OpenVaccine Challenge competition on Kaggle to gather top solutions to this problem, with multi-column root mean square error (MCRMSE) used as the main performance metric. The Nucleic Transformer has been proposed as a deep learning solution that combines a self-attention mechanism with a Convolutional Neural Network (CNN). This paper enhances the existing Nucleic Transformer by using the AdaBelief or RangerAdaBelief optimizer together with a proposed decoder consisting of a normalization layer between two linear layers. Experimental results show that the enhanced Nucleic Transformer outperforms the existing solution. In this study, the AdaBelief optimizer performs better than the RangerAdaBelief optimizer, even though the latter possesses Ranger's advantages. The advantage of the proposed decoder is only evident when data is limited; when data is sufficient, its performance is similar, yet still better than that of the linear decoder if and only if the AdaBelief optimizer is used. Overall, the combination of the AdaBelief optimizer and the proposed decoder performs best, with performance boosts of 2.79% and 1.38% in public and private MCRMSE, respectively.
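The MCRMSE metric mentioned above is the mean of the per-column root mean square errors over the scored targets. A minimal pure-Python sketch (the function and variable names are illustrative, not taken from the paper or the competition code):

```python
import math

def mcrmse(y_true, y_pred):
    """Multi-column RMSE: average the per-column RMSE over all target columns.

    y_true, y_pred: lists of rows, each row a list of target values
    (e.g. per-position degradation rates for several scored properties).
    """
    n_rows = len(y_true)
    n_cols = len(y_true[0])
    total = 0.0
    for j in range(n_cols):
        # sum of squared errors for column j across all rows
        sse = sum((y_true[i][j] - y_pred[i][j]) ** 2 for i in range(n_rows))
        total += math.sqrt(sse / n_rows)  # RMSE of column j
    return total / n_cols  # mean over columns
```

For example, predictions that are off by exactly 1 in every cell give an MCRMSE of 1.0, and perfect predictions give 0.0.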
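The proposed decoder, as described in the abstract, places a normalization layer between two linear layers. A minimal NumPy sketch of that shape is given below; all dimensions, weight initializations, and names are illustrative assumptions, not the authors' actual configuration:

```python
import numpy as np

# Illustrative sizes, not the paper's hyperparameters.
d_model, d_out, seq_len = 32, 5, 10
rng = np.random.default_rng(0)

W1, b1 = rng.normal(size=(d_model, d_model)), np.zeros(d_model)
W2, b2 = rng.normal(size=(d_model, d_out)), np.zeros(d_out)

def layer_norm(x, eps=1e-5):
    # Normalize each position's features to zero mean and unit variance.
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def decoder(h):
    # linear -> normalization -> linear, per the abstract's description
    return layer_norm(h @ W1 + b1) @ W2 + b2

h = rng.normal(size=(seq_len, d_model))   # stand-in encoder output
out = decoder(h)                          # one prediction vector per position
```

This produces one `d_out`-dimensional prediction per sequence position, matching the per-position degradation targets of the task; a trainable version would use a framework such as PyTorch with learnable scale and shift in the normalization.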

Item Type: Article
Uncontrolled Keywords: AdaBelief; optimizer; Transformer
Subjects: T Technology > T Technology (General)
T Technology > T Technology (General) > T58.6-58.62 Management information systems
Divisions: Computing
ID Code: 105942
Deposited By: Muhamad Idham Sulong
Deposited On: 26 May 2024 09:20
Last Modified: 26 May 2024 09:20
