SbrPBert: A BERT-Based Model for Accurate Security Bug Report Prediction

Published: 01 Jan 2022, Last Modified: 16 May 2023. DSN Workshops 2022.
Abstract: Bidirectional Encoder Representations from Transformers (BERT) has achieved impressive performance on several Natural Language Processing (NLP) tasks. However, there has been limited investigation into guidelines for adapting it to specialized fields. Here we focus on the software security domain. Early identification of security-related reports among software bug reports is an essential means of preventing security incidents. However, the prediction of security bug reports (SBRs) is hampered by the scarcity and imbalance of samples in this field and by the complex characteristics of SBRs. Motivated by this, we constructed the largest dataset in this field and propose a Security Bug Report Prediction Model Based on BERT (SbrPBert). By introducing a layer-wise learning rate decay strategy and a fine-tuning method that freezes some layers, our model outperforms the baseline model on both our dataset and other small-sample datasets. This demonstrates the model's practical value for bug tracking systems and for projects that lack samples. Moreover, our model has so far detected 56 hidden vulnerabilities through deployment on the Mozilla and Red Hat projects.
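The two fine-tuning techniques named in the abstract, layer-wise learning rate decay and freezing of lower layers, can be illustrated with a minimal sketch. The function below computes a per-layer learning rate for a BERT-style encoder; the layer count, base rate, decay factor, and number of frozen layers are illustrative assumptions, not the authors' reported configuration.

```python
def layer_learning_rates(num_layers=12, base_lr=2e-5, decay=0.95, freeze_below=4):
    """Per-layer learning rates for a BERT-style encoder (illustrative values).

    Layers below `freeze_below` are frozen (rate 0.0). The top layer keeps
    `base_lr`; each layer further down is multiplied by `decay` once more,
    so lower layers change less during fine-tuning.
    """
    rates = []
    for i in range(num_layers):
        if i < freeze_below:
            rates.append(0.0)  # frozen layer: parameters are not updated
        else:
            # distance from the top layer determines how many decay steps apply
            rates.append(base_lr * decay ** (num_layers - 1 - i))
    return rates

rates = layer_learning_rates()
```

In practice these rates would be passed to an optimizer as per-parameter-group learning rates, one group per encoder layer, with the frozen layers' parameters excluded from the optimizer entirely.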