Liu Dong, Weng Haiguang, Chen Yimin. A BERT-BiGRU-WCELoss classification model for handling severely unbalanced short alert text data[J]. Computer Applications and Software, 2024, 41(9): 217-223, 229. DOI: 10.3969/j.issn.1000-386x.2024.09.031

A BERT-BIGRU-WCELOSS CLASSIFICATION MODEL FOR HANDLING SEVERELY UNBALANCED SHORT ALERT TEXT DATA

  • To address the extremely short text length and severely imbalanced class distribution of 110 emergency-call (alert) text data, this paper proposes a BERT-BiGRU-WCELoss alert classification model. The model extracts text semantics with a Chinese pre-trained BERT (Bidirectional Encoder Representations from Transformers) model, uses a BiGRU (Bidirectional Gated Recurrent Unit) to fully capture the semantic features of the text, and assigns larger loss weights to minority-class samples through an optimized adaptive weighted loss function, WCELoss (Weighted Cross-Entropy Loss). Experimental results show that the model achieves a classification accuracy of 95.83% on a 110 alert dataset from one calendar month of 2015 in a certain city, with higher accuracy, recall, F1 score, and G-Mean than traditional deep learning models and models trained with the standard cross-entropy loss.
