GRAMMATICAL ERROR CORRECTION BY TRANSFERRING LEARNING BASED ON PRE-TRAINED LANGUAGE MODEL



Grammatical error correction (GEC) is a low-resource task: annotation is costly and training is time-consuming. In this paper, MASS-GEC is proposed to address this problem by transfer learning from a pre-trained language generation model, MAsked Sequence to Sequence pre-training for language generation (MASS). In addition, specific preprocessing and postprocessing strategies are applied to improve the performance of the GEC system. Finally, the system is evaluated on two public datasets and achieves competitive performance compared with state-of-the-art work under limited resources.

Specifically, the system achieves 57.9 in terms of F0.5 score, which weights precision more heavily, on the CoNLL-2014 task. On the JFLEG task, MASS-GEC achieves 59.1 in terms of GLEU score, which measures the n-gram overlap between the model's output and the manually annotated corrections. This paper provides a new perspective: the low-resource problem in GEC can be addressed well by transferring general language knowledge from a self-supervised pre-trained language model.
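As a rough illustration of the F0.5 metric mentioned above (a sketch of the standard F-beta formula, not code from the paper): with beta = 0.5, precision is weighted more heavily than recall, so a precision-heavy system scores higher than a recall-heavy one with the same values swapped.

```python
def f_beta(precision: float, recall: float, beta: float = 0.5) -> float:
    """Standard F-beta score; beta < 1 emphasizes precision.

    Returns 0.0 when both inputs are zero to avoid division by zero.
    """
    if precision == 0.0 and recall == 0.0:
        return 0.0
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# With beta = 0.5, swapping precision and recall changes the score:
print(round(f_beta(0.8, 0.4), 4))  # precision-heavy → 0.6667
print(round(f_beta(0.4, 0.8), 4))  # recall-heavy   → 0.4444
```

GLEU, by contrast, is computed from n-gram matches against human-written corrections and is more involved; the precision/recall values here are hypothetical and only demonstrate why F0.5 "emphasizes precision."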
