Published In
International Journal of Advances in Electronics and Computer Science (IJAECS)

Issue
Volume-9, Issue-7 (Jul, 2022)
Paper Title
Survey of BERT-Based Models for Question Answering
Author Name
Praveen Thenraj Gunasekaran, Sabari Rajan K, Selvakuberan Karupasamy, Subhashini Lakshminarayanan
Affiliation
Accenture, Chennai, India
Pages
9-12
Abstract
Abstract - Question answering is an integral part of day-to-day human activity. Humans, with their ability to reason about questions, understand context, and handle semantics, perform this task with ease. With advances in Natural Language Processing and transfer learning, machines have come a long way in solving extractive question-answering tasks effectively, and several transformer-based pre-trained models now address this task with increasing effectiveness. In this paper, we perform experiments to compare pre-trained BERT models on extractive QA tasks and identify the best-performing BERT model for a given dataset. We use Exact Match (EM) and F1-score as the metrics to evaluate these pre-trained models.

Keywords - Natural Language Processing, Extractive Question-Answering, Pre-Trained BERT Models, Transfer Learning
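The abstract names Exact Match (EM) and F1-score as the evaluation metrics. The paper itself does not reproduce its evaluation code here, but these metrics are conventionally computed SQuAD-style, comparing a predicted answer span against a gold answer after light normalization. A minimal sketch, assuming that convention:

```python
import re
import string
from collections import Counter

def normalize(text: str) -> str:
    """SQuAD-style normalization: lowercase, strip punctuation,
    drop articles (a/an/the), and collapse whitespace."""
    text = text.lower()
    text = "".join(ch for ch in text if ch not in string.punctuation)
    text = re.sub(r"\b(a|an|the)\b", " ", text)
    return " ".join(text.split())

def exact_match(prediction: str, ground_truth: str) -> int:
    """EM is 1 if the normalized strings are identical, else 0."""
    return int(normalize(prediction) == normalize(ground_truth))

def f1_score(prediction: str, ground_truth: str) -> float:
    """Token-level F1: harmonic mean of precision and recall over
    the overlapping tokens of the two normalized answers."""
    pred_tokens = normalize(prediction).split()
    gold_tokens = normalize(ground_truth).split()
    common = Counter(pred_tokens) & Counter(gold_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(gold_tokens)
    return 2 * precision * recall / (precision + recall)
```

For example, a prediction of "the BERT model" against a gold answer of "BERT model" scores EM = 1 after normalization, while a partially overlapping prediction earns a fractional F1. Dataset-level scores are the averages of these per-example values.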