Integrating Syntactic Embeddings with Transformer Models for Dependency Parsing Enhancement

Date

2024-05

Publisher

The Ohio State University

Abstract

The introduction of transformer models has brought about a revolutionary shift in Natural Language Processing, setting new benchmarks across a wide range of linguistic tasks. However, dependency parsing, a critical component for understanding syntactic relationships within sentences, has predominantly relied on supervised learning techniques, often employing neural models such as LSTMs. In this thesis, I present an engineering effort aimed at enhancing the performance of the popular transformer model BERT on dependency parsing tasks. Unlike conventional methods that rely solely on the transformer's ability to capture contextual information, this project proposes incorporating syntactic embeddings generated by non-transformer-based parsers into BERT's input representations. This integration aims to blend explicit syntactic insights with the rich semantic understanding of transformer models. The resulting model demonstrates a slight improvement in overall parsing accuracy, with an increase in labeled attachment score (LAS) of approximately 1 percentage point over the traditional parser and 0.5 percentage points over the baseline BERT model.
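
The thesis's exact architecture is not reproduced here, but a minimal sketch of the general idea described above, adding parser-derived syntactic embeddings to BERT's wordpiece embeddings before the encoder, might look as follows. The class name SyntaxAugmentedBert, the additive combination, and the syntax_label_ids input (label indices produced by an external non-transformer parser) are illustrative assumptions, not the author's implementation.

```python
import torch.nn as nn
from transformers import BertModel


class SyntaxAugmentedBert(nn.Module):
    """Hypothetical sketch: inject syntactic embeddings into BERT's input
    representations, in the spirit of the approach described in the abstract."""

    def __init__(self, num_syntax_labels: int, model_name: str = "bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        hidden_size = self.bert.config.hidden_size
        # One learned vector per syntactic label (e.g., a POS tag or dependency
        # relation predicted by an external, non-transformer parser).
        self.syntax_embeddings = nn.Embedding(num_syntax_labels, hidden_size)

    def forward(self, input_ids, attention_mask, syntax_label_ids):
        # Look up BERT's wordpiece embeddings for the input tokens.
        word_embeds = self.bert.embeddings.word_embeddings(input_ids)
        # Blend the explicit syntactic signal with the wordpiece embeddings;
        # BERT adds position and token-type embeddings internally when
        # inputs_embeds is supplied instead of input_ids.
        inputs_embeds = word_embeds + self.syntax_embeddings(syntax_label_ids)
        outputs = self.bert(inputs_embeds=inputs_embeds, attention_mask=attention_mask)
        # Contextual representations for a downstream parsing head
        # (e.g., a biaffine arc/label scorer) would be taken from here.
        return outputs.last_hidden_state
```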

Keywords

computational linguistics, BERT, syntax, dependency parsing, transformer language model
