
# Arabic_BERT-Based_Dependency_Parsing

This repository contains the code for the paper “Parsing as Pretraining” by Vilares et al., which proposes a method for improving natural language understanding models by using syntactic parsing as a pretraining task. It also includes our modifications for the experiments in our article “Fine-Tuning BERT-Based Pre-Trained Models for Arabic Dependency Parsing”, where we evaluated the approach on different Arabic treebanks with several Arabic BERT models.
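The key idea behind this line of work is to cast dependency parsing as sequence labeling: every word receives a single tag that encodes where its head is and which relation links them, so an off-the-shelf token-classification model can predict the whole tree. Below is a minimal, illustrative sketch of one such encoding (a plain relative-offset scheme combined with the dependency relation); the papers use more elaborate encodings, so treat this only as a conceptual example.

```python
# Illustrative encoding of a dependency tree as one label per token.
# This is a simplified relative-offset scheme, not the repository's exact one.

def encode_tree(heads, deprels):
    """heads[i] is the 1-based head index of token i+1 (0 = root)."""
    labels = []
    for i, (head, rel) in enumerate(zip(heads, deprels), start=1):
        offset = head - i  # relative position of the head w.r.t. the token
        labels.append(f"{offset}@{rel}")
    return labels

def decode_tree(labels):
    """Invert encode_tree: recover (heads, deprels) from the tag sequence."""
    heads, deprels = [], []
    for i, label in enumerate(labels, start=1):
        offset, rel = label.split("@", 1)
        heads.append(i + int(offset))
        deprels.append(rel)
    return heads, deprels

# Example: a 3-token sentence with heads [2, 3, 0]
print(encode_tree([2, 3, 0], ["det", "nsubj", "root"]))
# ['1@det', '1@nsubj', '-3@root']
```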

## Colaboratory file

Fine-tune AraBERTv02 to perform sequence-labeling dependency parsing on the ArPoT and PADT treebanks: HERE
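For orientation, the fine-tuning step amounts to standard token classification over the encoded tree labels. The sketch below shows one training step with the Hugging Face `transformers` library; the checkpoint id, label ids, and vocabulary size are assumptions for illustration, and the actual notebook's training loop may differ.

```python
# A hedged sketch of fine-tuning an Arabic BERT model for
# sequence-labeling dependency parsing (assumed setup, not the
# repository's exact code).
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

MODEL_NAME = "aubmindlab/bert-base-arabertv02"  # assumed AraBERTv02 checkpoint id
NUM_LABELS = 300  # size of the encoded-label vocabulary (treebank-dependent)

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForTokenClassification.from_pretrained(
    MODEL_NAME, num_labels=NUM_LABELS
)

# One training step on a toy sentence; in practice each word's label id
# comes from encoding the treebank tree (see the sketch above).
words = ["القط", "ينام"]  # "the cat sleeps"
label_ids = [7, 3]        # hypothetical ids for the two encoded tags
enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")

# Label only the first subword of each word; -100 is ignored by the loss.
labels = torch.full(enc["input_ids"].shape, -100)
prev = None
for pos, wid in enumerate(enc.word_ids()):
    if wid is not None and wid != prev:
        labels[0, pos] = label_ids[wid]
    prev = wid

loss = model(**enc, labels=labels).loss
loss.backward()  # followed by an optimizer step in a real training loop
```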