Multi-Evidence based Fact Verification via A Confidential Graph Neural Network

Source code for our IEEE Transactions on Big Data (TBD) 2024 paper: Multi-Evidence based Fact Verification via A Confidential Graph Neural Network.

Click the links below to view our paper and checkpoints.

If you find this work useful, please cite our paper and give us a shining star 🌟

@article{Lan2024MultiEvidenceBF,
  title={Multi-Evidence based Fact Verification via A Confidential Graph Neural Network},
  author={Yuqing Lan and Zhenghao Liu and Yu Gu and Xiaoyuan Yi and Xiaohua Li and Liner Yang and Ge Yu},
  journal={IEEE Transactions on Big Data},
  year={2024},
  url={https://api.semanticscholar.org/CorpusID:269899642}
}

Overview

CO-GAT adds a node representation masking mechanism before graph reasoning, which controls how much evidence information flows into the graph reasoning model.

Figure: Overview of the CO-GAT architecture.
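For intuition, here is a minimal sketch of a confidence-gated node masking step applied before a graph reasoning layer. All names (ConfidenceGate, node_reprs, the 768-dimensional hidden size) are hypothetical simplifications and do not mirror the repository's actual API; the real implementation lives in the cogat-fever and cogat-scifact folders.

# A hypothetical, simplified sketch of CO-GAT's node masking idea; names and
# shapes are illustrative, not the repository's actual API.
import torch
import torch.nn as nn

class ConfidenceGate(nn.Module):
    """Scores each evidence node and masks its representation accordingly."""
    def __init__(self, hidden_size):
        super().__init__()
        self.scorer = nn.Linear(hidden_size, 1)

    def forward(self, node_reprs):
        # node_reprs: (num_nodes, hidden_size) evidence node representations
        confidence = torch.sigmoid(self.scorer(node_reprs))  # (num_nodes, 1)
        # Down-weight low-confidence nodes before graph reasoning, so noisy
        # evidence contributes less information to the reasoning step.
        return confidence * node_reprs

gate = ConfidenceGate(hidden_size=768)
nodes = torch.randn(5, 768)       # e.g. 5 retrieved evidence sentences
masked_nodes = gate(nodes)        # passed on to the graph reasoning layers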

Requirements

Install the following packages using pip or conda in your environment:

Python==3.7
PyTorch
transformers
prettytable
scikit-learn
jsonlines
pandas

We also provide a requirements.txt file listing the versions of all packages we used; if you have any problems configuring the environment, please refer to it.
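As an optional sanity check (not part of the repository), the snippet below simply verifies that the listed packages import and prints their versions:

# Optional environment check; module names are the import names of the
# dependencies listed above (scikit-learn imports as sklearn).
import sys
import torch
import transformers
import prettytable
import sklearn
import jsonlines
import pandas

print("Python:", sys.version.split()[0])          # expected 3.7.x
print("PyTorch:", torch.__version__)
print("Transformers:", transformers.__version__)
print("CUDA available:", torch.cuda.is_available())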

Reproduce CO-GAT

Download Code & Dataset

  • First, use git clone to download this project:
git clone https://github.com/NEUIR/CO-GAT
cd CO-GAT
  • Download link for FEVER
  • Download link for SCIFACT (CO-GAT)
  • Place the downloaded datasets in the data folder:
data/
├──fever/
│   ├── bert_train.json
│   ├── bert_dev.json
│   ├── bert_test.json
│   ├── bert_eval.json
│   ├── dev_eval.json
│   └── all_test.json
└──scifact/
    ├── prediction
    ├── corpus.jsonl
    ├── train_cogat.json
    ├── dev_cogat.json
    ├── claims_dev.json
    └── claim_test.json
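As a quick placement check (optional and not part of the repository), you can count the records in the SCIFACT corpus, which is a JSON Lines file with one document per line; the schemas of the other files are defined by the preprocessing scripts, so nothing beyond the file path is assumed here.

# Optional check that the data files are where the scripts expect them.
# Only assumes corpus.jsonl is JSON Lines (one JSON object per line).
import jsonlines

with jsonlines.open("data/scifact/corpus.jsonl") as reader:
    num_docs = sum(1 for _ in reader)

print(f"Found {num_docs} documents in data/scifact/corpus.jsonl")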

Train CO-GAT

This section shows how to reproduce the results reported in the CO-GAT paper.

  • For the FEVER dataset: Go to the cogat-fever folder and train the CO-GAT model checkpoint:
cd cogat-fever
bash train_twostep.sh
  • For the SCIFACT dataset: Go to the cogat-scifact folder and train the CO-GAT model checkpoint:
cd cogat-scifact
bash train.sh

Evaluate Prediction Effectiveness

  • These experimental results are shown in Table 3 of our paper.
  • Go to the cogat-fever or cogat-scifact folder and evaluate model performance as follows:
bash test.sh
bash inference.sh

Results

The results are shown as follows.

  • FEVER
Split   Model                      ACC     F1
DEV     CO-GAT (ELECTRA-base)      78.84   76.77
DEV     CO-GAT (ELECTRA-large)     81.65   79.32
TEST    CO-GAT (ELECTRA-base)      74.56   71.43
TEST    CO-GAT (ELECTRA-large)     77.27   73.59
  • SCIFACT
Split   Model                      PREC-S  REC-S   F1-S    PREC-A  REC-A   F1-A
DEV     CO-GAT (ELECTRA-base)      63.39   38.80   48.14   72.00   43.06   53.89
DEV     CO-GAT (ELECTRA-large)     71.49   48.63   57.89   79.58   54.07   64.39
TEST    CO-GAT (ELECTRA-base)      58.08   40.81   47.94   67.11   45.05   53.91
TEST    CO-GAT (ELECTRA-large)     55.31   47.84   51.30   69.64   52.70   60.00
(-S: sentence-level metrics; -A: abstract-level metrics)

Contact

If you have questions, suggestions, or bug reports, please email:

lanyuqing@stumail.neu.edu.cn     
