[AAAI 2025] Neural-Symbolic Collaborative Distillation: Advancing Small Language Models for Complex Reasoning Tasks


NesyCD

🛠 Requirements

Install LLaMA-Factory by following the instructions in the LLaMA-Factory repository, then install this project in editable mode:

```shell
cd NesyCD
pip install -e ".[torch,metrics]"
```

Install pyserini by following the official pyserini installation guide.
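As a minimal sketch (the official pyserini guide covers the full setup, including the Java requirement), installation typically amounts to:

```shell
# pyserini requires a recent JDK for its Lucene backend; verify Java first
java -version

# Install pyserini itself (a CPU build of faiss is commonly installed
# alongside it for dense retrieval; adjust to your environment)
pip install pyserini
pip install faiss-cpu
```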

💡 Data

  • Download the datasets from their official websites.

  • Alternatively, download them from Google Drive (we unified the formats of the above datasets): Link

🤝 Referencing and Citing

If you find our work useful in your research, please consider giving this repository a star and citing our paper as follows:

```bibtex
@article{liao2024neural,
  title={Neural-Symbolic Collaborative Distillation: Advancing Small Language Models for Complex Reasoning Tasks},
  author={Liao, Huanxuan and He, Shizhu and Xu, Yao and Zhang, Yuanzhe and Liu, Kang and Zhao, Jun},
  journal={arXiv preprint arXiv:2409.13203},
  year={2024}
}
```
