Fine-tuning AraBERT for multi-label text classification #67
-
Hello, I am following the example here to fine-tune AraBERT for multi-label classification.
I could not figure out how to pass the labels. Do I need to one-hot encode them? Any ideas?
Replies: 1 comment
-
I don't think that notebook works for your case, but here is an older notebook that implements multi-label classification with BERT: https://github.com/abhimishra91/transformers-tutorials/blob/master/transformers_multi_label_classification.ipynb You would need to modify it a bit to add the AraBERT preprocessing.
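As for the encoding question: for multi-label classification the targets are usually *multi-hot* float vectors (one slot per label, several slots can be 1), since `BCEWithLogitsLoss` expects float targets of the same shape as the logits. Here is a minimal sketch of that encoding; the label names are hypothetical placeholders, not from AraBERT:

```python
import torch

# Hypothetical label set; replace with your own categories.
LABELS = ["politics", "sports", "economy", "culture"]
label2id = {name: i for i, name in enumerate(LABELS)}

def multi_hot(tags):
    """Encode a list of label names as a float multi-hot vector."""
    vec = torch.zeros(len(LABELS))
    for tag in tags:
        vec[label2id[tag]] = 1.0
    return vec

# Each example can carry several labels at once.
targets = torch.stack([
    multi_hot(["politics", "economy"]),  # -> [1., 0., 1., 0.]
    multi_hot(["sports"]),               # -> [0., 1., 0., 0.]
])

# Toy logits standing in for the classifier head's output;
# BCEWithLogitsLoss applies a sigmoid per label internally.
logits = torch.randn(2, len(LABELS))
loss = torch.nn.BCEWithLogitsLoss()(logits, targets)
```

If you use a recent version of `transformers`, passing `problem_type="multi_label_classification"` to the model config should make `BertForSequenceClassification` pick this loss for you, so you only need to supply the multi-hot `labels` tensor.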