## Overview

- This is the code for the EmoRecCom #1 solution (TensorFlow 2.0 version)
- For usage of this code, please follow the instructions here
- The ensemble models (TF + PyTorch) achieved 0.685 on the private leaderboard [paper]

*Track 4 private leaderboard*

## Data preparation

### Competition data

- The data folder is organized as presented here; you can also edit this file to adapt it to your working directory (not recommended). Instead, the data can be downloaded directly from Drive by running `setup.sh`

- The default data directory layout is as follows:

```
├── private_test
│   ├── images
│   ├── readme.md
│   ├── results.csv
│   └── transcriptions.json
├── public_test
│   ├── images
│   ├── results.csv
│   └── transcriptions.json
└── public_train
    ├── additional_infor:train_emotion_polarity.csv
    ├── images
    ├── readme.md
    ├── train_5_folds.csv
    ├── train_emotion_labels.csv
    └── train_transcriptions.json
```
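Once `setup.sh` has populated this layout, the training files can be inspected along the following lines. This is only a sketch: the exact column names and the JSON schema of the transcriptions are assumptions, so adjust to the real files.

```python
# Minimal sketch of loading the competition files after running setup.sh.
# Column names and the transcription JSON schema are assumed, not guaranteed.
import json
import pandas as pd

train_dir = "data/public_train"

# Per-image emotion labels (one column per target emotion is assumed)
labels = pd.read_csv(f"{train_dir}/train_emotion_labels.csv")

# Dialogue/narration transcriptions (schema assumed)
with open(f"{train_dir}/train_transcriptions.json", encoding="utf-8") as f:
    transcriptions = json.load(f)

print(labels.head())
print(len(transcriptions), "transcribed entries")
```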

### Additional data (optional)

- If you want to train a model with static word embeddings (word2vec, GloVe, fastText, etc.), download them by uncommenting the desired pretrained models in `setup.sh`. By default, static word embeddings are not used in our approach
- The provided static embedding models are stored as pickle files for easy loading; refer to `prepare_data.sh` for more details
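For illustration, a pickled embedding can be loaded and turned into an embedding matrix roughly as follows. This is a sketch assuming the pickle maps token to vector; the helper function and vocabulary format are ours, not part of the repo.

```python
# Sketch only: assumes the pickle is a dict mapping token -> embedding vector,
# e.g. embeddings/glove.840B.300d.pkl produced by prepare_data.sh.
import pickle
import numpy as np

with open("embeddings/glove.840B.300d.pkl", "rb") as f:
    word2vec = pickle.load(f)  # assumed: dict[str, np.ndarray]

def build_embedding_matrix(vocab, dim=300):
    """vocab is assumed to map token -> integer index; unknown tokens stay zero."""
    matrix = np.zeros((len(vocab) + 1, dim), dtype=np.float32)  # index 0 reserved for padding
    for token, index in vocab.items():
        vector = word2vec.get(token)
        if vector is not None:
            matrix[index] = vector
    return matrix
```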

## Prerequisites

- tensorflow
- numpy
- pandas
- sklearn
- transformers
- efficientnet

Running `setup.sh` also installs these dependencies.

## Train & inference

- Example bash scripts for training and inference are `train.sh` and `infer.sh`

### Train example

```bash
python src/main.py \
    --train_dir data/public_train \
    --target_cols angry disgust fear happy sad surprise neutral other \
    --gpus 0 1 2 \
    --image_model efn-b2 \
    --bert_model roberta-base \
    --word_embedding embeddings/glove.840B.300d.pkl \
    --max_vocab 30000 \
    --image_size 256 \
    --max_word 36 \
    --max_len 48 \
    --text_separator " " \
    --n_hiddens -1 \
    --lr 0.00003 \
    --n_epochs 5 \
    --seed 1710 \
    --do_train \
    --lower
```

### Inference example

```bash
python src/main.py \
    --test_dir data/private_test \
    --target_cols angry disgust fear happy sad surprise neutral other \
    --gpus 1 \
    --ckpt_dir outputs/efn-b2_256_roberta-base_48_-1_0.1/ \
    --do_infer
```
- In addition, we perform stacking with Logistic Regression, which requires out-of-fold predictions along with the test predictions (see the sketch below)
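A minimal stacking sketch is shown below. The `.npy` file names and array shapes are hypothetical placeholders for wherever your runs save the out-of-fold and test predictions; only the overall procedure (one Logistic Regression meta-classifier per emotion) follows the description above.

```python
# Sketch only: file names and shapes are assumptions, not repo outputs.
import numpy as np
from sklearn.linear_model import LogisticRegression

oof_preds = np.load("outputs/oof_preds.npy")    # (n_train, n_features): base-model OOF probabilities
test_preds = np.load("outputs/test_preds.npy")  # (n_test, n_features): same feature order as OOF
labels = np.load("outputs/train_labels.npy")    # (n_train, n_emotions): binary targets

stacked = np.zeros((test_preds.shape[0], labels.shape[1]))
for k in range(labels.shape[1]):                # one binary meta-classifier per emotion
    meta = LogisticRegression(max_iter=1000)
    meta.fit(oof_preds, labels[:, k])
    stacked[:, k] = meta.predict_proba(test_preds)[:, 1]
```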

## Outputs

Reproducing:


## Model architecture
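As a rough illustration of the idea (not the exact competition model): an EfficientNet-B2 image encoder and a RoBERTa text encoder are fused into an 8-way sigmoid head, matching the `--image_model efn-b2` and `--bert_model roberta-base` flags above. The pooling, dropout rate, and fusion details below are assumptions.

```python
# Sketch of a multimodal TF 2.x model: EfficientNet image branch + RoBERTa text branch.
# Not the exact competition architecture; layer choices are assumptions.
import tensorflow as tf
import efficientnet.tfkeras as efn
from transformers import TFRobertaModel

def build_model(image_size=256, max_len=48, n_labels=8):
    # Image branch: EfficientNet-B2 backbone with global average pooling
    image_in = tf.keras.Input((image_size, image_size, 3), name="image")
    cnn = efn.EfficientNetB2(include_top=False, weights="imagenet", pooling="avg")
    image_feat = cnn(image_in)

    # Text branch: RoBERTa encoder, using the first-token representation
    ids_in = tf.keras.Input((max_len,), dtype=tf.int32, name="input_ids")
    mask_in = tf.keras.Input((max_len,), dtype=tf.int32, name="attention_mask")
    roberta = TFRobertaModel.from_pretrained("roberta-base")
    text_feat = roberta(ids_in, attention_mask=mask_in)[0][:, 0, :]

    # Fusion: concatenate both modalities and predict the 8 emotions independently
    fused = tf.keras.layers.Concatenate()([image_feat, text_feat])
    fused = tf.keras.layers.Dropout(0.1)(fused)
    out = tf.keras.layers.Dense(n_labels, activation="sigmoid")(fused)
    return tf.keras.Model([image_in, ids_in, mask_in], out)
```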