
Question about Reproducibility #2

Open
wjwangppt opened this issue Jun 26, 2021 · 5 comments
wjwangppt commented Jun 26, 2021

Hello, I recently read your CIKM 2020 paper and the code; it is interesting work.
Congratulations!
However, I have some problems with reproducibility.
I tried to reproduce the results using the command you provide, `python -m rlctr.main --dataset Cora --layers_of_child_model 3 --shared_initial_step 10 --shared_params True`, but the results are far below the ones reported in the paper.
The picture below shows the final architecture found with the best hyperparameters and retrained five times; the average test accuracy over these five runs is 84.624, while the paper reports 88.95.
(screenshot: final architecture and the five retrain test accuracies)
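For clarity, the reported 84.624 is just the arithmetic mean over the five retrain runs. A minimal sketch of that computation (the accuracy values here are illustrative placeholders chosen to match the reported mean, not the actual run logs):

```python
from statistics import mean, stdev

# Placeholder test accuracies from five retrains of the searched
# architecture (illustrative only -- real values are in the screenshot).
test_accs = [84.3, 84.9, 84.5, 84.7, 84.72]

print(f"mean = {mean(test_accs):.3f}, std = {stdev(test_accs):.3f}")
```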

@wei-ln
Collaborator

wei-ln commented Jun 28, 2021

Hi, we provide an example of how to use this code in the introduction. The settings and reproduced results are shown below:

# SNAG-WS on the Cora dataset
python -u -m rlctr.main --dataset Cora --shared_params True \
    --hyper_eval_inters 15 --layers_of_child_model 3 --shared_initial_step 10 \
    --random_seed 333 --train_epochs 500 --epochs 800 --early_stop_epoch 800 \
    --gnn_hidden 64 --weight_decay 0.001 --in_drop 0 --cos_lr True

(screenshot of the reproduced results)

@wjwangppt
Author


Thanks for your response! I have two more questions.
1) Do the other two datasets, Citeseer and Pubmed, share the same parameter settings? If not, could you give their settings?
2) I noticed that the random_seed you set is 333, which implies a 60%/20%/20% data split. Do the Citeseer and Pubmed datasets also use the same random seed?
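To make sure I understand the seeded split correctly, here is a minimal sketch of a deterministic 60%/20%/20% node split driven by the seed. This is an assumption about how the masks are derived; `make_split` is a hypothetical helper for illustration, not a function from rlctr:

```python
import numpy as np

def make_split(num_nodes, seed=333, train_frac=0.6, val_frac=0.2):
    # Shuffle node indices deterministically with the given seed,
    # then cut into 60% train / 20% validation / 20% test.
    rng = np.random.RandomState(seed)
    idx = rng.permutation(num_nodes)
    n_train = int(train_frac * num_nodes)
    n_val = int(val_frac * num_nodes)
    return idx[:n_train], idx[n_train:n_train + n_val], idx[n_train + n_val:]

# Cora has 2708 nodes; the same seed always yields the same split.
train_idx, val_idx, test_idx = make_split(2708, seed=333)
```

With a fixed seed, rerunning this gives identical index sets, which is what makes the reported splits reproducible across methods.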

@wei-ln
Collaborator

wei-ln commented Jul 1, 2021

A1:

# Citeseer dataset
python -u -m rlctr.main --dataset Citeseer --shared_params True \
    --hyper_eval_inters 15 --layers_of_child_model 3 --shared_initial_step 10 \
    --random_seed 333 --train_epochs 500 --epochs 600 --early_stop_epoch 600 \
    --gnn_hidden 64 --in_drop 0

# Pubmed dataset
python -u -m rlctr.main --dataset Pubmed --shared_params True \
    --hyper_eval_inters 15 --layers_of_child_model 3 --shared_initial_step 10 \
    --random_seed 333 --train_epochs 500 --epochs 600 --early_stop_epoch 600 \
    --gnn_hidden 64 --weight_decay 0 --in_drop 0 --cos_lr True

A2: All the methods use the same seed, 333.

@wjwangppt
Author


Thanks for your reply!

@PestyVesty

Hello, I was wondering if the parameter settings for the PPI dataset could also be shared?
