Hello author, thank you for your wonderful research. I have a small problem. As the results in your paper show, the Swin Transformer module has a large impact on your model's performance. However, when I tried to adopt the Swin Transformer module, I found that the code defining `self.weights` is missing from your repository. I substituted a simple linear layer, but the results were not ideal. Could you share or clarify the source code for `self.weights`? Looking forward to your reply.
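For context, here is a minimal sketch of the kind of substitution I tried, assuming a PyTorch window-attention block. The class name `SwinAttentionBlock`, the dimensions, and the use of `nn.MultiheadAttention` are all my own placeholders, not taken from your code:

```python
import torch
import torch.nn as nn


class SwinAttentionBlock(nn.Module):
    """Hypothetical stand-in for the paper's Swin attention module.

    The released code references `self.weights` without defining it, so
    here it is replaced by a plain linear projection, mirroring the
    workaround described above (which did not reproduce the reported results).
    """

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        # Placeholder for the missing `self.weights`: a simple linear layer.
        self.weights = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, tokens, dim), e.g. tokens from one attention window
        attn_out, _ = self.attn(x, x, x)
        return self.weights(attn_out)


if __name__ == "__main__":
    block = SwinAttentionBlock(dim=96)
    dummy = torch.randn(2, 49, 96)   # one 7x7 window of 96-dim tokens
    print(block(dummy).shape)        # torch.Size([2, 49, 96])
```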
Thank you very much for your kind words and for your interest in our research!
I understand the importance of having all components of the model to replicate the results accurately. We have recently updated the repository, and it now includes a fully revised version of the code along with all the necessary model weights, including those related to the Swin Attention module.
Please pull the latest version of the repository, and you should be all set to replicate our findings. If you have any further questions or run into any other issues, don’t hesitate to reach out—we’re here to support you.
Thank you again for your patience and for your interest in advancing this research. We’re excited to see where your work takes you!