I read the paper, and it gave a clear picture of how GFNet tackles the computational cost and complexity of using self-attention in a Vision Transformer. I would like to take this a step further and use it for a segmentation task, with the Global Filter acting as the attention mechanism in a UNet model, similar to an Attention Gate.

Algorithm pseudocode

Inputs: $F_g$ - input feature dimension of the global filter, $F_l$ - input feature dimension of the local filter, $F_{int}$ - intermediate feature dimension, $dim$ - spatial dimensions

Outputs: filtered output, gate frequencies, weights frequencies

Function: AttentionFilter($g$, $x$)

Pseudocode:

Return:

My question is: since this means the network keeps learning on frequencies, similar to a complex-valued neural network, what is your opinion on the algorithm I provided? Is there anything I misunderstood or got wrong?
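A minimal PyTorch sketch of how the signature above could be read (my own reconstruction, not the actual pseudocode body, which is not reproduced here): it assumes $g$ and $x$ share the spatial size given by $dim = (H, W)$, that $F_g$/$F_l$ are their channel counts, and that the gate is built as in Attention Gate but with the intermediate branch replaced by a learnable global filter in frequency space. The 1x1 projections `W_g`, `W_x`, `psi` and the weight initialization are my choices; only the inputs, the function signature, and the three outputs come from the description above.

```python
import torch
import torch.nn as nn
import torch.fft


class AttentionFilter(nn.Module):
    def __init__(self, F_g, F_l, F_int, dim):
        super().__init__()
        H, W = dim
        self.W_g = nn.Conv2d(F_g, F_int, kernel_size=1, bias=False)   # project gating signal
        self.W_x = nn.Conv2d(F_l, F_int, kernel_size=1, bias=False)   # project local/skip features
        # Learnable complex filter over the rfft2 spectrum, one weight per channel and frequency bin.
        self.weight = nn.Parameter(torch.randn(F_int, H, W // 2 + 1, 2) * 0.02)
        self.psi = nn.Conv2d(F_int, 1, kernel_size=1)                  # collapse to a 1-channel gate

    def forward(self, g, x):
        B, _, H, W = x.shape
        h = self.W_g(g) + self.W_x(x)
        gate_freq = torch.fft.rfft2(h, dim=(-2, -1), norm='ortho')
        gate_freq = gate_freq * torch.view_as_complex(self.weight)     # global filtering = pointwise product
        h = torch.fft.irfft2(gate_freq, s=(H, W), dim=(-2, -1), norm='ortho')
        # Softmax over spatial positions, as in my first attempt; the replies below
        # suggest a sigmoid (as in Attention Gate) instead.
        alpha = torch.softmax(self.psi(h).flatten(2), dim=-1).reshape(B, 1, H, W)
        return x * alpha, gate_freq, self.weight                       # filtered output, gate freqs, weight freqs
```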
I think it might be interesting to try GFNet in UNet models. One of the most straightforward ideas is to directly replace (some of) the spatial convs in UNet with our global filters. I am not familiar with Attention Gate, but it seems a bit strange to apply softmax on frequency-domain features. Maybe a sigmoid/tanh gate, as in Attention Gate, would be a better solution.
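For reference, a minimal sketch of what "directly replacing a spatial conv with a global filter" could look like on channel-first `(B, C, H, W)` UNet feature maps of a fixed size. This layout is an adaptation on my part, not the layer from the GFNet repo itself; the module name, sizes, and the example block at the end are illustrative.

```python
import torch
import torch.nn as nn
import torch.fft


class GlobalFilter2d(nn.Module):
    def __init__(self, channels, h, w):
        super().__init__()
        # One learnable complex weight per channel and rfft2 frequency bin.
        self.weight = nn.Parameter(torch.randn(channels, h, w // 2 + 1, 2) * 0.02)

    def forward(self, x):
        B, C, H, W = x.shape
        X = torch.fft.rfft2(x, dim=(-2, -1), norm='ortho')
        X = X * torch.view_as_complex(self.weight)                     # filter in the frequency domain
        return torch.fft.irfft2(X, s=(H, W), dim=(-2, -1), norm='ortho')


# e.g. swapping the second 3x3 conv of a 64-channel encoder block operating on 128x128 maps:
# block = nn.Sequential(nn.Conv2d(64, 64, 3, padding=1), nn.GELU(), GlobalFilter2d(64, 128, 128))
```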
Thank you so much for your opinion. I had already tried that, and as you said, I got strange behaviour from the model: the predictions are roughly right, but some features get lost somewhere, which I think comes from the softmax being applied over a spatial dimension. I will try sigmoid instead.