Hi, have you used Global Context blocks or any self-attention blocks in any of the networks? Any idea why people aren't using self-attention blocks more?
Replies: 2 comments
@sowmen I had some work-in-progress impl w/ GCNet and GENet (from the SE author) back in the fall but got sidetracked by a bunch of other things, and it was low priority since nobody seemed to be using them. I don't actually remember how close I was to getting them up and running. I do have quite a few models using channel attn layers like SE and ECA. I've recently implemented a bunch of more recent self-attn type layers w/ corresponding models: BotNet, HaloNet, LambdaLayers. I have a recent model class that's designed to make it easy to experiment with SA layers: https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/byoanet.py
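For anyone unfamiliar with the channel attention layers mentioned above, here is a minimal pure-Python sketch of the squeeze-and-excitation (SE) idea: globally pool each channel, pass the pooled vector through a small bottleneck MLP with a sigmoid, then rescale each channel by the resulting gate. This is just an illustration of the concept, not timm's actual implementation (timm's SE layer is a PyTorch module with learned conv/linear weights); the function name and list-based tensor layout here are made up for clarity.

```python
import math

def se_block(x, w1, b1, w2, b2):
    """Illustrative squeeze-and-excitation on a [C][H][W] nested-list
    feature map. w1: [C_mid][C] and w2: [C][C_mid] are the bottleneck
    weights (hypothetical hand-rolled stand-ins for learned layers)."""
    C = len(x)
    # Squeeze: global average pool, one scalar per channel
    s = [sum(sum(row) for row in ch) / (len(ch) * len(ch[0])) for ch in x]
    # Excitation: FC -> ReLU -> FC -> sigmoid, producing a gate per channel
    h = [max(0.0, sum(w1[i][c] * s[c] for c in range(C)) + b1[i])
         for i in range(len(w1))]
    g = [1.0 / (1.0 + math.exp(-(sum(w2[c][i] * h[i]
         for i in range(len(h))) + b2[c]))) for c in range(C)]
    # Scale: reweight every spatial position of each channel by its gate
    return [[[v * g[c] for v in row] for row in x[c]] for c in range(C)]
```

With zero weights the gates are sigmoid(0) = 0.5, so every channel is simply halved; with trained weights the gates learn to emphasize informative channels and suppress the rest, which is why SE adds very few parameters relative to the accuracy it buys.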
@sowmen got stuck on another task, so I did some house cleaning and finished off those layers in #668; needs lots of testing...