Commit bae949f

seefun authored and rwightman committed
fix attention_bias_cache in tinyvit
1 parent 0198a20 commit bae949f

File tree

1 file changed (+1, -0 lines)


timm/models/tiny_vit.py

Lines changed: 1 addition & 0 deletions
@@ -206,6 +206,7 @@ def __init__(self, dim, key_dim, num_heads=8, attn_ratio=4, resolution=(14, 14))
             idxs.append(attention_offsets[offset])
         self.attention_biases = torch.nn.Parameter(torch.zeros(num_heads, len(attention_offsets)))
         self.register_buffer('attention_bias_idxs', torch.LongTensor(idxs).view(N, N), persistent=False)
+        self.attention_bias_cache = {}

     @torch.no_grad()
     def train(self, mode=True):
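The one-line fix initializes `self.attention_bias_cache` in `__init__`; without it, code that later reads or clears the cache (such as the `train()` override the diff's context hints at) would raise `AttributeError`. A minimal pure-Python sketch of the lazy-cache pattern involved, with a hypothetical simplified class (the real TinyViT module caches torch tensors keyed by device and clears the cache when entering train mode, since biases are recomputed from the trainable parameter there):

```python
class AttentionBiasCacheDemo:
    """Hypothetical sketch of a TinyViT-style attention-bias cache.

    Real code stores torch tensors built from a trainable Parameter;
    plain lists stand in here to keep the sketch dependency-free.
    """

    def __init__(self):
        self.attention_biases = [0.1, 0.2, 0.3]  # stand-in for nn.Parameter
        self.attention_bias_cache = {}           # the line the commit adds

    def get_attention_biases(self, device):
        # Eval-time path: build the bias table once per device, then reuse it.
        if device not in self.attention_bias_cache:
            self.attention_bias_cache[device] = list(self.attention_biases)
        return self.attention_bias_cache[device]

    def train(self, mode=True):
        # Entering train mode invalidates cached biases: the underlying
        # parameter may change, so stale cached copies must be dropped.
        if mode and self.attention_bias_cache:
            self.attention_bias_cache = {}
        return self


m = AttentionBiasCacheDemo()
first = m.get_attention_biases("cpu")
assert first is m.get_attention_biases("cpu")  # second call hits the cache
m.train(True)
assert m.attention_bias_cache == {}            # cache cleared for training
```

Deleting the `__init__` assignment (the bug this commit fixes) would make both `get_attention_biases` and `train` fail on first use, since neither can assume the attribute exists.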

0 commit comments
