
Commit 0b64117

Take no_emb_class into account when calling resize_pos_embed
1 parent d7b55a9 commit 0b64117

File tree

1 file changed: +1 −1 lines changed


timm/models/vision_transformer.py

Lines changed: 1 addition & 1 deletion
@@ -644,7 +644,7 @@ def checkpoint_filter_fn(state_dict, model, adapt_layer_scale=False):
             v = resize_pos_embed(
                 v,
                 model.pos_embed,
-                getattr(model, 'num_prefix_tokens', 1),
+                0 if getattr(model, 'no_embed_class') else getattr(model, 'num_prefix_tokens', 1),
                 model.patch_embed.grid_size
             )
         elif adapt_layer_scale and 'gamma_' in k:
