Conversation

@RishabSA RishabSA commented Jan 5, 2026

This PR resolves the deprecation warning emitted by Hugging Face transformers for the torch_dtype argument when loading a model. It replaces torch_dtype with dtype wherever a model is loaded from transformers. torch_dtype is still accepted in kwargs, since it is automatically mapped to dtype when loading through TransformerLens, so existing callers are not broken.

Fixes #1093

Before:

hf_model = AutoModelForCausalLM.from_pretrained(
    official_model_name,
    torch_dtype=dtype,
    token=huggingface_token if len(huggingface_token) > 0 else None,
    **kwargs,
)

After:

hf_model = AutoModelForCausalLM.from_pretrained(
    official_model_name,
    dtype=dtype,
    token=huggingface_token if len(huggingface_token) > 0 else None,
    **kwargs,
)
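The backward-compatibility behavior described above (torch_dtype left in kwargs and mapped onto dtype) can be sketched as follows. normalize_dtype_kwargs is a hypothetical helper written for illustration, not code from this PR:

```python
def normalize_dtype_kwargs(kwargs):
    """Map the deprecated torch_dtype keyword onto dtype.

    Hypothetical helper sketching the mapping this PR relies on:
    an explicit dtype always wins; otherwise a legacy torch_dtype
    is promoted to dtype and removed from the kwargs.
    """
    kwargs = dict(kwargs)  # copy so the caller's dict is not mutated
    if "torch_dtype" in kwargs:
        torch_dtype = kwargs.pop("torch_dtype")
        # Only fall back to torch_dtype when dtype was not given explicitly.
        kwargs.setdefault("dtype", torch_dtype)
    return kwargs
```

With this shape, a call such as from_pretrained(name, **normalize_dtype_kwargs(kwargs)) passes only the new dtype keyword to transformers, whether the caller used the old or new argument name.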

Type of change


  • Bug fix (non-breaking change which fixes an issue)

Checklist:

  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes
  • I have not rewritten tests relating to key interfaces which would affect backward compatibility

@RishabSA RishabSA closed this Jan 5, 2026
@RishabSA RishabSA deleted the dtype_fix branch January 5, 2026 22:56
