Conversation

@rrutmann (Collaborator) commented Dec 10, 2025

What does this PR do?

This PR gives each pipeline-parallel (pp) rank a unique model seed, so that parameters are initialized differently across ranks.

General Changes

  • On each rank, add the pp rank to the model seed (see the sketch below).

Breaking Changes

  • None

Checklist before submitting final PR

  • My PR is minimal and addresses one issue in isolation
  • I have merged the latest version of the target branch into this feature branch
  • I have reviewed my own code w.r.t. correct implementation, missing type hints, proper documentation, etc.
  • I have run a sample config for model training
  • I have checked that all tests run through (python tests/tests.py)
  • I have updated the internal changelog (CHANGELOG_DEV.md)

@BlueCrescent (Collaborator) left a comment

Overall LGTM.
Should we also explicitly allow seeding for the "model_initialized" component?
It will probably inherit the random state from the model_raw component, but it seems risky to assume that no other interaction with the random state happens between these two components, now or in the future (though probably only interactions that are asymmetric across ranks would be problematic). In particular, since we cannot guarantee the order in which the components are built, something like a dataloader component might even re-seed the random state.

@rrutmann rrutmann requested a review from le1nux December 19, 2025 10:25
@rrutmann rrutmann self-assigned this Dec 19, 2025
@le1nux (Member) left a comment

I checked the seeding (not the test), and from my understanding the changes do not produce the expected result (which is also what @BlueCrescent was hinting at).

When we seed the raw model, the model weights are indeed deterministic at instantiation time. However, weight initialization runs afterwards and simply overwrites those weights, discarding the effect of the seeding.

Additionally, passing device_mesh to the model couples two components that should normally not know anything about each other.

I think we have to integrate the seeding into the weight initializer component, where we can also pass in the device_mesh.

@rrutmann (Collaborator, Author) commented

> I checked the seeding (not the test), and from my understanding the changes do not produce the expected result (which is also what @BlueCrescent was hinting at).
>
> When we seed the raw model, the model weights are indeed deterministic at instantiation time. However, weight initialization runs afterwards and simply overwrites those weights, discarding the effect of the seeding.
>
> Additionally, passing device_mesh to the model couples two components that should normally not know anything about each other.
>
> I think we have to integrate the seeding into the weight initializer component, where we can also pass in the device_mesh.

Yes, that makes sense. I moved the seeding to the model initialization component.

@rrutmann (Collaborator, Author) commented

> Overall LGTM. Should we also explicitly allow seeding for the "model_initialized" component? It will probably inherit the random state from the model_raw component, but it seems risky to assume that no other interaction with the random state happens between these two components, now or in the future (though probably only interactions that are asymmetric across ranks would be problematic). In particular, since we cannot guarantee the order in which the components are built, something like a dataloader component might even re-seed the random state.

See #426 (comment)

@BlueCrescent (Collaborator) left a comment

LGTM
