Replace ModuleSpec with Protocols for inputs to MLP #3084

Merged
chtruong814 merged 7 commits into NVIDIA:main from nschank:mlp
Feb 10, 2026

Conversation

@nschank
Contributor

@nschank nschank commented Jan 26, 2026

What does this PR do ?

Introduces Protocols for the submodule inputs to MLP and friends.

Associated design doc: Typed ModuleSpec.pdf

I split up the submodule definitions for MLP from TEGroupedMLP because they have totally different APIs.

Note that this also fixes two silent bugs in the Kitchen layers.
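The PR's actual Protocol definitions live in megatron/core; as a self-contained sketch of the idea (all class and method names below are hypothetical, not Megatron-LM's), replacing an untyped ModuleSpec with a `typing.Protocol` for MLP's submodule inputs looks roughly like this:

```python
from typing import Protocol, runtime_checkable


@runtime_checkable
class LinearSubmodule(Protocol):
    """Hypothetical interface that MLP expects of its linear submodules.

    With a ModuleSpec, any module class could be passed in and an API
    mismatch would only surface at runtime; with a Protocol, the type
    checker verifies the submodule's signature up front.
    """

    def forward(self, x: list[float]) -> list[float]: ...


class PlainLinear:
    """A toy submodule that structurally satisfies LinearSubmodule."""

    def __init__(self, scale: float) -> None:
        self.scale = scale

    def forward(self, x: list[float]) -> list[float]:
        return [self.scale * v for v in x]


class MLP:
    """Toy MLP: the constructor is typed against the Protocol, not a spec."""

    def __init__(self, fc1: LinearSubmodule, fc2: LinearSubmodule) -> None:
        self.fc1, self.fc2 = fc1, fc2

    def forward(self, x: list[float]) -> list[float]:
        return self.fc2.forward(self.fc1.forward(x))


mlp = MLP(PlainLinear(2.0), PlainLinear(0.5))
print(mlp.forward([1.0, 2.0]))  # [1.0, 2.0]
# Structural check: PlainLinear never subclasses LinearSubmodule.
print(isinstance(PlainLinear(1.0), LinearSubmodule))  # True
```

Because Protocols are structural, existing submodules need no inheritance changes; splitting the MLP and TEGroupedMLP protocols then lets the type checker catch the kind of API mismatch the PR description mentions.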

Contribution process

```mermaid
flowchart LR
    A[Pre-checks] --> B[PR Tests]
    subgraph Code Review/Approval
        C1[Expert Review] --> C2[Final Review]
    end
    B --> C1
    C2 --> D[Merge]
```

Pre-checks

  • I want this PR in a versioned release and have added the appropriate Milestone (e.g., Core 0.8)
  • I have added relevant unit tests
  • I have added relevant functional tests
  • I have added proper typing to my code (see the Typing guidelines)
  • I have added relevant documentation
  • I have run the autoformatter.sh on my PR

Code review

The following process is enforced via the CODEOWNERS file for changes into megatron/core. For changes outside of megatron/core, it is up to the PR author whether or not to tag the Final Reviewer team.

For MRs into `main` branch

Feel free to message or tag @mcore-oncall in a comment to help accelerate your merge into main. The less complex your PR is, the faster it will be approved and merged!

(Step 1): Add PR label Expert Review

(Step 2): Collect the expert reviewers reviews

  1. Attach the Expert Review label when your PR is ready for review.
  2. GitHub auto-assigns expert reviewers based on your changes. They will get notified and pick up your PR soon.

⚠️ Only proceed to the next step once all reviewers have approved, merge conflicts are resolved, and the CI is passing.
Final Review might get declined if these requirements are not fulfilled.

(Step 3): Final Review

  1. Add Final Review label
  2. GitHub auto-assigns final reviewers based on your changes. They will get notified and pick up your PR soon.

(Optional Step 4): Cherry-pick into release branch

If this PR also needs to be merged into core_r* release branches, after this PR has been merged, select Cherry-pick to open a new PR into the release branch.

For MRs into `dev` branch The proposed review process for `dev` branch is under active discussion.

MRs are mergeable after one approval by either eharper@nvidia.com or zijiey@nvidia.com.

Merging your PR

Any member of core-adlr and core-nemo will be able to merge your PR.

@nschank nschank requested review from a team as code owners January 26, 2026 23:13
@copy-pr-bot

copy-pr-bot bot commented Jan 26, 2026

This pull request requires additional validation before any workflows can run on NVIDIA's runners.

Pull request vetters can view their responsibilities here.

Contributors can view more details about this message here.

@ko3n1g ko3n1g requested a review from a team January 26, 2026 23:14
@maanug-nv maanug-nv added the enhancement, Expert Review, and complexity: medium labels Jan 27, 2026
@maanug-nv
Contributor

/ok to test 51232dd

Contributor

@yashaswikarnati yashaswikarnati left a comment

lgtm; thanks for catching a few silent bugs in Kitchen layers :). Just have some very minor comments.

@nschank
Contributor Author

nschank commented Jan 29, 2026

I fixed the lint issues! I also realized I had accidentally used tp instead of expt_tp in Kitchen's grouped layer, so I fixed that. Finally, based on Yash's suggestion, I renamed all the GroupedMLP protocols to use the Grouped... prefix.

@nschank nschank force-pushed the mlp branch 2 times, most recently from c5f0581 to bd154b2 Compare January 30, 2026 22:02
@chtruong814 chtruong814 added the needs-follow-up Issue needs follow-up label Jan 31, 2026
@yashaswikarnati
Contributor

/ok to test bd154b2

@chtruong814 chtruong814 added the needs-follow-up Issue needs follow-up label Feb 8, 2026
auto-merge was automatically disabled February 9, 2026 00:55

Head branch was pushed to by a user without write access

@yaox12
Member

yaox12 commented Feb 9, 2026

/ok to test a7582b6

@yaox12 yaox12 enabled auto-merge February 9, 2026 01:13
@yaox12 yaox12 added this pull request to the merge queue Feb 9, 2026
@github-merge-queue github-merge-queue bot removed this pull request from the merge queue due to failed status checks Feb 9, 2026
@yaox12 yaox12 added this pull request to the merge queue Feb 10, 2026
@chtruong814 chtruong814 removed this pull request from the merge queue due to a manual request Feb 10, 2026
@chtruong814 chtruong814 added this pull request to the merge queue Feb 10, 2026
Merged via the queue into NVIDIA:main with commit 55198ba Feb 10, 2026
46 checks passed
@chtruong814 chtruong814 removed the needs-follow-up Issue needs follow-up label Feb 10, 2026
@nschank nschank deleted the mlp branch February 10, 2026 16:32
daiyaanarfeen pushed a commit to daiyaanarfeen/Megatron-LM that referenced this pull request Feb 23, 2026
BoxiangW pushed a commit to BoxiangW/Megatron-LM that referenced this pull request Mar 4, 2026

Labels

community-request, complexity: medium, enhancement, Final Review

9 participants