Remove dead code.#13251

Merged
comfyanonymous merged 1 commit into master from comfyanonymous-patch-2
Apr 2, 2026

Conversation

@comfyanonymous
Member

No description provided.

@coderabbitai

coderabbitai bot commented Apr 2, 2026

📝 Walkthrough

Walkthrough

The PR removes the clip_stats_path parameter from the CLIPEmbeddingNoiseAugmentation constructor. The conditional logic that previously loaded CLIP mean and standard-deviation tensors from a file (or fell back to zeros and ones) is replaced with unconditional initialization of these tensors to zeros and ones. The tensors are registered as non-persistent buffers named data_mean and data_std. The remaining methods are unchanged.
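The described initialization can be sketched as follows. This is a minimal, self-contained illustration of the pattern (unconditional zeros/ones registered as non-persistent buffers), not the merged code; the class name here is hypothetical.

```python
import torch
import torch.nn as nn

class ClipStatsSketch(nn.Module):
    """Hypothetical minimal module mirroring the described change."""
    def __init__(self, timestep_dim=256):
        super().__init__()
        # Unconditional initialization: zeros for the mean, ones for the std
        # (no file loading, no clip_stats_path branch).
        clip_mean, clip_std = torch.zeros(timestep_dim), torch.ones(timestep_dim)
        # Non-persistent buffers move with the module (.to/.cuda) but are
        # excluded from the state_dict, so checkpoints carry no dead weight.
        self.register_buffer("data_mean", clip_mean, persistent=False)
        self.register_buffer("data_std", clip_std, persistent=False)

m = ClipStatsSketch()
print("data_mean" in m.state_dict())  # False: non-persistent buffers are omitted
```

Making the buffers non-persistent is what lets old checkpoints (which may or may not contain these keys) load without strict-mode errors.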

🚥 Pre-merge checks | ❌ 3

❌ Failed checks (1 warning, 2 inconclusive)

| Check name | Status | Explanation | Resolution |
| --- | --- | --- | --- |
| Docstring Coverage | ⚠️ Warning | Docstring coverage is 0.00%, below the required threshold of 80.00%. | Write docstrings for the functions missing them to satisfy the coverage threshold. |
| Title check | ❓ Inconclusive | The title 'Remove dead code' is vague and generic; it doesn't convey which code is being removed or why. | Consider a more specific title such as 'Remove unused clip_stats_path parameter from CLIPEmbeddingNoiseAugmentation'. |
| Description check | ❓ Inconclusive | No description was provided by the author, so it cannot be assessed against the changeset. | Add a description explaining why the clip_stats_path parameter is no longer needed and how the default initialization handles the use case. |

✏️ Tip: You can configure your own custom pre-merge checks in the settings.


Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.


Comment @coderabbitai help to get the list of available commands and usage tips.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@comfy/ldm/modules/encoders/noise_aug_modules.py`:
- Around line 6-7: The __init__ signature removed the clip_stats_path kwarg
causing callers to break; restore backward-compatible handling in the class
__init__ in comfy/ldm/modules/encoders/noise_aug_modules.py by accepting
clip_stats_path (e.g., clip_stats_path=None) or by popping it from kwargs at the
start of the __init__ (before calling super().__init__), emit a deprecation
warning if clip_stats_path is provided, and then forward the remaining kwargs to
super().__init__ so existing callers continue to work while you transition to
timestep_dim.
🪄 Autofix (Beta)

Fix all unresolved CodeRabbit comments on this PR:

  • Push a commit to this branch (recommended)
  • Create a new PR with the fixes

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 117fcd8a-33f6-4c28-831c-336bae0c5058

📥 Commits

Reviewing files that changed from the base of the PR and between 7d43768 and 92f2783.

📒 Files selected for processing (1)
  • comfy/ldm/modules/encoders/noise_aug_modules.py

Comment on lines +6 to 7

```python
def __init__(self, *args, timestep_dim=256, **kwargs):
    super().__init__(*args, **kwargs)
```


⚠️ Potential issue | 🟠 Major

Constructor change introduces a backward-compatibility break for clip_stats_path callers.

Removing clip_stats_path from the signature means existing callers passing that kwarg now fail at runtime when **kwargs is forwarded to the parent constructor (which does not accept it). Please keep a compatibility shim and deprecate gracefully instead of hard-breaking.

Proposed compatibility-safe patch

```diff
-class CLIPEmbeddingNoiseAugmentation(ImageConcatWithNoiseAugmentation):
-    def __init__(self, *args, timestep_dim=256, **kwargs):
-        super().__init__(*args, **kwargs)
+class CLIPEmbeddingNoiseAugmentation(ImageConcatWithNoiseAugmentation):
+    def __init__(self, *args, clip_stats_path=None, timestep_dim=256, **kwargs):
+        # Backward-compat: accept legacy kwarg but no longer load external stats.
+        # (Optional: emit a deprecation warning here.)
+        super().__init__(*args, **kwargs)
         clip_mean, clip_std = torch.zeros(timestep_dim), torch.ones(timestep_dim)
```

As per coding guidelines "comfy/**: Core ML/diffusion engine. Focus on: Backward compatibility (breaking changes affect all custom nodes)".
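The deprecation-shim pattern the reviewer suggests can be demonstrated in isolation. Below is a hedged, self-contained sketch — the class names and the Parent signature are stand-ins, not the actual ComfyUI classes: the legacy kwarg is popped before forwarding so the parent never sees it, and a DeprecationWarning tells old callers to migrate.

```python
import warnings

class Parent:
    # Stand-in for the base class; it does not accept clip_stats_path.
    def __init__(self, noise_level=0):
        self.noise_level = noise_level

class Child(Parent):
    def __init__(self, *args, timestep_dim=256, **kwargs):
        # Pop the legacy kwarg before forwarding, so Parent never receives it.
        clip_stats_path = kwargs.pop("clip_stats_path", None)
        if clip_stats_path is not None:
            warnings.warn(
                "clip_stats_path is deprecated and ignored; "
                "stats now default to zeros/ones.",
                DeprecationWarning,
            )
        super().__init__(*args, **kwargs)
        self.timestep_dim = timestep_dim

# An old caller that still passes clip_stats_path keeps working:
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    c = Child(clip_stats_path="stats.pt", noise_level=3)
print(c.noise_level, len(caught))  # 3 1
```

Popping from kwargs (rather than adding a named parameter) also keeps the new signature clean while still swallowing the legacy argument.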


@comfyanonymous comfyanonymous merged commit 0c63b4f into master Apr 2, 2026
16 checks passed
@comfyanonymous comfyanonymous deleted the comfyanonymous-patch-2 branch April 2, 2026 00:22
