Conversation
📝 Walkthrough
The PR removes the …
🚥 Pre-merge checks | ❌ 3 failed checks (1 warning, 2 inconclusive)
✏️ Tip: You can configure your own custom pre-merge checks in the settings.
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@comfy/ldm/modules/encoders/noise_aug_modules.py`:
- Around lines 6-7: The __init__ signature no longer accepts the clip_stats_path
kwarg, which breaks existing callers. Restore backward-compatible handling in
the class __init__ in comfy/ldm/modules/encoders/noise_aug_modules.py by
accepting clip_stats_path (e.g., clip_stats_path=None) or by popping it from
kwargs at the start of __init__ (before calling super().__init__), emit a
deprecation warning if clip_stats_path is provided, and then forward the
remaining kwargs to super().__init__ so existing callers continue to work
while you transition to timestep_dim.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 117fcd8a-33f6-4c28-831c-336bae0c5058
📒 Files selected for processing (1)
comfy/ldm/modules/encoders/noise_aug_modules.py
def __init__(self, *args, timestep_dim=256, **kwargs):
    super().__init__(*args, **kwargs)
Constructor change introduces a backward-compatibility break for clip_stats_path callers.
Removing clip_stats_path from the signature means existing callers passing that kwarg now fail at runtime when **kwargs is forwarded to the parent constructor (which does not accept it). Please keep a compatibility shim and deprecate gracefully instead of hard-breaking.
Proposed compatibility-safe patch:

-class CLIPEmbeddingNoiseAugmentation(ImageConcatWithNoiseAugmentation):
-    def __init__(self, *args, timestep_dim=256, **kwargs):
-        super().__init__(*args, **kwargs)
+class CLIPEmbeddingNoiseAugmentation(ImageConcatWithNoiseAugmentation):
+    def __init__(self, *args, clip_stats_path=None, timestep_dim=256, **kwargs):
+        # Backward-compat: accept legacy kwarg but no longer load external stats.
+        # (Optional: emit a deprecation warning here.)
+        super().__init__(*args, **kwargs)
         clip_mean, clip_std = torch.zeros(timestep_dim), torch.ones(timestep_dim)

As per coding guidelines: "comfy/**: Core ML/diffusion engine. Focus on: Backward compatibility (breaking changes affect all custom nodes)".
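To make the suggested shim concrete, here is a minimal runnable sketch of the deprecation pattern the review describes. The parent class below is a hypothetical stand-in (the real `ImageConcatWithNoiseAugmentation` lives in comfy's codebase), and the tensor setup is stubbed out so the example stays self-contained:

```python
import warnings


class ImageConcatWithNoiseAugmentation:
    """Hypothetical minimal stand-in for the real parent class."""
    def __init__(self, *args, **kwargs):
        pass


class CLIPEmbeddingNoiseAugmentation(ImageConcatWithNoiseAugmentation):
    def __init__(self, *args, clip_stats_path=None, timestep_dim=256, **kwargs):
        # Backward-compat shim: accept the legacy kwarg, warn, and ignore it
        # before forwarding the remaining kwargs to the parent constructor.
        if clip_stats_path is not None:
            warnings.warn(
                "clip_stats_path is deprecated and ignored; buffers are now "
                "sized from timestep_dim.",
                DeprecationWarning,
                stacklevel=2,
            )
        super().__init__(*args, **kwargs)
        # The real code builds zero-mean / unit-std tensors of this size here.
        self.timestep_dim = timestep_dim


# A legacy caller now gets a DeprecationWarning instead of a TypeError.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    aug = CLIPEmbeddingNoiseAugmentation(clip_stats_path="old/stats.ckpt")
print(len(caught), aug.timestep_dim)
```

Popping the kwarg before `super().__init__` (rather than naming it in the signature) would work equally well; the key point is that the legacy keyword never reaches a parent constructor that no longer accepts it.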
No description provided.