Provider-managed recurring schedules and admin finance tooling #397

Connorbelez merged 5 commits into main
Conversation
📝 Walkthrough

Introduces provider-managed recurring collection schedules (Rotessa integration), centralizes auth policy/permission logic, extends schema for external schedules and execution modes, adds polling/ingestion/activation flows and admin payment/ledger UIs, and includes extensive tests, demos, and documentation.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User as Admin
    participant Frontend as Frontend UI
    participant Convex as Convex Backend
    participant RotessaAPI as Rotessa API
    participant DB as Database
    User->>Frontend: request activateRecurringSchedule(...)
    Frontend->>Convex: activateRecurringSchedule(input)
    Convex->>Convex: loadActivationSnapshot
    Convex->>DB: validate mortgage, bank account, plan entries
    Convex->>Convex: beginRecurringScheduleActivation (idempotency)
    alt provider schedule required
        Convex->>RotessaAPI: createTransactionSchedule(...)
        RotessaAPI-->>Convex: externalScheduleRef, status
        Convex->>Convex: recordRecurringScheduleProviderActivation
    end
    Convex->>Convex: commitRecurringScheduleActivation (bind entries)
    Convex->>DB: patch plan entries & mortgage (provider_scheduled)
    Convex-->>Frontend: return success
```

```mermaid
sequenceDiagram
    participant Cron as Cron Scheduler
    participant Poller as pollProviderManagedSchedules
    participant DB as Database
    participant Provider as Rotessa API
    participant Ingest as ingestExternalOccurrenceEvent
    Cron->>Poller: trigger (15m)
    Poller->>DB: listSchedulesEligibleForPolling(asOf, limit)
    loop per schedule
        Poller->>DB: claimExternalCollectionScheduleSync(scheduleId, leaseOwner)
        alt claimed
            Poller->>Provider: pollOccurrenceUpdates(startDate, endDate, sinceCursor)
            Provider-->>Poller: events[], nextCursor
            loop each event
                Poller->>Ingest: ingestExternalOccurrenceEvent(event)
                Ingest->>DB: resolve planEntry (backlinks/ref/ordinal/date)
                alt matched
                    Ingest->>DB: create/sync attempt & transfer, apply transitions
                    Ingest-->>Poller: materialized
                else
                    Ingest->>DB: audit unresolved occurrence
                    Ingest-->>Poller: unresolved
                end
            end
            Poller->>DB: recordExternalCollectionScheduleSyncSuccess(...)
        else
            Poller->>DB: recordExternalCollectionScheduleSyncFailure(...)
        end
    end
    Poller-->>Cron: summary counts
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~50 minutes
Pull request overview
Note: Copilot was unable to run its full agentic suite in this review.
This PR centralizes WorkOS claim parsing + permission checks behind a shared RBAC policy module, and introduces provider-managed recurring collection schedule infrastructure (Rotessa) alongside new admin navigation entries and ledger/payment-ops UI utilities.
Changes:
- Added `auth-policy` helpers and migrated frontend + Convex fluent permission checks to "effective" permission logic.
- Implemented provider-managed recurring schedules (schema, activation, poller, Rotessa adapters, webhook integration, and related tests/crons).
- Added admin navigation items and new shared financial-ledger/payment-ops UI/CSV/validation utilities + supporting docs/specs.
Reviewed changes
Copilot reviewed 66 out of 100 changed files in this pull request and generated 6 comments.
| File | Description |
|---|---|
| src/lib/auth.ts | Routes auth checks now delegate to auth-policy effective permission helpers. |
| src/lib/auth-policy.ts | New shared RBAC claim parsing/normalization + “effective” permission logic. |
| src/hooks/use-can-do.ts | Client permission hook now uses effective permission checks (org/roles-aware). |
| src/hooks/use-app-auth.ts | Normalizes roles/permissions and resolves primary role for app auth context. |
| src/components/document-engine/pdf-designer.tsx | Adds className prop and changes default height behavior. |
| src/components/demo/amps/hooks.ts | Demo access now uses shared staff-admin policy helper. |
| src/components/admin/shell/entity-registry.ts | Adds new static admin nav routes for payment ops + financial ledger. |
| src/components/admin/shell/AdminBreadcrumbs.tsx | Adds breadcrumb labels for the new admin routes. |
| src/components/admin/financial-ledger/validation.ts | New client-side CSV parsing + actual-vs-expected validation logic. |
| src/components/admin/financial-ledger/ui.tsx | New shared UI primitives for ledger/payment ops dashboards. |
| src/components/admin/financial-ledger/types.ts | New types for ledger/payment ops snapshots, tabs, and validation rows. |
| src/components/admin/financial-ledger/search.ts | Search-param parsing/cleaning for dashboard routes. |
| src/components/admin/financial-ledger/format.ts | Formatting helpers for currency/integers/dates/labels. |
| src/components/admin/financial-ledger/csv.ts | CSV parse/export helpers for accounting and validation flows. |
| src/components/admin/financial-ledger/actions.tsx | UI dialogs for waive/write-off actions via existing Convex mutations. |
| specs/15-support-provider-managed-recurring-collection-schedules/tasks.md | Task checklist for provider-managed recurring schedules workstream. |
| specs/15-support-provider-managed-recurring-collection-schedules/gap-analysis.md | Gap analysis for adapting current system to provider-managed schedules. |
| specs/15-support-provider-managed-recurring-collection-schedules/design.md | Detailed design for provider-managed recurring schedules architecture. |
| specs/15-support-provider-managed-recurring-collection-schedules/PRD.md | PRD capturing requirements + acceptance criteria for the feature. |
| reviews/04-11-provider_managed_payment_schedule/74b377d61.md | Captures prior review findings and prompts for follow-up. |
| package.json | Adds fast-check + @fast-check/vitest dev dependencies. |
| docs/technical-design/unified-payment-rails.md | Updates payment ops authorization section to align with new RBAC doc. |
| docs/superpowers/plans/2026-04-14-admin-financial-ledger-and-payment-operations.md | Implementation plan doc for the new admin dashboards. |
| docs/reviews/2026-04-08-security-rbac-review.md | Adds status note and points to canonical RBAC policy doc. |
| docs/cash-ledger-developer-guide.md | Updates RBAC notes to reference canonical RBAC doc and new permissions. |
| docs/architecture/unified-payment-rails-technical-design.md | Refreshes auth/RBAC notes and target permissions list. |
| docs/architecture/rbac-and-permissions.md | New canonical RBAC policy document. |
| convex/test/moduleMaps.ts | Registers new recurringSchedules / rotessa / demo modules for tests. |
| convex/schema.ts | Adds execution-mode fields, external schedules tables, indexes, and validators. |
| convex/payments/webhooks/rotessaPad.ts | Extends webhook handling with provider-managed schedule ingestion path. |
| convex/payments/webhooks/handleReversal.ts | Introduces PersistedTransferStatus alias usage in reversal handler. |
| convex/payments/transfers/types.ts | Adds PersistedTransferStatus (legacy "completed" tolerant). |
| convex/payments/transfers/providers/rotessa.ts | Adds Rotessa transfer provider (status only; initiate/cancel/confirm unsupported). |
| convex/payments/transfers/providers/registry.ts | Registers pad_rotessa provider in transfer provider registry. |
| convex/payments/rotessa/financialTransactions.ts | Adds normalization helpers from Rotessa transaction reports to local events/status. |
| convex/payments/rotessa/api.ts | Adds Rotessa API client with timeout + paging helpers. |
| convex/payments/recurringSchedules/validators.ts | New validators for execution modes/schedule statuses/occurrence events. |
| convex/payments/recurringSchedules/types.ts | New types for recurring schedule providers and normalized occurrence events. |
| convex/payments/recurringSchedules/queries.ts | Adds activation snapshot + schedule detail + polling eligibility queries. |
| convex/payments/recurringSchedules/providers/rotessaRecurring.ts | Rotessa schedule provider adapter + cursor handling for poller. |
| convex/payments/recurringSchedules/providers/registry.ts | Provider registry for recurring schedule providers. |
| convex/payments/recurringSchedules/poller.ts | Poller action + lease claiming + success/failure recording. |
| convex/payments/recurringSchedules/activation.ts | Activation action + two-phase schedule creation + plan entry patching. |
| convex/payments/collectionPlan/runner.ts | Adds optional mortgageId to runner to scope execution. |
| convex/payments/collectionPlan/queries.ts | Ensures due query filters to app-owned execution (incl legacy undefined). |
| convex/payments/collectionPlan/mutations.ts | Extends createEntry validator with provider-managed fields/status. |
| convex/payments/collectionPlan/initialScheduling.ts | Extends live covering statuses and create args for execution-mode linkage. |
| convex/payments/collectionPlan/executionContract.ts | Adds provider_scheduled to plan entry status union. |
| convex/payments/collectionPlan/execution.ts | Adjusts transfer creation call formatting + fixes source fields in transition audit. |
| convex/payments/collectionPlan/admin.ts | Allows provider_scheduled and refactors attempt query data fetching. |
| convex/payments/collectionPlan/tests/runner.test.ts | Adds coverage for skipping provider-managed entries + creation-time validation. |
| convex/fluent.ts | Moves claim normalization to auth-policy and uses effective permission checks. |
| convex/demo/tests/ampsExecutionModes.test.ts | New tests for app-owned vs provider-managed demo execution modes. |
| convex/demo/tests/ampsE2e.test.ts | Improves test env restore behavior for hashchain env flags. |
| convex/crons.ts | Adds provider-managed schedule polling cron. |
| CLAUDE.md | Updates auth standards to reference canonical RBAC doc + admin:access policy. |
| AGENTS.md | Updates auth standards to reference canonical RBAC doc + admin:access policy. |
| .husky/pre-commit | Makes hook NUL-safe for filenames and TanStack $ routes. |
@coderabbitai please review
✅ Actions performed: review triggered.
Actionable comments posted: 16
Note: Due to the large number of review comments, Critical and Major severity comments were prioritized as inline comments.
🟡 Minor comments (14)
src/components/document-engine/pdf-designer.tsx-138-138 (1)

138-138: ⚠️ Potential issue | 🟡 Minor — Handle empty `className` so default height is preserved.

At line 138, `className ?? "h-[700px]"` treats `""` as a valid value, which can remove the height and collapse the designer container.

Suggested fix:

```diff
- className={`overflow-hidden rounded-md border ${className ?? "h-[700px]"}`}
+ className={`overflow-hidden rounded-md border ${className?.trim() ? className : "h-[700px]"}`}
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/document-engine/pdf-designer.tsx` at line 138, the JSX currently uses `${className ?? "h-[700px]"}`, which treats an empty string as a valid value and drops the default height; change that conditional to treat empty/whitespace-only strings as missing (e.g., replace the nullish coalescing with a check like `className && className.trim() ? className : "h-[700px]"`) so the default `"h-[700px]"` is applied when className is empty or blank; update the expression where className is interpolated in the component (the template containing `overflow-hidden rounded-md border ${className ?? "h-[700px]"}`).

convex/demo/__tests__/ampsE2e.test.ts-27-49 (1)
27-49: ⚠️ Potential issue | 🟡 Minor — Enter the cleanup scope before `createHarness()`.

`createHarness()` mutates `process.env` before the `try/finally` starts. If harness construction throws, `restoreEnv()` never runs; in the timer-based tests, `vi.useFakeTimers()` is also outside cleanup and can leak mocked time into later cases. For the fake-timer tests, move `vi.useFakeTimers()` into that same `try` block just above `createHarness()`.

♻️ Cleanup pattern to apply:

```diff
-  const { restoreEnv, t } = createHarness();
-  const admin = t.withIdentity(ADMIN_IDENTITY);
-
-  try {
+  let restoreEnv = () => {};
+  try {
+    const harness = createHarness();
+    restoreEnv = harness.restoreEnv;
+    const admin = harness.t.withIdentity(ADMIN_IDENTITY);
     // ...
   } finally {
     restoreEnv();
   }
```

Also applies to: 54-55, 79-81, 148-150, 191-193
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@convex/demo/__tests__/ampsE2e.test.ts` around lines 27-49, the test mutates process.env (and calls vi.useFakeTimers() for timer tests) before the try/finally cleanup is registered, so if createHarness() throws the environment is never restored; move any process.env modifications and vi.useFakeTimers() calls into the cleanup scope so restoreEnv() always runs. Concretely, ensure createHarness() does not unconditionally mutate globals: either (A) change createHarness to only return a pure harness object and provide a separate setup function for environment toggles, or (B) at each test site call vi.useFakeTimers() and set DISABLE_GT_HASHCHAIN/DISABLE_CASH_LEDGER_HASHCHAIN inside the try block immediately before invoking createHarness(), then call restoreEnv() in the finally block; update occurrences referencing createHarness, restoreEnv, and vi.useFakeTimers at the other locations noted (lines ~54-55, 79-81, 148-150, 191-193).

convex/demo/__tests__/ampsExecutionModes.test.ts-43-55 (1)
43-55: ⚠️ Potential issue | 🟡 Minor — Environment variable cleanup may not fully restore original state.

Setting `process.env.DISABLE_GT_HASHCHAIN = undefined` doesn't remove the key from `process.env`; it sets it to the string `"undefined"`. Use `delete` to properly restore when the original was unset.

🐛 Proposed fix for proper env restoration:

```diff
 restoreEnv() {
   if (previousGtHashchain === undefined) {
-    process.env.DISABLE_GT_HASHCHAIN = undefined;
+    delete process.env.DISABLE_GT_HASHCHAIN;
   } else {
     process.env.DISABLE_GT_HASHCHAIN = previousGtHashchain;
   }
   if (previousCashLedgerHashchain === undefined) {
-    process.env.DISABLE_CASH_LEDGER_HASHCHAIN = undefined;
+    delete process.env.DISABLE_CASH_LEDGER_HASHCHAIN;
   } else {
     process.env.DISABLE_CASH_LEDGER_HASHCHAIN = previousCashLedgerHashchain;
   }
 },
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@convex/demo/__tests__/ampsExecutionModes.test.ts` around lines 43-55, the restoreEnv function currently assigns process.env.DISABLE_GT_HASHCHAIN = undefined and DISABLE_CASH_LEDGER_HASHCHAIN = undefined, which sets the string "undefined" instead of removing the env keys; update restoreEnv to use delete process.env.DISABLE_GT_HASHCHAIN and delete process.env.DISABLE_CASH_LEDGER_HASHCHAIN when previousGtHashchain or previousCashLedgerHashchain are undefined, otherwise set them back to the saved previousGtHashchain and previousCashLedgerHashchain values to fully restore the original environment.

convex/payments/recurringSchedules/occurrenceIngestion.ts-287-291 (1)
287-291: ⚠️ Potential issue | 🟡 Minor — Potential stale data return after patching existing attempt.

When an existing attempt is found via `collectionAttemptId` (lines 274-292), the function patches the DB but then returns a manually spread object that may not reflect the complete patched state. Consider re-fetching the document after the patch for consistency with the behavior elsewhere in this file.

🐛 Proposed fix to refetch after patch:

```diff
 if (existingAttempt) {
   await args.ctx.db.patch(
     existingAttempt._id,
     buildCollectionAttemptProviderMirrorPatch({
       currentReportedAt: existingAttempt.providerLastReportedAt,
       event: args.event,
       providerOccurrenceKey: occurrenceKey,
     })
   );
-  return {
-    ...existingAttempt,
-    providerOccurrenceKey: occurrenceKey,
-  } as CollectionAttemptDoc;
+  const refreshed = await args.ctx.db.get(existingAttempt._id);
+  if (!refreshed) {
+    throw new ConvexError(
+      `Collection attempt ${existingAttempt._id} disappeared while syncing provider mirror state.`
+    );
+  }
+  return refreshed;
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@convex/payments/recurringSchedules/occurrenceIngestion.ts` around lines 287-291, the code patches an existing attempt (found via collectionAttemptId / existingAttempt) and then returns a shallow spread of existingAttempt with providerOccurrenceKey, which can be stale; after performing the DB patch, re-fetch the up-to-date document (e.g., via ctx.db.get, as done elsewhere in this file) and return that fresh CollectionAttemptDoc instead of the manually spread object, so the caller always receives the fully patched state.

convex/payments/recurringSchedules/queries.ts-260-265 (1)
260-265: ⚠️ Potential issue | 🟡 Minor — Potential duplicate fetching in `extraSyncError` collection.

When `extraSyncError` is collected, it calls `collectEligibleSchedules("sync_error", limit - active.length)`, which may return items already in `reservedSyncError`. The `.slice(reservedSyncError.length)` assumes deterministic ordering, but pagination cursors reset between calls. Consider tracking already-seen IDs instead.
🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@convex/payments/recurringSchedules/queries.ts` around lines 260-265, the extraSyncError computation may re-fetch items already present in reservedSyncError because it calls collectEligibleSchedules("sync_error", limit - active.length) and then slices by reservedSyncError.length, which assumes stable ordering; instead deduplicate by tracking seen IDs: collect the results of collectEligibleSchedules, filter out any schedule IDs present in reservedSyncError (via a Set built from reservedSyncError.map(s => s.id)), and then take up to (limit - active.length) from the filtered list before assigning to extraSyncError; reference variables/functions: extraSyncError, collectEligibleSchedules, reservedSyncError, active, limit.

convex/demo/ampsExecutionModes.ts-1481-1488 (1)
1481-1488: ⚠️ Potential issue | 🟡 Minor — Don't store occurrence outcomes in `lastProviderScheduleStatus`.

Passing `Approved`/`Declined` here overwrites the schedule's lifecycle state with a single occurrence result. After the first poller advance, the workspace will report the schedule itself as approved/declined instead of still active.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@convex/demo/ampsExecutionModes.ts` around lines 1481-1488, the code is incorrectly writing an occurrence result into the schedule lifecycle by passing finalStatus into the lastProviderScheduleStatus field when calling internal.demo.ampsExecutionModes.patchDemoScheduleSyncMirrorInternal (inside the channel === "poller" branch); remove or stop setting lastProviderScheduleStatus with finalStatus and either omit that property entirely or write the occurrence result to a separate field (e.g., lastOccurrenceOutcome) so the schedule's lifecycle state is not overwritten — update the call site to pass only valid schedule lifecycle data (or no lastProviderScheduleStatus) and, if needed, add/use a distinct field for occurrence outcomes instead.

convex/payments/recurringSchedules/activation.ts-215-220 (1)
215-220: ⚠️ Potential issue | 🟡 Minor — Return `needsProviderCreate` here too.

`activateRecurringSchedule` reads `needsProviderCreate`, but this branch returns `created`. It works today only because `undefined` is falsy.

🩹 Suggested fix:

```diff
-  return { created: false as const, scheduleId: activeSchedule._id };
+  return {
+    needsProviderCreate: false as const,
+    scheduleId: activeSchedule._id,
+  };
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@convex/payments/recurringSchedules/activation.ts` around lines 215-220, the early return inside activateRecurringSchedule that checks activeSchedule.activationIdempotencyKey returns only { created: false, scheduleId } but omits needsProviderCreate; update that return to include needsProviderCreate (set to false) so callers that read needsProviderCreate get a defined boolean instead of relying on undefined/falsy values (locate the branch that compares activeSchedule.activationIdempotencyKey to args.activationIdempotencyKey and return { created: false, scheduleId: activeSchedule._id, needsProviderCreate: false }).

convex/demo/ampsExecutionModes.ts-377-380 (1)
377-380: ⚠️ Potential issue | 🟡 Minor — Fully clear execution metadata when decommissioning a workspace.

This only nulls `activeExternalCollectionScheduleId`. The abandoned demo mortgage keeps `collectionExecutionMode`/`collectionExecutionProviderCode`, so it still appears as an active app/provider-managed mortgage in mode-based queries after reseeds.

🧹 Suggested fix:

```diff
 await ctx.db.patch(workspace.mortgageId, {
+  collectionExecutionMode: undefined,
+  collectionExecutionProviderCode: undefined,
   activeExternalCollectionScheduleId: undefined,
   collectionExecutionUpdatedAt: decommissionedAt,
 });
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@convex/demo/ampsExecutionModes.ts` around lines 377-380, the patch that decommissions a workspace (ctx.db.patch on workspace.mortgageId) only clears activeExternalCollectionScheduleId and sets collectionExecutionUpdatedAt; update that patch to also clear collectionExecutionMode and collectionExecutionProviderCode (set them to undefined alongside activeExternalCollectionScheduleId) so the demo mortgage is fully cleared and won't show up in mode/provider-based queries after reseeds; keep collectionExecutionUpdatedAt set to decommissionedAt as-is.

AGENTS.md-53-55 (1)
53-55: ⚠️ Potential issue | 🟡 Minor — Align the sample admin JWT with this new rule.

These bullets say an admin token only carries `admin:access`, but the sample payload later in the same section still shows `widgets:users-table:manage`. That leaves the doc teaching two different permission models. Please update the example or mark it as a non-admin sample.

Based on learnings: `admin` gets exactly one WorkOS permission, `admin:access`.
🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@AGENTS.md` around lines 53-55, update the sample JWT payload so it matches the new RBAC rule that admin tokens only include the single WorkOS permission "admin:access": find the example payload that currently contains "widgets:users-table:manage" and either replace that permission with "admin:access" for an admin example or explicitly label the payload as a non-admin/example user token; ensure any wording referencing admin permissions (e.g., "admin:access" and function/guard names like requireFairLendAdmin) is consistent with the change.

src/components/admin/financial-ledger/payment-operations-page.tsx-262-264 (1)
262-264: ⚠️ Potential issue | 🟡 Minor — Do not collapse missing reconciliation into `unhealthy`.

Both the CSV export and the table badge treat `row.reconciliation === undefined` as `unhealthy`, but the rest of this component distinguishes missing reconciliation data from an actual failed reconciliation. That will create false exceptions in the UI and exports.

Also applies to: 728-731
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/admin/financial-ledger/payment-operations-page.tsx` around lines 262-264, the code incorrectly collapses missing reconciliation into "unhealthy" by using a truthy check; change the mapping so it distinguishes three states: explicitly check row.reconciliation === undefined and set reconciliation_status to "missing" (or "unknown"), otherwise set "healthy" when row.reconciliation.isHealthy is true and "unhealthy" when false; apply the same explicit check in the other occurrence (lines referenced around 728-731) and update any CSV export and table badge logic that reads reconciliation_status to handle the new "missing" value.

docs/architecture/state-machines.md-684-689 (1)
684-689: ⚠️ Potential issue | 🟡 Minor — Mark `hasCompleteData` as a demo-only guard pattern.

This explains where the current demo guard reads from, but it drops the important POC caveat that production flows may need to validate against event payload data instead. Without that note, this guide reads like `context.data` is the recommended production pattern.

Based on learnings: "the `hasCompleteData` XState guard in `convex/demo/machines/loanApplication.machine.ts` is explicitly a POC/demo pattern ... In production, the guard may need to read from `event` instead."

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@docs/architecture/state-machines.md` around lines 684-689, update the docs to mark the XState guard hasCompleteData as a demo/POC-only pattern: explicitly state that hasCompleteData (from convex/demo/machines/loanApplication.machine.ts) reads from context.data.applicantName and context.data.loanAmount and that production flows may instead need to validate against the incoming event payload (i.e., read from event rather than context), and add a short note advising implementers to adapt the guard to their event-driven validation needs.

docs/architecture/rbac-and-permissions.md-160-164 (1)
160-164: ⚠️ Potential issue | 🟡 Minor — Replace local checkout paths with repo-relative links.

These references point at `/Users/connor/...`, so they are broken for everyone except your machine and leak a workstation-specific path into a shared doc. Use links relative to `docs/architecture/` instead.

📝 Suggested link fix:

```diff
- - Runtime role matrix today: [src/test/auth/permissions.ts](/Users/connor/Dev/tanstackFairLend/fairlendapp/src/test/auth/permissions.ts)
- - Frontend route auth helpers: [src/lib/auth.ts](/Users/connor/Dev/tanstackFairLend/fairlendapp/src/lib/auth.ts)
- - Backend auth chains: [convex/fluent.ts](/Users/connor/Dev/tanstackFairLend/fairlendapp/convex/fluent.ts)
+ - Runtime role matrix today: [src/test/auth/permissions.ts](../../src/test/auth/permissions.ts)
+ - Frontend route auth helpers: [src/lib/auth.ts](../../src/lib/auth.ts)
+ - Backend auth chains: [convex/fluent.ts](../../convex/fluent.ts)
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@docs/architecture/rbac-and-permissions.md` around lines 160-164, replace the workstation-specific absolute paths in the "Canonical References" list with repo-relative links (relative to docs/architecture/) so the links work for everyone; update the three entries referencing src/test/auth/permissions.ts (runtime role matrix), src/lib/auth.ts (frontend route auth helpers), and convex/fluent.ts (backend auth chains) to use repository-relative paths instead of the /Users/connor/... absolute paths.

src/components/admin/financial-ledger/financial-ledger-page.tsx-1938-1975 (1)
1938-1975: ⚠️ Potential issue | 🟡 Minor — Silent no-op for ownership-ledger tab export.

When the "Export CSV" button is clicked on the ownership-ledger tab, nothing happens, since there's no handler for it. Consider either disabling the button for that tab or adding a basic export.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/admin/financial-ledger/financial-ledger-page.tsx` around lines 1938-1975, the Export CSV button's onClick covers several search.tab values but omits "ownership-ledger", causing a silent no-op; either add a branch for search.tab === "ownership-ledger" that calls downloadCsv with an appropriate CSV builder (e.g., downloadCsv("ownership-ledger.csv", buildOwnershipLedgerCsv(snapshot or relevant data))) — implement or reuse a builder function (e.g., buildOwnershipLedgerCsv) to produce CSV rows — or disable the button when search.tab === "ownership-ledger" (add disabled={search.tab === "ownership-ledger"}) and optionally show a tooltip; locate the button onClick block and the downloadCsv and builder functions (buildChartOfAccountsCsv, buildJournalLinesCsv, buildTrialBalanceCsv, buildExceptionCsv, buildOpsIncidentsCsv, buildValidationDiffCsv) to implement the chosen fix.

src/components/admin/financial-ledger/financial-ledger-page.tsx-795-804 (1)
795-804: ⚠️ Potential issue | 🟡 Minor — Missing error handling for file read.

If `file.text()` fails (e.g., permission denied, file corrupted), the promise rejection is unhandled. Consider wrapping in try-catch and providing user feedback.

🛡️ Suggested fix:

```diff
 async function handleValidationUpload(event: ChangeEvent<HTMLInputElement>) {
   const file = event.target.files?.[0];
   if (!file) {
     return;
   }
-  const text = await file.text();
-  const rows = parseValidationCsv(text);
-  setExpectedRows(rows);
-  setActiveValidationRows(rows);
+  try {
+    const text = await file.text();
+    const rows = parseValidationCsv(text);
+    setExpectedRows(rows);
+    setActiveValidationRows(rows);
+  } catch {
+    // Consider adding toast/alert notification here
+    console.error("Failed to read validation file");
+  }
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/admin/financial-ledger/financial-ledger-page.tsx` around lines 795 - 804, The file-read in handleValidationUpload currently awaits file.text() without error handling; wrap the async file.text() and subsequent parseValidationCsv(...) / setExpectedRows(...) / setActiveValidationRows(...) calls in a try-catch, handle Promise rejections, and in the catch provide user feedback (e.g., show a toast/error state) and avoid calling parseValidationCsv or the setters on failure; keep the rest of the function unchanged and reference handleValidationUpload, parseValidationCsv, setExpectedRows, and setActiveValidationRows when applying the fix.
🧹 Nitpick comments (18)
convex/payments/collectionPlan/admin.ts (1)
419-432: Restore parallel fetches in `getCollectionAttempt` to avoid unnecessary latency.

These three reads are independent and were previously safe to run concurrently.
⚡ Suggested refactor
```diff
-  const planEntry = await ctx.db.get(attempt.planEntryId);
-  const auditEvents = await auditLog.queryByResource(ctx, {
-    resourceType: "collectionAttempts",
-    resourceId: `${args.attemptId}`,
-    limit: 25,
-  });
-  const transitionJournal = await ctx.db
-    .query("auditJournal")
-    .withIndex("by_entity", (q) =>
-      q
-        .eq("entityType", "collectionAttempt")
-        .eq("entityId", `${args.attemptId}`)
-    )
-    .collect();
+  const [planEntry, auditEvents, transitionJournal] = await Promise.all([
+    ctx.db.get(attempt.planEntryId),
+    auditLog.queryByResource(ctx, {
+      resourceType: "collectionAttempts",
+      resourceId: `${args.attemptId}`,
+      limit: 25,
+    }),
+    ctx.db
+      .query("auditJournal")
+      .withIndex("by_entity", (q) =>
+        q
+          .eq("entityType", "collectionAttempt")
+          .eq("entityId", `${args.attemptId}`)
+      )
+      .collect(),
+  ]);
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@convex/payments/collectionPlan/admin.ts` around lines 419 - 432, The three independent reads (ctx.db.get(attempt.planEntryId), auditLog.queryByResource(ctx, {...}), and ctx.db.query("auditJournal")...collect()) are run sequentially causing extra latency; change getCollectionAttempt to start all three as promises and await them concurrently with Promise.all (e.g., create planEntryP, auditEventsP, transitionJournalP then const [planEntry, auditEvents, transitionJournal] = await Promise.all([...])) so the logic using planEntry, auditEvents, and transitionJournal remains the same but the fetches run in parallel; preserve existing query parameters and any error handling around these calls.
convex/payments/rotessa/api.ts (1)
219-243: Unbounded pagination loop with magic number for page size. The `findTransactionReportRow` method loops indefinitely until a match is found or `rows.length < 1000`. Consider:
- Extracting 1000 as a named constant (e.g., `ROTESSA_PAGE_SIZE`)
- Adding a maximum page limit to prevent infinite loops if the API misbehaves
♻️ Proposed safeguard for pagination
```diff
+const ROTESSA_PAGE_SIZE = 1000;
+const MAX_TRANSACTION_REPORT_PAGES = 100;
+
 async findTransactionReportRow(args: {
   endDate: string;
   providerRef: string;
   startDate: string;
 }) {
-  for (let page = 1; ; page += 1) {
+  for (let page = 1; page <= MAX_TRANSACTION_REPORT_PAGES; page += 1) {
     const rows = await this.getTransactionReport({
       startDate: args.startDate,
       endDate: args.endDate,
       status: "All",
       page,
     });
     const matched = rows.find(
       (row) =>
         row.transaction_number === args.providerRef ||
         String(row.id) === args.providerRef
     );
     if (matched) {
       return matched;
     }
-    if (rows.length < 1000) {
+    if (rows.length < ROTESSA_PAGE_SIZE) {
       return null;
     }
   }
+  return null; // Max pages reached without finding match
 }
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@convex/payments/rotessa/api.ts` around lines 219 - 243, The pagination loop in findTransactionReportRow uses a magic number (1000) and can run indefinitely; extract that value into a named constant (e.g., ROTESSA_PAGE_SIZE) and replace the literal with it, and add a maximum page guard (e.g., MAX_TRANSACTION_PAGES) to the loop that stops and returns null (or throws a clear error) if page > MAX_TRANSACTION_PAGES to protect against misbehaving getTransactionReport responses; update any related calls or tests to reference the new constants.
convex/payments/transfers/providers/rotessa.ts (1)
31-50: Consider making the transaction report date window configurable. The hardcoded 14-day lookback and 3-day lookahead window works for typical scenarios, but edge cases (delayed processing, long holidays) could cause `getStatus` to miss transactions. Consider extracting these as configurable parameters or constants for easier adjustment.
♻️ Optional: Extract date window as configurable constants
```diff
+const TRANSACTION_REPORT_LOOKBACK_DAYS = 14;
+const TRANSACTION_REPORT_LOOKAHEAD_DAYS = 3;
+
 export class RotessaTransferProvider implements TransferProvider {
   // ...
   async getStatus(providerRef: string): Promise<StatusResult> {
     const today = new Date();
     const startDate = new Date(today);
-    startDate.setDate(startDate.getDate() - 14);
+    startDate.setDate(startDate.getDate() - TRANSACTION_REPORT_LOOKBACK_DAYS);
     const endDate = new Date(today);
-    endDate.setDate(endDate.getDate() + 3);
+    endDate.setDate(endDate.getDate() + TRANSACTION_REPORT_LOOKAHEAD_DAYS);
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@convex/payments/transfers/providers/rotessa.ts` around lines 31 - 50, The getStatus method currently uses hardcoded 14-day lookback and 3-day lookahead when building startDate/endDate, which should be configurable; modify the Rotessa provider to replace the literals with configurable parameters (e.g., lookbackDays and lookaheadDays) exposed either as constructor options or class-level constants/defaults, read those values when computing startDate and endDate before calling this.apiClient.findTransactionReportRow, and keep existing behavior by defaulting to 14 and 3 if no override is provided so buildRotessaTransferStatusResult and findTransactionReportRow continue to work unchanged.
convex/payments/recurringSchedules/occurrenceIngestion.ts (1)
294-334: Two-step insert pattern for `machineContext.attemptId` could be consolidated. The collection attempt is inserted with an empty `attemptId` string (line 297), then immediately patched with the actual ID (lines 326-332). This works but adds an extra DB round-trip.
If Convex allows passing a known ID to
insert, this could be a single operation. Otherwise, this pattern is acceptable given the machine context requirement to have the attempt ID embedded.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@convex/payments/recurringSchedules/occurrenceIngestion.ts` around lines 294 - 334, The insert currently writes collectionAttempts with machineContext.attemptId set to "" then immediately patches it; to avoid the extra DB round-trip, generate the attempt id up-front (or use Convex's ability to pass a client-generated id to insert) and pass that id into args.ctx.db.insert so machineContext.attemptId is correct on first write; update references to attemptId, collectionAttempts, args.ctx.db.insert and remove the subsequent args.ctx.db.patch call (or keep it conditional if Convex does not support client ids).
convex/payments/recurringSchedules/validators.ts (1)
36-37: Verify `providerCode` extensibility and `v.any()` usage. The `providerCode` is currently hardcoded to `"pad_rotessa"` only, which aligns with the PRD's scope ("Providers beyond Rotessa in the initial delivery... out of scope"). However, when adding future providers, this validator will need extension.

The `v.any()` in `providerData` is reasonable for opaque provider-specific metadata, though it bypasses type safety. Consider documenting expected shapes per provider as a follow-up.
Verify each finding against the current code and only fix it if needed. In `@convex/payments/recurringSchedules/validators.ts` around lines 36 - 37, The validator currently locks providerCode to the single literal "pad_rotessa" and uses v.any() for providerData which reduces future extensibility and type safety; update providerCode (the symbol providerCode) to allow either a broader string or a union of provider literals so new providers can be added without changing the shape everywhere, and add a clear TODO comment near providerCode noting the need to expand to specific provider literals when new providers are introduced; for providerData (the symbol providerData) keep the optional record but replace or annotate the use of v.any() with a comment referencing expected provider-specific shapes and consider swapping v.any() for a more explicit fallback (e.g., an unknown/opaque type) later and add a short note to document provider-specific schemas as follow-up work.
convex/payments/rotessa/financialTransactions.ts (1)
44-61: Consider logging or throwing on unmapped statuses in transfer status mapping. The default case silently returns `"pending"` for any unrecognized status, which could mask unexpected provider states. The event mapping function (lines 24-42) correctly returns `undefined` for unmapped statuses, but this function does not.
♻️ Proposed fix for consistency
```diff
 export function mapRotessaFinancialStatusToTransferStatus(
   status: RotessaTransactionReportRow["status"]
 ): StatusResult["status"] {
   switch (status) {
     case "Future":
       return "pending";
     case "Pending":
       return "processing";
     case "Approved":
       return "confirmed";
     case "Declined":
       return "failed";
     case "Chargeback":
       return "reversed";
     default:
+      console.warn(`[Rotessa] Unmapped financial status: "${status}", defaulting to "pending"`);
       return "pending";
   }
 }
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@convex/payments/rotessa/financialTransactions.ts` around lines 44 - 61, The mapRotessaFinancialStatusToTransferStatus function currently returns "pending" for any unknown RotessaTransactionReportRow["status"], which can hide unexpected provider states; change it to mirror the event mapper by returning undefined (or alternatively throw/log an error) for unmapped statuses and add a descriptive log entry when an unknown status is encountered so unexpected values are surfaced; update mapRotessaFinancialStatusToTransferStatus and ensure callers handle undefined (or handle the thrown error) accordingly.
convex/payments/recurringSchedules/providers/rotessaRecurring.ts (2)
181-183: Use a more efficient approach than `.reverse().find()`. Creating a reversed copy just to find the last element matching a condition is inefficient. Consider iterating backwards or using `Array.prototype.findLast()` if available in your target runtime.
♻️ Proposed optimization
```diff
-  const lastEmittedEvent = [...emittedEvents]
-    .reverse()
-    .find((occurrenceEvent) => occurrenceEvent.occurredAt !== undefined);
+  const lastEmittedEvent = emittedEvents.findLast(
+    (occurrenceEvent) => occurrenceEvent.occurredAt !== undefined
+  );
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@convex/payments/recurringSchedules/providers/rotessaRecurring.ts` around lines 181 - 183, The code creates lastEmittedEvent by calling [...emittedEvents].reverse().find(...), which allocates a reversed copy unnecessarily; change this to iterate from the end or use Array.prototype.findLast() if your runtime supports it. Update the logic that assigns lastEmittedEvent (the variable computed from emittedEvents) to either call emittedEvents.findLast(occurrenceEvent => occurrenceEvent.occurredAt !== undefined) or implement a simple backward for-loop over emittedEvents (for (let i = emittedEvents.length - 1; i >= 0; --i) ...) and return the first occurrenceEvent with occurredAt !== undefined to avoid copying the array.
124-168: Add safety guard and extract magic number for pagination. The `while (true)` loop relies on `rows.length < 1000` to break, but this assumes a specific page size from the Rotessa API. If the API behavior changes or returns exactly 1000 rows indefinitely, this could loop excessively.
♻️ Proposed fix to add constants and safety limit
```diff
+const ROTESSA_REPORT_PAGE_SIZE = 1000;
+const MAX_POLL_PAGES = 50;
+
 // In pollOccurrenceUpdates:
 let page = 1;
 let pagesVisited = 0;
 const events: NormalizedExternalCollectionOccurrenceEvent[] = [];
 const sinceCursor = parseOccurrenceCursor(args.sinceCursor);

-while (true) {
+while (pagesVisited < MAX_POLL_PAGES) {
   const rows = await this.apiClient.getTransactionReport({
     startDate: args.startDate,
     endDate: args.endDate,
     status: "All",
     page,
   });
   pagesVisited += 1;

   // ... filtering logic ...

-  if (rows.length < 1000) {
+  if (rows.length < ROTESSA_REPORT_PAGE_SIZE) {
     break;
   }
   page += 1;
 }
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@convex/payments/recurringSchedules/providers/rotessaRecurring.ts` around lines 124 - 168, The polling loop uses while(true) and a magic number 1000 to detect the last page which can cause infinite loops if the API page size changes; introduce a PAGE_SIZE constant (used instead of 1000) and a MAX_PAGES or MAX_ITERATIONS safety limit, replace while(true) with a bounded loop (or check page >= MAX_PAGES) around the calls to apiClient.getTransactionReport, and if the max is reached log/throw an error or break; keep existing filtering logic (buildNormalizedOccurrenceFromRotessaRow, buildOccurrenceSortKey, sinceCursor) and push into events as before but use PAGE_SIZE to decide when to stop and enforce the MAX_PAGES guard to prevent unbounded paging.
convex/payments/recurringSchedules/poller.ts (2)
135-139: Consider logging when lease owner doesn't match. Both `recordExternalCollectionScheduleSyncSuccess` and `recordExternalCollectionScheduleSyncFailure` silently return when the lease owner doesn't match. While this prevents stale updates, silent returns could mask debugging issues. Consider adding a log or returning a status indicator.
♻️ Proposed fix to add logging
```diff
 .handler(async (ctx, args) => {
   const schedule = await ctx.db.get(args.scheduleId);
   if (!schedule || schedule.syncLeaseOwner !== args.leaseOwner) {
+    if (schedule && schedule.syncLeaseOwner !== args.leaseOwner) {
+      console.info(
+        `[Poller] Lease owner mismatch for schedule ${args.scheduleId}: expected ${args.leaseOwner}, found ${schedule.syncLeaseOwner}`
+      );
+    }
     return;
   }
```
Also applies to: 176-180
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@convex/payments/recurringSchedules/poller.ts` around lines 135 - 139, The functions recordExternalCollectionScheduleSyncSuccess and recordExternalCollectionScheduleSyncFailure currently return silently when the fetched schedule is missing or schedule.syncLeaseOwner !== args.leaseOwner; modify both handlers to log an informational/warning message in that branch including the scheduleId, provided leaseOwner (args.leaseOwner) and the actual schedule.syncLeaseOwner (or "not found") so stale-lease cases are visible in logs — update the .handler(async (ctx, args) => { ... }) blocks to call the logger on the early-return path (use the existing ctx.logger or equivalent) and keep the early return behavior after logging.
216-226: Consider importing function references directly instead of casting. The `makeFunctionReference(...) as unknown as FunctionReference<"mutation">` pattern is error-prone. Consider importing the mutations directly or using the generated `internal` API object for type-safe references.
♻️ Proposed approach using internal API
```diff
+import { internal } from "../../_generated/api";

 // Then in the handler:
-  const claimed = (await ctx.runMutation(
-    makeFunctionReference(
-      "payments/recurringSchedules/poller:claimExternalCollectionScheduleSync"
-    ) as unknown as FunctionReference<"mutation">,
-    {
-      asOf,
-      leaseOwner,
-      leaseTtlMs: POLL_LEASE_MS,
-      scheduleId: candidate._id,
-    }
-  )) as { claimed: boolean };
+  const claimed = await ctx.runMutation(
+    internal.payments.recurringSchedules.poller.claimExternalCollectionScheduleSync,
+    {
+      asOf,
+      leaseOwner,
+      leaseTtlMs: POLL_LEASE_MS,
+      scheduleId: candidate._id,
+    }
+  );
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@convex/payments/recurringSchedules/poller.ts` around lines 216 - 226, Replace the unsafe cast of makeFunctionReference in the ctx.runMutation call by using a type-safe reference: import or use the generated internal API's function reference for the "claimExternalCollectionScheduleSync" mutation instead of makeFunctionReference(... ) as unknown as FunctionReference<"mutation">; update the call site that currently uses makeFunctionReference("payments/recurringSchedules/poller:claimExternalCollectionScheduleSync") so it passes the imported/generated function reference (or internal.<module>.<mutation>) directly and remove the cast to ensure compile-time type safety for ctx.runMutation and the claimed result.
convex/payments/recurringSchedules/queries.ts (2)
220-248: Add a safety limit to prevent unbounded pagination loops. The `while (eligible.length < targetCount)` loop relies on `isDone` to terminate, but if the provider returns pages indefinitely with all items filtered out (e.g., all leased), this could loop excessively.
🛡️ Proposed fix to add max iterations guard
```diff
 const collectEligibleSchedules = async (
   status: "active" | "sync_error",
   targetCount: number
 ) => {
   const eligible: Doc<"externalCollectionSchedules">[] = [];
   let cursor: string | null = null;
   const pageSize = Math.max(25, Math.min(targetCount * 3, 100));
+  const maxIterations = 20;
+  let iterations = 0;

-  while (eligible.length < targetCount) {
+  while (eligible.length < targetCount && iterations < maxIterations) {
+    iterations += 1;
     const { continueCursor, isDone, page } = await ctx.db
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@convex/payments/recurringSchedules/queries.ts` around lines 220 - 248, The loop that paginates externalCollectionSchedules (while (eligible.length < targetCount)) can spin indefinitely if pages keep returning filtered-out items; add a max-iteration guard (e.g., const MAX_PAGES = 100 or a passed-in maxPages) and a counter (iterations++ each loop) inside the same scope as eligible/cursor; after incrementing, if iterations >= MAX_PAGES return the current eligible (or throw/log a clear error) to break out safely; ensure the change is applied around the paginate call and references the same cursor/continueCursor logic so behavior is unchanged when the provider is well-behaved.
49-51: Consider making `asOf` required or documenting determinism implications. Using `Date.now()` as a default inside a query handler can cause non-deterministic behavior across retries or when the query is cached. The caller should ideally always provide `asOf` explicitly for consistent results. The current implementation works but relies on callers understanding this subtlety.
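A minimal standalone sketch of the "make `asOf` required" option; the helper name `requireAsOf` is illustrative, not part of this codebase:

```typescript
// Hypothetical boundary helper: fail fast instead of silently defaulting to
// Date.now() inside the query handler, so retries and caching stay deterministic.
function requireAsOf(args: { asOf?: number }): number {
  if (args.asOf === undefined) {
    throw new Error(
      "asOf is required: pass an explicit timestamp for deterministic queries"
    );
  }
  return args.asOf;
}

// The caller picks the clock once and threads the same value through retries.
const asOf = requireAsOf({ asOf: 1_700_000_000_000 });
```

Callers that previously relied on the implicit default then surface immediately as errors instead of returning subtly inconsistent results.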
Verify each finding against the current code and only fix it if needed. In `@convex/payments/recurringSchedules/queries.ts` around lines 49 - 51, The handler currently uses a non-deterministic default (const asOf = args.asOf ?? Date.now()) which can cause inconsistent query results; change this so callers must provide asOf explicitly by removing the Date.now() fallback, update the args type (e.g., the RecurringSchedules query arg/interface) to make asOf required, add a defensive check in the .handler(async (ctx, args) => { ... }) that throws a clear error if args.asOf is missing, and then update all call sites and any docs/comments near this query to reflect that asOf is mandatory (or alternatively, if you keep it optional, add a clear comment/docblock on the handler explaining the determinism implications and when Date.now() will be used).
convex/payments/webhooks/rotessaPad.ts (2)
136-156: `shouldAttemptProviderManagedOccurrenceIngestion` has redundant conditions. The function checks specific Rotessa statuses AND normalized event types, but if the event type maps to a normalized type, the second check is always true. Consider simplifying.
♻️ Proposed simplification
```diff
 function shouldAttemptProviderManagedOccurrenceIngestion(
   eventType: string,
-  normalizedEventType: NormalizedTransferWebhookEventType
+  _normalizedEventType: NormalizedTransferWebhookEventType
 ) {
+  // Attempt provider-managed ingestion for raw Rotessa lifecycle statuses
+  // or transaction-style events that might be from recurring schedules
   if (
     eventType === "Future" ||
     eventType === "Pending" ||
     eventType === "Approved" ||
     eventType === "Declined" ||
-    eventType === "Chargeback"
+    eventType === "Chargeback" ||
+    eventType.startsWith("transaction.")
   ) {
     return true;
   }
-
-  return (
-    normalizedEventType === "PROCESSING_UPDATE" ||
-    normalizedEventType === "FUNDS_SETTLED" ||
-    normalizedEventType === "TRANSFER_FAILED" ||
-    normalizedEventType === "TRANSFER_REVERSED"
-  );
+  return false;
 }
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@convex/payments/webhooks/rotessaPad.ts` around lines 136 - 156, The function shouldAttemptProviderManagedOccurrenceIngestion has redundant raw-status checks because those raw eventType values already map to the normalizedEventType values checked later; fix by removing the explicit eventType === "Future" | "Pending" | "Approved" | "Declined" | "Chargeback" checks and rely on normalizedEventType membership instead, or if normalizedEventType may be missing, first compute it from eventType (e.g. call your normalize function) and then return whether normalizedEventType is one of "PROCESSING_UPDATE", "FUNDS_SETTLED", "TRANSFER_FAILED", or "TRANSFER_REVERSED"; update the function signature/logic around shouldAttemptProviderManagedOccurrenceIngestion(eventType, normalizedEventType) accordingly.
375-393: External API call in webhook processing adds latency and failure points. The `findRotessaFinancialTransactionForWebhook` function makes a synchronous API call to Rotessa during webhook processing. While the error is caught and falls back to direct transfer processing (lines 580-585), this adds latency to every provider-managed webhook. Consider whether this lookup could be cached or deferred.
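If caching is the route taken, a rough sketch of a TTL memo keyed by provider reference; all names here are illustrative, and `lookupFn` stands in for the real `findTransactionReportRow` round trip:

```typescript
// Hypothetical in-memory TTL cache so repeated webhook deliveries for the
// same transaction skip the provider API call within the TTL window.
type CacheEntry<T> = { expiresAt: number; value: T };

function createTtlCache<T>(ttlMs: number, now: () => number = Date.now) {
  const entries = new Map<string, CacheEntry<T>>();
  return {
    async getOrFetch(key: string, lookupFn: () => Promise<T>): Promise<T> {
      const hit = entries.get(key);
      if (hit && hit.expiresAt > now()) {
        return hit.value; // cache hit: no provider round trip
      }
      const value = await lookupFn();
      entries.set(key, { expiresAt: now() + ttlMs, value });
      return value;
    },
  };
}
```

Deferring the lookup to a background job instead would remove the latency entirely, at the cost of the webhook no longer reconciling synchronously.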
Verify each finding against the current code and only fix it if needed. In `@convex/payments/webhooks/rotessaPad.ts` around lines 375 - 393, The webhook handler currently calls findRotessaFinancialTransactionForWebhook which synchronously invokes RotessaApiClient.findTransactionReportRow and adds latency/failure risk; change the flow so the webhook responds immediately (200) and the Rotessa lookup is performed asynchronously: either (A) add a cache layer inside findRotessaFinancialTransactionForWebhook (e.g., Redis/memo keyed by providerRef+date with a short TTL) to return cached results when available and only call Rotessa when cache-miss, or (B) remove the direct call from the webhook path and instead enqueue a background job/task (via your existing job queue) that calls findRotessaFinancialTransactionForWebhook and reconciles the transfer; ensure the webhook handler enqueues the job and handles enqueuing errors separately without blocking the response, and retain the existing error fallback logic in the background worker.
convex/payments/recurringSchedules/__tests__/providerManaged.integration.test.ts (1)
27-43: Consider using Vitest's `vi.stubEnv` instead of direct process.env manipulation. Direct manipulation of `globalThis.process.env` can cause test isolation issues. Vitest provides `vi.stubEnv()` for safer environment variable stubbing.
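For context, `vi.stubEnv` wraps the save/restore pattern sketched below; this is an illustrative standalone helper, not the Vitest implementation:

```typescript
// Hypothetical helper mirroring the stub-and-restore contract: returns a
// restore function that reinstates the prior state, including "was unset".
function stubEnv(key: string, value: string): () => void {
  const hadKey = Object.prototype.hasOwnProperty.call(process.env, key);
  const previous = process.env[key];
  process.env[key] = value;
  return () => {
    if (hadKey) {
      process.env[key] = previous;
    } else {
      delete process.env[key]; // restore "unset", not an empty string
    }
  };
}
```

In Vitest, calling `vi.unstubAllEnvs()` in `afterEach` (or setting the `unstubEnvs` test option) gives the same cleanup without manual bookkeeping.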
Verify each finding against the current code and only fix it if needed. In `@convex/payments/recurringSchedules/__tests__/providerManaged.integration.test.ts` around lines 27 - 43, The test currently manipulates globalThis.process/env via testGlobal and previousEnv which can break isolation; replace that manual setup with Vitest's vi.stubEnv to set the required env vars (e.g., DISABLE_CASH_LEDGER_HASHCHAIN, DISABLE_GT_HASHCHAIN, ROTESSA_API_KEY) inside your test lifecycle (e.g., call vi.stubEnv(...) in beforeEach to apply the stub and call the returned restore function or restore it in afterEach); remove the testGlobal and previousEnv bookkeeping and any direct writes to process.env in this file so tests rely on vi.stubEnv for safe, isolated environment stubbing.
src/components/admin/financial-ledger/ui.tsx (1)
1-2: Redundant React type imports. Both `type * as React` and `type { ReactNode }` are imported. The namespace import is only used for `React.ComponentProps`, while `ReactNode` is used separately.
♻️ Suggested consolidation
```diff
-import type * as React from "react";
-import type { ReactNode } from "react";
+import type { ComponentProps, ReactNode } from "react";
```
Then update usages of `React.ComponentProps` to just `ComponentProps`.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/admin/financial-ledger/ui.tsx` around lines 1 - 2, Remove the redundant namespace import and consolidate type imports: replace the dual imports with a single named import for React types (e.g., import type { ComponentProps, ReactNode } from "react") and update all usages of React.ComponentProps to ComponentProps (keep ReactNode references as-is). Specifically edit the top-level import lines and change any occurrences of React.ComponentProps<...> in this file to ComponentProps<...>.
src/components/admin/financial-ledger/financial-ledger-page.tsx (2)
191-215: Consider stable row identifiers. The fallback to `row-${index}` can cause React key instability if rows are reordered or filtered, potentially leading to incorrect row selection state. If the data source can guarantee a unique identifier field, prefer requiring it.
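When no natural id exists, a deterministic fallback can be derived from immutable fields instead of the array index; a sketch under assumed field names (`entryId`, `date`, `accountCode`, `amountCents` are illustrative, not the actual ledger row shape):

```typescript
// 32-bit FNV-1a hash: the same input string always yields the same id,
// so the key survives sorting and filtering, unlike an array index.
function fnv1a(input: string): string {
  let hash = 0x811c9dc5;
  for (let i = 0; i < input.length; i += 1) {
    hash ^= input.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193) >>> 0;
  }
  return hash.toString(16);
}

type LedgerRowLike = {
  entryId?: string;
  date: string;
  accountCode: string;
  amountCents: number;
};

function buildStableRowId(row: LedgerRowLike): string {
  if (row.entryId) {
    return row.entryId; // always prefer an explicit unique identifier
  }
  return `row-${fnv1a(`${row.date}|${row.accountCode}|${row.amountCents}`)}`;
}
```

Note this can still collide if two rows share every identifying field, which is why requiring an explicit id remains the stronger fix.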
Verify each finding against the current code and only fix it if needed. In `@src/components/admin/financial-ledger/financial-ledger-page.tsx` around lines 191 - 215, The buildRowId function falls back to an unstable index-based id ("row-{index}") which causes React key/selection instability when rows are reordered or filtered; update buildRowId to require or prefer a stable unique identifier (e.g., obligationId/obligation_id or transferId/transfer_id etc.) and if no explicit unique id exists, derive a deterministic stable id (for example by hashing a concatenation of immutable identifying fields from the row) instead of using the index; locate the function buildRowId and replace the index fallback with a deterministic stable-id strategy or throw/validate to ensure callers provide a unique id.
676-689: Overly broad useMemo dependency. The `search` object is passed as a dependency, but only specific fields (`dateFrom`, `dateTo`, `mortgageId`, `search`, `showOnlyExceptions`, `type`) are used. This causes unnecessary recalculations when unrelated search fields change.
♻️ Suggested fix
```diff
 const register = useMemo(
   () =>
     buildAccountRegisterRows({
       account: selectedAccount,
       dateFrom: search.dateFrom,
       dateTo: search.dateTo,
       journalLines: snapshot.journalLines,
       mortgageId: search.mortgageId,
       search: search.search,
       showOnlyCorrections: search.showOnlyExceptions,
       type: search.type,
     }),
-  [search, selectedAccount, snapshot.journalLines]
+  [
+    search.dateFrom,
+    search.dateTo,
+    search.mortgageId,
+    search.search,
+    search.showOnlyExceptions,
+    search.type,
+    selectedAccount,
+    snapshot.journalLines,
+  ]
 );
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/admin/financial-ledger/financial-ledger-page.tsx` around lines 676 - 689, The useMemo for `register` currently depends on the whole `search` object which triggers unnecessary recalculations; update the dependency array for the `useMemo` that builds `register` (calling `buildAccountRegisterRows`) to list only the specific `search` fields used (e.g., `search.dateFrom`, `search.dateTo`, `search.mortgageId`, `search.search`, `search.showOnlyExceptions`, `search.type`) along with `selectedAccount` and `snapshot.journalLines` so the memo only invalidates when those actual inputs change.
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 59f79bbb-2d63-4b60-badf-d867e02d1ce8
⛔ Files ignored due to path filters (2)
- `bun.lock` is excluded by `!**/*.lock`
- `convex/_generated/api.d.ts` is excluded by `!**/_generated/**`
📒 Files selected for processing (98)
.husky/pre-commit, AGENTS.md, CLAUDE.md, convex/crons.ts, convex/demo/__tests__/ampsE2e.test.ts, convex/demo/__tests__/ampsExecutionModes.test.ts, convex/demo/ampsExecutionModes.ts, convex/fluent.ts, convex/payments/adminDashboard/queries.ts, convex/payments/collectionPlan/__tests__/runner.test.ts, convex/payments/collectionPlan/admin.ts, convex/payments/collectionPlan/execution.ts, convex/payments/collectionPlan/executionContract.ts, convex/payments/collectionPlan/initialScheduling.ts, convex/payments/collectionPlan/mutations.ts, convex/payments/collectionPlan/queries.ts, convex/payments/collectionPlan/runner.ts, convex/payments/recurringSchedules/__tests__/providerManaged.integration.test.ts, convex/payments/recurringSchedules/activation.ts, convex/payments/recurringSchedules/occurrenceIngestion.ts, convex/payments/recurringSchedules/poller.ts, convex/payments/recurringSchedules/providers/registry.ts, convex/payments/recurringSchedules/providers/rotessaRecurring.ts, convex/payments/recurringSchedules/queries.ts, convex/payments/recurringSchedules/types.ts, convex/payments/recurringSchedules/validators.ts, convex/payments/rotessa/api.ts, convex/payments/rotessa/financialTransactions.ts, convex/payments/transfers/providers/registry.ts, convex/payments/transfers/providers/rotessa.ts, convex/payments/transfers/types.ts, convex/payments/webhooks/handleReversal.ts, convex/payments/webhooks/rotessaPad.ts, convex/schema.ts, convex/test/moduleMaps.ts, docs/architecture/rbac-and-permissions.md, docs/architecture/state-machines.md, docs/architecture/unified-payment-rails-technical-design.md, docs/cash-ledger-developer-guide.md, docs/reviews/2026-04-08-security-rbac-review.md, docs/superpowers/plans/2026-04-14-admin-financial-ledger-and-payment-operations.md, docs/technical-design/unified-payment-rails.md, docs/ui-design/admin-financial-ledger-and-payment-operations-screen-spec.md, package.json, reviews/04-11-provider_managed_payment_schedule/74b377d61.md, specs/15-support-provider-managed-recurring-collection-schedules/PRD.md, specs/15-support-provider-managed-recurring-collection-schedules/design.md, specs/15-support-provider-managed-recurring-collection-schedules/gap-analysis.md, specs/15-support-provider-managed-recurring-collection-schedules/tasks.md, src/components/admin/financial-ledger/actions.tsx, src/components/admin/financial-ledger/csv.ts, src/components/admin/financial-ledger/financial-ledger-page.tsx, src/components/admin/financial-ledger/format.ts, src/components/admin/financial-ledger/payment-operations-page.tsx, src/components/admin/financial-ledger/search.ts, src/components/admin/financial-ledger/types.ts, src/components/admin/financial-ledger/ui.tsx, src/components/admin/financial-ledger/validation.ts, src/components/admin/shell/AdminBreadcrumbs.tsx, src/components/admin/shell/entity-registry.ts, src/components/demo/amps/hooks.ts, src/components/document-engine/pdf-designer.tsx, src/hooks/use-app-auth.ts, src/hooks/use-can-do.ts, src/lib/auth-policy.ts, src/lib/auth.ts, src/routeTree.gen.ts, src/routes/__root.tsx, src/routes/admin/financial-ledger.tsx, src/routes/admin/payment-operations.tsx, src/routes/demo.listings.tsx, src/routes/demo/amps/-execution-modes.tsx, src/routes/demo/amps/-index.tsx, src/routes/demo/broker-whitelabel/-components/BrokerWhiteLabelPages.tsx, src/routes/demo/broker-whitelabel/-components/MortgageApplicationSection.tsx, src/routes/demo/broker-whitelabel/-components/broker-whitelabel.css, src/routes/demo/broker-whitelabel/-lib/mock-data.ts, src/routes/demo/broker-whitelabel/-lib/store.ts, src/routes/demo/broker-whitelabel/-lib/types.ts, src/routes/demo/broker-whitelabel/auth.tsx, src/routes/demo/broker-whitelabel/index.tsx, src/routes/demo/broker-whitelabel/listings/$listingId.tsx, src/routes/demo/broker-whitelabel/listings/index.tsx, src/routes/demo/broker-whitelabel/listings/route.tsx, src/routes/demo/broker-whitelabel/onboarding.tsx, src/routes/demo/broker-whitelabel/route.tsx, src/routes/demo/document-engine/designer.$templateId.tsx, src/routes/demo/document-engine/route.tsx, src/routes/demo/rbac-auth/audit.tsx, src/routes/demo/rbac-auth/onboarding.tsx, src/test/admin/admin-shell.test.ts, src/test/admin/financial-ledger-utils.test.ts, src/test/auth/auth-policy.test.ts, src/test/auth/middleware/authMiddleware.test.ts, src/test/auth/middleware/requirePermission.test.ts, src/test/convex/payments/helpers.ts, src/test/convex/payments/rotessaManagedRecurringLifecycle.test.ts, src/test/demo/broker-whitelabel.test.ts
Actionable comments posted: 1
🧹 Nitpick comments (7)
convex/payments/collectionPlan/admin.ts (1)
419-433: Parallelize independent reads in `getCollectionAttempt` to avoid extra latency. At Line 420 and Line 425, the audit log and transition journal fetches are independent, so awaiting them serially adds unnecessary request time.
⚡ Suggested refactor
```diff
-    const planEntryPromise = ctx.db.get(attempt.planEntryId);
-    const auditEvents = await auditLog.queryByResource(ctx, {
-      resourceType: "collectionAttempts",
-      resourceId: `${args.attemptId}`,
-      limit: 25,
-    });
-    const transitionJournal = await ctx.db
-      .query("auditJournal")
-      .withIndex("by_entity", (q) =>
-        q
-          .eq("entityType", "collectionAttempt")
-          .eq("entityId", `${args.attemptId}`)
-      )
-      .collect();
-    const planEntry = await planEntryPromise;
+    const [planEntry, auditEvents, transitionJournal] = await Promise.all([
+      ctx.db.get(attempt.planEntryId),
+      auditLog.queryByResource(ctx, {
+        resourceType: "collectionAttempts",
+        resourceId: `${args.attemptId}`,
+        limit: 25,
+      }),
+      ctx.db
+        .query("auditJournal")
+        .withIndex("by_entity", (q) =>
+          q
+            .eq("entityType", "collectionAttempt")
+            .eq("entityId", `${args.attemptId}`)
+        )
+        .collect(),
+    ]);
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@convex/payments/collectionPlan/admin.ts` around lines 419 - 433, The audit log and transition journal reads in getCollectionAttempt are awaited serially causing extra latency; instead kick off all independent reads (start planEntryPromise as you already do, then start auditEventsPromise via auditLog.queryByResource(...) and transitionJournalPromise via ctx.db.query("auditJournal").withIndex(...).collect()) without awaiting, then await them together (e.g., Promise.all) and assign results to auditEvents, transitionJournal, and planEntry; update references to use these variables (planEntryPromise, auditEventsPromise, transitionJournalPromise) so the three independent requests run in parallel.

convex/payments/adminDashboard/queries.ts (2)
989-1005: Consider pagination or filters as data volume grows.

The query collects entire tables (`obligations`, `collectionAttempts`, `collectionPlanEntries`, `transferRequests`, `cash_ledger_accounts`, `cash_ledger_journal_entries`) into memory. While acceptable for initial deployment, this pattern will hit Convex runtime and response-size limits as production data accumulates.

The filtered query pattern on lines 1119-1122 (`withIndex("by_status", (q) => q.eq("status", "sync_error"))`) shows the direction: apply server-side filters and consider cursor-based pagination for row-heavy tabs.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@convex/payments/adminDashboard/queries.ts` around lines 989 - 1005, The handler getPaymentOperationsDashboardSnapshot is collecting entire tables into memory (calls like ctx.db.query("obligations").collect(), ctx.db.query("collectionAttempts").collect(), ctx.db.query("collectionPlanEntries").collect(), ctx.db.query("transferRequests").collect(), ctx.db.query("cash_ledger_accounts").collect()), which will break as data grows; update these queries to apply server-side filters and/or cursor-based pagination (use .withIndex(...).eq(...) or range queries and a limit) and return paged results or aggregated summaries instead of full .collect() for heavy tables, ensuring you preserve the same shape the caller expects (or add pagination tokens) so large result sets do not hit Convex runtime/response-size limits.
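The cursor-based direction can be sketched independently of Convex. A minimal, self-contained TypeScript illustration of keyset-style cursor pagination (all names here are hypothetical stand-ins, not the actual Convex API — Convex provides its own `.paginate` helper):

```typescript
// Hypothetical sketch of cursor pagination: the cursor is the last-seen sort
// key, so each page applies a filter and a bounded slice instead of
// collecting the whole table into memory.
type Attempt = { id: string; status: string };

function pageByCursor(
  rows: Attempt[], // stands in for an indexed table
  status: string, // server-side filter, like withIndex("by_status")
  cursor: string | null, // id of the last row on the previous page
  numItems: number
): { page: Attempt[]; nextCursor: string | null } {
  const filtered = rows
    .filter((r) => r.status === status)
    .sort((a, b) => a.id.localeCompare(b.id));
  let begin = 0;
  if (cursor !== null) {
    const after = cursor;
    const idx = filtered.findIndex((r) => r.id > after);
    begin = idx === -1 ? filtered.length : idx;
  }
  const page = filtered.slice(begin, begin + numItems);
  const nextCursor =
    begin + numItems < filtered.length ? page[page.length - 1].id : null;
  return { page, nextCursor };
}

const attempts: Attempt[] = [
  { id: "a1", status: "sync_error" },
  { id: "a2", status: "settled" },
  { id: "a3", status: "sync_error" },
  { id: "a4", status: "sync_error" },
];
const first = pageByCursor(attempts, "sync_error", null, 2);
console.log(first.page.map((r) => r.id), first.nextCursor); // [ 'a1', 'a3' ] a3
const second = pageByCursor(attempts, "sync_error", first.nextCursor, 2);
console.log(second.page.map((r) => r.id), second.nextCursor); // [ 'a4' ] null
```

The key property is that each request touches at most `numItems` rows past the cursor, so response size stays bounded regardless of table growth.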
159-174: Optional: Parallelize document fetches.
`fetchDocsByIds` awaits each `ctx.db.get(id)` sequentially. While Convex optimizes reads internally, you could batch these explicitly for marginally better throughput:

♻️ Optional parallel fetch

```diff
 async function fetchDocsByIds<
   TableName extends TableNames,
   TDoc extends { _id: Id<TableName> },
 >(
   ids: readonly Id<TableName>[],
   loadDoc: (id: Id<TableName>) => Promise<TDoc | null>
 ) {
   const docsById = new Map<string, TDoc>();
-  for (const id of ids) {
-    const doc = await loadDoc(id);
-    if (doc) {
-      docsById.set(String(doc._id), doc);
-    }
-  }
+  const docs = await Promise.all(ids.map(loadDoc));
+  for (const doc of docs) {
+    if (doc) {
+      docsById.set(String(doc._id), doc);
+    }
+  }
   return docsById;
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@convex/payments/adminDashboard/queries.ts` around lines 159 - 174, fetchDocsByIds currently awaits loadDoc for each id sequentially which can be slower; change it to kick off all loadDoc calls concurrently (map ids to Promise<TDoc|null> using loadDoc), await Promise.all on that array, then iterate results to populate the existing docsById Map (use String(doc._id) as before). Update references to the function fetchDocsByIds and its loadDoc parameter only—no other behavior changes.

src/components/admin/financial-ledger/financial-ledger-page.tsx (2)
637-666: Narrow useMemo dependencies to avoid unnecessary recalculation.

The `filteredAccounts` memo uses the entire `search` object as a dependency, but only reads specific fields (`status`, `mortgageId`, `borrowerId`, `lenderId`, `showOnlyExceptions`, `search`). Destructuring the needed fields would make the dependency clearer and avoid recalculation when unrelated search fields change (like `tab`).

♻️ Narrow dependency array

```diff
 const filteredAccounts = useMemo(
   () =>
     snapshot.chartOfAccounts.filter((row) => {
-      if (search.status && row.accountFamily !== search.status) {
+      if (accountFamilyFilter && row.accountFamily !== accountFamilyFilter) {
         return false;
       }
       // ... rest of filter logic
     }),
-  [search, snapshot.chartOfAccounts]
+  [
+    search.status,
+    search.mortgageId,
+    search.borrowerId,
+    search.lenderId,
+    search.showOnlyExceptions,
+    search.search,
+    snapshot.chartOfAccounts,
+  ]
 );
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/admin/financial-ledger/financial-ledger-page.tsx` around lines 637 - 666, The useMemo for filteredAccounts currently depends on the entire search object causing unnecessary recalculations; destructure only the used fields (status, mortgageId, borrowerId, lenderId, showOnlyExceptions, search) from search and replace the dependency with those specific variables plus snapshot.chartOfAccounts so filteredAccounts recomputes only when those actual inputs change; update the useMemo signature and its dependency array accordingly while keeping the filter logic in the same function.
806-831: Consider constructing the validation row directly.
`addManualValidationRow` builds a CSV string only to parse it back into a row object. Constructing the `ValidationExpectedRow` directly would be simpler and avoid CSV parsing edge cases (e.g., commas in values).

♻️ Direct row construction

```diff
 function addManualValidationRow() {
-  const parsedAmount = validationDraft.expectedAmount.trim();
-  const rows = parseValidationCsv(
-    [
-      "subject_type,subject_id,metric,effective_date,expected_amount",
-      [
-        validationDraft.subjectType,
-        validationDraft.subjectId,
-        validationDraft.metric,
-        validationDraft.effectiveDate,
-        parsedAmount,
-      ].join(","),
-    ].join("\n")
-  );
-  if (rows.length === 0) {
-    return;
-  }
+  const { subjectType, subjectId, metric, effectiveDate, expectedAmount } =
+    validationDraft;
+  if (!subjectType.trim() || !subjectId.trim()) {
+    return;
+  }
+  const amountCents = parseCurrencyInputToCents(expectedAmount);
+  const row: ValidationExpectedRow = {
+    subjectType: subjectType.trim(),
+    subjectId: subjectId.trim(),
+    metric: metric.trim() || "closing_balance",
+    effectiveDate: effectiveDate || undefined,
+    expectedAmountCents: amountCents,
+  };
-  setExpectedRows((current) => [...current, ...rows]);
+  setExpectedRows((current) => [...current, row]);
   setValidationDraft({
     effectiveDate: "",
     expectedAmount: "",
     metric: validationDraft.metric,
     subjectId: "",
     subjectType: validationDraft.subjectType,
   });
 }
```

Note: This requires importing `parseCurrencyInputToCents` or a similar helper, and knowing the exact shape of `ValidationExpectedRow`.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/admin/financial-ledger/financial-ledger-page.tsx` around lines 806 - 831, The function addManualValidationRow currently builds a CSV and re-parses it via parseValidationCsv which is unnecessary and brittle; instead construct a ValidationExpectedRow object directly (use validationDraft fields and convert expectedAmount with parseCurrencyInputToCents or the existing currency helper) and push it via setExpectedRows((cur)=>[...cur, newRow]); then reset validationDraft using setValidationDraft as currently done. Replace the parseValidationCsv call and CSV construction with direct object creation and ensure the newRow matches the ValidationExpectedRow shape used elsewhere.

src/components/admin/financial-ledger/actions.tsx (2)
48-59: Consider memoizing the reset callback to stabilize effect dependencies.

The `reset` callback in the `useEffect` dependency array is an inline function, so its identity changes on every render. While the effect guards with `!open`, the effect still re-runs unnecessarily. This is minor since the behavior is correct, but memoizing with `useCallback` in the consuming components would be cleaner.

♻️ Optional: Stabilize reset callback in consumers

In `WaiveBalanceDialog` and `WriteOffBalanceDialog`:

```diff
+import { useCallback } from "react";
 ...
+const resetForm = useCallback(() => {
+  setAmountInput(centsInputValue(defaultAmountCents));
+  setReason("");
+  // For WriteOffBalanceDialog, also: setIdempotencyKey(crypto.randomUUID());
+}, [defaultAmountCents]);
+
-const { open, setOpen, setSubmitting, submitting } = useDialogState(() => {
-  setAmountInput(centsInputValue(defaultAmountCents));
-  setReason("");
-});
+const { open, setOpen, setSubmitting, submitting } = useDialogState(resetForm);
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/admin/financial-ledger/actions.tsx` around lines 48 - 59, The effect in useDialogState re-runs when the inline reset prop identity changes; stabilize it by memoizing the reset callback in the consuming components (e.g., WaiveBalanceDialog and WriteOffBalanceDialog) using React.useCallback so the reference passed into useDialogState remains stable, then pass that memoized reset to useDialogState; no changes needed inside useDialogState itself other than relying on the stable reset prop in its dependency array.
61-107: Consider adding idempotency key for waiver operations.
`WaiveBalanceDialog` doesn't pass `idempotencyKey` even though the mutation accepts it optionally (per context snippet 2). While not a bug, this creates inconsistent retry semantics between waive and write-off operations. A network timeout followed by retry could create duplicate waivers.

♻️ Add idempotency key to WaiveBalanceDialog

```diff
 export function WaiveBalanceDialog({
   defaultAmountCents,
   obligationId,
 }: {
   defaultAmountCents: number;
   obligationId: Id<"obligations">;
 }) {
   const waiveBalance = useMutation(
     api.payments.cashLedger.mutations.waiveObligationBalance
   );
   const [amountInput, setAmountInput] = useState(
     centsInputValue(defaultAmountCents)
   );
   const [reason, setReason] = useState("");
+  const [idempotencyKey, setIdempotencyKey] = useState(() =>
+    crypto.randomUUID()
+  );
   const { open, setOpen, setSubmitting, submitting } = useDialogState(() => {
     setAmountInput(centsInputValue(defaultAmountCents));
     setReason("");
+    setIdempotencyKey(crypto.randomUUID());
   });
   async function handleSubmit() {
     ...
     try {
       await waiveBalance({
         amount,
+        idempotencyKey,
         obligationId,
         reason: reason.trim(),
       });
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/admin/financial-ledger/actions.tsx` around lines 61 - 107, WaiveBalanceDialog should generate and pass an idempotency key to the waiveObligationBalance mutation to avoid duplicate waivers: add an idempotencyKey (e.g., crypto.randomUUID() or uuid.v4()) to component state or a ref, reset/regenerate that key when the dialog is opened (inside the useDialogState initializer that currently resets amountInput and reason), and include that idempotencyKey in the payload you pass to waiveBalance in handleSubmit; reference the WaiveBalanceDialog component, the useDialogState initializer, the handleSubmit function, and the waiveObligationBalance / waiveBalance mutation when making these changes.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 020b48b6-0389-47b9-84a1-8bad4ecb2a48
📒 Files selected for processing (26)
- .husky/pre-commit
- convex/demo/__tests__/ampsExecutionModes.test.ts
- convex/demo/ampsExecutionModes.ts
- convex/payments/adminDashboard/__tests__/queries.test.ts
- convex/payments/adminDashboard/queries.ts
- convex/payments/collectionPlan/__tests__/execution.test.ts
- convex/payments/collectionPlan/admin.ts
- convex/payments/collectionPlan/execution.ts
- convex/payments/recurringSchedules/__tests__/providerManaged.integration.test.ts
- convex/payments/recurringSchedules/activation.ts
- convex/payments/transfers/types.ts
- convex/test/moduleMaps.ts
- docs/architecture/rbac-and-permissions.md
- src/components/admin/financial-ledger/actions.tsx
- src/components/admin/financial-ledger/csv.ts
- src/components/admin/financial-ledger/financial-ledger-page.tsx
- src/components/admin/financial-ledger/format.ts
- src/components/admin/financial-ledger/payment-operations-page.tsx
- src/components/admin/financial-ledger/types.ts
- src/components/admin/financial-ledger/validation.ts
- src/components/document-engine/pdf-designer.tsx
- src/lib/auth-policy.ts
- src/routes/admin/financial-ledger.tsx
- src/test/admin/financial-ledger-utils.test.ts
- src/test/auth/auth-policy.test.ts
- src/test/auth/middleware/requirePermission.test.ts
✅ Files skipped from review due to trivial changes (5)
- docs/architecture/rbac-and-permissions.md
- src/components/admin/financial-ledger/format.ts
- src/components/admin/financial-ledger/payment-operations-page.tsx
- src/components/admin/financial-ledger/types.ts
- src/components/admin/financial-ledger/csv.ts
🚧 Files skipped from review as they are similar to previous changes (3)
- .husky/pre-commit
- convex/payments/transfers/types.ts
- convex/demo/ampsExecutionModes.ts
👮 Files not reviewed due to content moderation or server errors (6)
- convex/test/moduleMaps.ts
- convex/payments/collectionPlan/execution.ts
- convex/payments/collectionPlan/__tests__/execution.test.ts
- convex/payments/recurringSchedules/activation.ts
- convex/demo/__tests__/ampsExecutionModes.test.ts
- convex/payments/recurringSchedules/__tests__/providerManaged.integration.test.ts
```ts
async function handleValidationUpload(event: ChangeEvent<HTMLInputElement>) {
  const file = event.target.files?.[0];
  if (!file) {
    return;
  }
  const text = await file.text();
  const rows = parseValidationCsv(text);
  setExpectedRows(rows);
  setActiveValidationRows(rows);
}
```
Add error handling for file reading.
file.text() can throw if the file is unreadable or too large, but the error isn't caught. Combined with void at line 1604 swallowing the promise rejection, file read failures will be silent to the user.
🛡️ Proposed fix to handle file read errors
```diff
 async function handleValidationUpload(event: ChangeEvent<HTMLInputElement>) {
   const file = event.target.files?.[0];
   if (!file) {
     return;
   }
-  const text = await file.text();
-  const rows = parseValidationCsv(text);
-  setExpectedRows(rows);
-  setActiveValidationRows(rows);
+  try {
+    const text = await file.text();
+    const rows = parseValidationCsv(text);
+    setExpectedRows(rows);
+    setActiveValidationRows(rows);
+  } catch (error) {
+    toast.error(
+      error instanceof Error ? error.message : "Failed to read file"
+    );
+  }
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/components/admin/financial-ledger/financial-ledger-page.tsx` around lines
795 - 804, handleValidationUpload currently calls file.text() without catching
errors, so unreadable/oversized files will reject silently; wrap the await
file.text() and subsequent parseValidationCsv(...) in a try/catch inside
handleValidationUpload, on error log/report the error and set an appropriate UI
error state or show the existing user-facing error/toast (do not let the
rejection propagate silently), and only call setExpectedRows and
setActiveValidationRows when parsing succeeds; also ensure callers of
handleValidationUpload aren’t swallowing its promise (avoid using void when
invoking it) so any remaining errors are surfaced.
fcc5e3b to f5dc891
## Summary

- adds provider-managed recurring payment schedules with Rotessa ingestion, poller, schema, and cron wiring
- introduces admin financial-ledger and payment-operations tooling
- centralizes auth and RBAC policy helpers, including `admin:access` permission semantics
- includes supporting docs, tests, and the document-engine editor resize cleanup

<!-- This is an auto-generated comment: release notes by coderabbit.ai -->

## Summary by CodeRabbit

* **New Features**
  * Added admin dashboards for Payment Operations and Financial Ledger, providing payment oversight, ledger reconciliation, and operational health monitoring.
  * Introduced support for provider-managed recurring collection schedules with external provider polling and webhook ingestion.
  * Added new admin query surfaces for payment analytics and ledger snapshots.
* **Documentation**
  * Added RBAC and permissions architecture documentation.
  * Added state machines and payment rails design guides.
  * Added admin financial ledger and payment operations screen specifications.

<!-- end of auto-generated comment: release notes by coderabbit.ai -->
