I used multiple Claude Code instances to build and test a Laravel package across 3 production codebases

I posted recently on Reddit about building a fluent validation rule builder for Laravel (laravel-fluent-validation). Since then I also released a Rector companion package for automated migration. Instead of the usual pre-release-and-wait cycle, I ran Claude Code on the package repo and on three production Laravel codebases simultaneously and let the Claude instances work together.

The workflow

claude-peers is an MCP server for Claude Code. Each instance running on your machine can discover other instances, see what they're working on, and send messages. They don't share context. Each has its own conversation with full codebase access.

In practice it works like this: the package peer tags a new release. It sends a message to the three codebase peers saying "0.4.5 tagged, fixes the parallel-worker race, please re-verify." Each codebase peer receives the message, pulls the new version, runs the migration, runs their tests, and sends back results. If something breaks, the response includes the exact error, the file, and usually a theory about why. The package peer reads that, asks follow-up questions if needed, fixes the issue, and the loop continues.
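The loop is easy to picture as plain message passing. A toy simulation (Python, with stubbed test runs — none of the names below are the claude-peers API, they're made up to show the shape of the loop):

```python
from queue import Queue

# Toy simulation of the release-and-verify loop described above. This is
# NOT the claude-peers API -- just plain message passing to show the shape.
codebase_peers = {name: Queue() for name in ("app-a", "app-b", "app-c")}
package_inbox = Queue()

# Package peer: broadcast the new tag to every codebase peer.
for inbox in codebase_peers.values():
    inbox.put({"type": "release", "version": "0.4.5",
               "note": "fixes the parallel-worker race, please re-verify"})

# Codebase peers: pull the release, re-run the suite, report back.
def verify(name, msg):
    passed = True  # stand-in for `composer update` + running the tests
    return {"peer": name, "version": msg["version"], "passed": passed}

for name, inbox in codebase_peers.items():
    package_inbox.put(verify(name, inbox.get()))

# Package peer: read the reports and decide whether the loop continues.
reports = [package_inbox.get() for _ in codebase_peers]
print(all(r["passed"] for r in reports), len(reports))
```

In the real workflow each "peer" is a full Claude Code session with its own context, and the stubbed verify() is where it actually pulls the version, runs the migration, and runs the app's test suite.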

One thing I didn't expect was how quickly the peers developed their own review dynamic. They would challenge each other's assumptions, ask for evidence, and sometimes reach consensus before coming back with a recommendation.

I had four terminals open:

  • The package repo, building features, writing tests, shipping releases
  • Three production codebases, each a real Laravel app with its own validation patterns, framework integrations, and test suites

Everything runs locally. Claude Code works on local clones of each codebase, with the same filesystem access you'd have in your terminal. No production servers, no remote environments, no secrets exposed to AI.

The interesting part was what the peers caught that tests and synthetic fixtures couldn't:

  • One app has 108 FormRequests and also uses rules() as a method name on its Actions and Collections, so the migration's skip log grew to 2,988 entries / 777KB. On a smaller codebase you'd never notice.
  • Another app runs 15 parallel Rector workers. The skip log's truncate flag was per-process, so every worker wiped the others' entries. Synthetic fixtures run single-process. This bug doesn't exist there.
  • The same app runs Filament alongside Livewire. Five components use Filament's InteractsWithForms trait which defines its own validate(). Inserting the package's trait would have been a fatal collision on first render.
  • A third app found that 5/7 of its Livewire files had dead #[Validate] attributes coexisting with explicit validate([...]) calls. Nobody anticipated that pattern.
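The parallel-worker bug in the second bullet is a classic shared-file race. A minimal sketch (Python, illustrative only — not Rector's or the package's actual code) of why a per-process truncate flag loses entries while append-mode writes survive:

```python
import os
import tempfile

def write_skips(path, entries, truncate):
    # truncate=True mimics each worker wiping the shared log on startup
    mode = "w" if truncate else "a"
    with open(path, mode) as f:
        for entry in entries:
            f.write(entry + "\n")

path = os.path.join(tempfile.mkdtemp(), "skip.log")

# Two "workers" with per-process truncation: the second wipes the first.
write_skips(path, ["worker1: skipped App\\Foo"], truncate=True)
write_skips(path, ["worker2: skipped App\\Bar"], truncate=True)
print(sum(1 for _ in open(path)))  # only worker2's entry survives

# Append-only writes keep every worker's entries.
os.remove(path)
write_skips(path, ["worker1: skipped App\\Foo"], truncate=False)
write_skips(path, ["worker2: skipped App\\Bar"], truncate=False)
print(sum(1 for _ in open(path)))  # both entries survive
```

With 15 workers the truncating version keeps only whichever process wrote last, which is exactly why the bug never shows up in single-process synthetic fixtures.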

Wrote up the full workflow, what worked, and when I'd use it (link in comments).

u/Rhinnii — 7 days ago
r/laravel · 122 points

Laravel's wildcard validation is O(n²), here's a fix

I was profiling a slow import endpoint: 100 items, 47 fields each, with exclude_unless and required_if rules. The endpoint took 3.4 seconds. I assumed database queries were to blame; validation alone was 3.2s.

When you write 'items.*.name' => 'required|string|max:255', Laravel's explodeWildcardRules() flattens the data with Arr::dot() and matches regex patterns against every key. At 500 items × 7 fields that's 3,500 concrete rules, and the expansion is O(n²). Conditional rules like exclude_unless make it worse because they trigger dependent-rule resolution on every attribute.
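To make the two strategies concrete, here's a language-agnostic sketch in Python: the flatten-then-regex expansion (roughly the shape of what explodeWildcardRules() does, simplified) scans every dotted key for every pattern, while a tree traversal walks directly to the wildcard's children and touches each matching leaf once. Both helper names are made up for illustration:

```python
import re

data = {"items": [{"name": f"n{i}", "qty": i} for i in range(500)]}

# Strategy 1: flatten everything with dot notation, then regex-match
# every wildcard pattern against every key -- O(keys x patterns).
def dot(node, prefix=""):
    out = {}
    if isinstance(node, dict):
        for k, v in node.items():
            out.update(dot(v, f"{prefix}{k}."))
    elif isinstance(node, list):
        for i, v in enumerate(node):
            out.update(dot(v, f"{prefix}{i}."))
    else:
        out[prefix[:-1]] = node
    return out

def expand_regex(patterns, flat):
    expanded = []
    for p in patterns:  # for each wildcard rule...
        rx = re.compile("^" + re.escape(p).replace(r"\*", r"[^.]+") + "$")
        expanded += [k for k in flat if rx.match(k)]  # ...scan every key
    return expanded

# Strategy 2: walk the data tree directly, expanding "*" only where it
# occurs -- each matching leaf is visited exactly once.
def expand_tree(pattern, node, prefix=""):
    head, _, rest = pattern.partition(".")
    if head == "":
        return [prefix[:-1]]
    if head == "*":
        return [k for i, v in enumerate(node)
                for k in expand_tree(rest, v, f"{prefix}{i}.")]
    if head in node:
        return expand_tree(rest, node[head], f"{prefix}{head}.")
    return []

a = expand_regex(["items.*.name"], dot(data))
b = expand_tree("items.*.name", data)
print(len(a), len(b), set(a) == set(b))
```

Same 500 expanded attribute names either way; the difference is that the regex version re-scans the full key list for every wildcard rule, which is where the quadratic blow-up comes from.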

I submitted 10 performance PRs to laravel/framework. Four merged, the six validation ones were all closed. So I built it as a package: laravel-fluent-validation.

Add use HasFluentRules to your FormRequest and keep your existing rules. The wildcard expansion is replaced with an O(n) tree traversal. For 25 common rules it compiles PHP closures (is_string($v) && strlen($v) <= 255 instead of rule parsing + method dispatch + BigNumber). If a value passes, Laravel's validator never sees it; failures fall through to Laravel so you still get the correct error message. It also pre-evaluates exclude_unless/exclude_if before validation starts, so instead of 4,700 rules each checking their conditions, the validator only sees the ~200 that actually apply.

class ImportRequest extends FormRequest
{
    use HasFluentRules;
}
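The compiled-closure fast path can be sketched like this (Python for brevity; the function names are hypothetical, not the package's API). A rule string is parsed once into a plain predicate, and only values that fail the predicate are handed to the full validator, so error messages stay authoritative:

```python
# Hypothetical sketch of a "compile rules once into a cheap predicate"
# fast path. Values that pass never reach the slow validator; failures
# fall through so the real validator produces the error message.
def compile_rule(rule):
    checks = []
    for part in rule.split("|"):
        name, _, arg = part.partition(":")
        if name == "required":
            checks.append(lambda v: v is not None and v != "")
        elif name == "string":
            checks.append(lambda v: isinstance(v, str))
        elif name == "max":
            limit = int(arg)
            checks.append(lambda v, limit=limit: len(v) <= limit)
    # all() short-circuits, so later checks never see values that
    # already failed an earlier one (e.g. len() on None).
    return lambda v: all(check(v) for check in checks)

fast_path = compile_rule("required|string|max:255")

values = ["ok", "x" * 300, None]
needs_full_validation = [v for v in values if not fast_path(v)]
print(len(needs_full_validation))  # only the failures fall through
```

The win is that the common case (valid input) costs one closure call per value instead of rule parsing and method dispatch per rule per attribute.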

Benchmarks (CI, PHP 8.4, OPcache, median of 3 runs):

| Scenario | Laravel | With trait | Speedup |
|---|---|---|---|
| 500 items × 7 simple fields | ~200ms | ~2ms | 97x |
| 500 items × 7 mixed fields (string + date) | ~200ms | ~20ms | 10x |
| 100 items × 47 conditional fields | ~3,200ms | ~83ms | 39x |

The speedup is already noticeable with a handful of wildcard inputs that each carry a few rules. The package works with Livewire and Filament, is Octane-safe, and ships with an extensive test suite.

https://github.com/SanderMuller/laravel-fluent-validation

Performance issue tracked upstream: laravel/framework issue 49375

u/Rhinnii — 9 days ago