The pivot
How losing a 50,000-line engine became the best thing that could have happened.
I spent nine months building an MMO engine in Rust. Signed-distance-field terrain, dual contouring, hierarchical clipmaps, character customization, procedural audio, network interest management, all of it covered by a 282-test suite. It ran at 35 fps with dynamic terrain editing. It was real.
On a Friday, I was cleaning up a directory. I ran a command that was supposed to clear out a subdirectory. It cleared more than I intended. The Rust source was not in git — that repository was for the marketing website that happened to live in the same folder — and my Time Machine backup had stopped running weeks earlier without me noticing.
The compiled binary survived in target/release/ractr-engine. That binary still runs. I could launch it and walk around, chop terrain, customize characters. But I couldn't change it. Nine months of code, frozen.
The recovery attempt
I spent several hours mining Claude Code session logs — every time an AI assistant had read a file over the previous year, the contents were sitting in a JSONL transcript on disk. I wrote a script that walked every session, reconstructed the most recent version of each file it had seen, and dumped them to a _recovered/ directory.
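The heart of that script is just "keep the newest version of each path." A minimal Rust sketch of that logic, with the JSONL parsing elided — the `(timestamp, path, contents)` record shape here is an assumption, not the actual transcript format:

```rust
use std::collections::HashMap;

/// Given (session_timestamp, file_path, contents) records pulled out of
/// session transcripts, keep only the most recent contents per path.
fn latest_versions(records: &[(u64, &str, &str)]) -> HashMap<String, String> {
    let mut newest: HashMap<String, (u64, String)> = HashMap::new();
    for &(ts, path, contents) in records {
        match newest.get(path) {
            // Only overwrite if this record is strictly newer.
            Some(&(seen_ts, _)) if seen_ts >= ts => {}
            _ => {
                newest.insert(path.to_string(), (ts, contents.to_string()));
            }
        }
    }
    newest
        .into_iter()
        .map(|(path, (_, contents))| (path, contents))
        .collect()
}

fn main() {
    let records = [
        (100, "src/main.rs", "fn main() {} // v1"),
        (200, "src/main.rs", "fn main() {} // v2"),
        (150, "src/lib.rs", "pub mod world;"),
    ];
    let files = latest_versions(&records);
    assert_eq!(files["src/main.rs"], "fn main() {} // v2");
    assert_eq!(files["src/lib.rs"], "pub mod world;");
    println!("recovered {} files", files.len());
}
```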
I got 113 files back. About 25–30% of the engine. Most of the biggest files came back intact — the 1,100-line world generator, the 956-line mob AI, the character creation system. But core subsystems were still missing: most of the audio pipeline, the SDF primitives, half the network code, and — tellingly — none of the Cargo.toml files, which meant the recovered code couldn't compile even as a starting point for rebuilding.
I sat with it for about an hour. Then I decided not to rebuild.
Why not rebuild
The recovered code was a perfect cross-section of what an SDF-based MMO engine looks like in 2026. And staring at 25% of it, I could see something I hadn't seen clearly while I was writing the other 75%: the engine was fighting itself.
- The terrain renderer had its own voxel grid at three resolutions.
- The physics engine had its own collision mesh.
- The lighting pipeline had its own probe volumes.
- The character customization had its own texture atlas.
- The network code had its own interest manager.
Each of these was doing something similar — maintaining a spatial acceleration structure over "the world" — but each had its own representation, its own update cadence, its own failure modes. Every edit to the terrain required updating all of them. Every frame had to keep them in sync. The 6-second stall when I built a dirt mound was exactly this: five separate data structures being rebuilt serially.
That's what classical game engines are. Decades of accreted representations, each good at one thing, glued together with synchronization code. Unreal, Unity, Godot — they're all variations on this pattern. The engine I lost was a small, personal version of the same architecture.
The paper on my desk
I have another thing I work on, which is a physics paper. The paper is called Density Field Dynamics: A Complete Unified Theory. It postulates that the universe can be described by a single scalar field ψ(x, t) on flat three-dimensional space, and that gravity, optics, and the passage of time all emerge from that one field. It's 208 pages. It passes every experimental test of gravity that has been performed to date — solar-system, binary pulsar, LIGO, EHT. It derives the fine-structure constant 1/137 from topology. It has zero free parameters once you measure the Hubble constant.
I'd been holding DFD and the game engine in separate mental compartments. The physics paper was abstract. The game engine was concrete. On that Friday, with the broken binary sitting in target/release/ and a partial recovery sitting in _recovered/, those compartments dissolved.
If nature runs the universe on one scalar field — and passes every solar system test, every galactic rotation curve, every gravitational wave observation while doing so — then the most efficient possible representation of a game world is that same field. Not an approximation of it. Not a specialized chunk of it. The actual thing.
Every pixel on my screen, every chunk of terrain, every projectile, every character would be a query against one 3D scalar field. Light curves through ψ by Fermat's principle. Matter falls through ψ by Hamilton's principle. Both are Euler–Lagrange equations of the same action, the action that makes ψ exist in the first place. The consistency that classical engines spend massive engineering effort to approximate would be automatic.
The new engine
I started from scratch, in a new dfd-engine/ crate. New Cargo.toml. wgpu 23. Rust 2024. No physics engine. No mesh system. No lighting probes. Three GPU buffers: ρ (matter density), ψ (the scalar field), ψ_next (for ping-pong). Four GPU kernels: Jacobi sweep for the field equation, eikonal raymarch for the renderer, trilinear sampling helper, particle integrator.
That's the whole engine.
The field equation is:
∇·[ μ(|∇ψ|/a★) ∇ψ ] = −(8πG/c²) (ρ − ρ̄)
It's a quasilinear elliptic PDE. I solve it with Jacobi relaxation on the GPU. With the μ-function set to μ(x) = x/(1+x) (the unique form derived from S³ Chern–Simons topology in the paper; not a fit), the field equation is convex, which means the relaxation provably converges: no failure modes, no constraint explosions, no tuning. Editing the world means writing to ρ and letting ψ reconverge over a handful of frames.
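To make the sweep concrete, here is a CPU sketch of the Jacobi update in the deep-field limit μ → 1, where the equation reduces to a Poisson problem; this is a simplification of the real kernel, which evaluates μ(|∇ψ|/a★) at cell faces, runs in WGSL, and works in 3D at 128³ rather than a 2D grid:

```rust
const N: usize = 32;  // small CPU grid (the GPU kernel uses 128^3)
const H: f64 = 1.0;   // cell size

/// One Jacobi sweep for the linearized field equation ∇²ψ = s
/// (the μ → 1 limit): each interior cell becomes the average of its
/// four neighbours minus h²·s/4. Boundaries stay pinned at 0.
fn jacobi_sweep(psi: &[f64], src: &[f64], psi_next: &mut [f64]) {
    for j in 1..N - 1 {
        for i in 1..N - 1 {
            let idx = j * N + i;
            psi_next[idx] = 0.25
                * (psi[idx - 1] + psi[idx + 1] + psi[idx - N] + psi[idx + N]
                    - H * H * src[idx]);
        }
    }
}

/// Max |∇²ψ − s| over interior cells: how far we are from solved.
fn residual(psi: &[f64], src: &[f64]) -> f64 {
    let mut worst: f64 = 0.0;
    for j in 1..N - 1 {
        for i in 1..N - 1 {
            let idx = j * N + i;
            let lap = (psi[idx - 1] + psi[idx + 1] + psi[idx - N] + psi[idx + N]
                - 4.0 * psi[idx]) / (H * H);
            worst = worst.max((lap - src[idx]).abs());
        }
    }
    worst
}

fn main() {
    let mut psi = vec![0.0; N * N];
    let mut psi_next = vec![0.0; N * N];
    let mut src = vec![0.0; N * N];
    src[(N / 2) * N + N / 2] = -1.0; // a point mass: ρ − ρ̄ in one cell

    let before = residual(&psi, &src);
    for _ in 0..2000 {
        jacobi_sweep(&psi, &src, &mut psi_next);
        std::mem::swap(&mut psi, &mut psi_next); // ping-pong, like ψ / ψ_next on the GPU
    }
    let after = residual(&psi, &src);
    assert!(after < before * 1e-2); // residual has dropped by orders of magnitude
    println!("residual: {before:.3} -> {after:.3e}");
}
```

The swap at the end of each sweep is the same ψ / ψ_next ping-pong the engine does with its two GPU buffers.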
The rendering is one compute shader. For each pixel, march a ray through the field. At each step, sample ψ and ∇ψ; update the ray direction via d/ds(n·dir) = ∇n where n = e^ψ; advance. Surface hit when ρ exceeds a threshold. Shade with a local gradient normal. Miss = sky. That's the entire renderer. Light bends around mass because the eikonal equation curves it, not because of a postprocess effect.
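A CPU sketch of that per-pixel loop, assuming analytic sample functions in place of the real buffer reads (the actual renderer is a WGSL compute shader sampling the ψ and ρ buffers with the trilinear helper):

```rust
#[derive(Clone, Copy)]
struct Vec3 { x: f64, y: f64, z: f64 }

impl Vec3 {
    fn scale(self, s: f64) -> Vec3 { Vec3 { x: self.x * s, y: self.y * s, z: self.z * s } }
    fn add(self, o: Vec3) -> Vec3 { Vec3 { x: self.x + o.x, y: self.y + o.y, z: self.z + o.z } }
    fn norm(self) -> f64 { (self.x * self.x + self.y * self.y + self.z * self.z).sqrt() }
    fn unit(self) -> Vec3 { self.scale(1.0 / self.norm()) }
}

/// March one ray through the field. `psi`/`grad_psi` sample the scalar
/// field, `rho` samples matter density. Returns Some(hit position) if
/// the ray enters matter, None if it escapes (sky).
fn march(
    mut pos: Vec3,
    mut dir: Vec3,
    psi: impl Fn(Vec3) -> f64,
    grad_psi: impl Fn(Vec3) -> Vec3,
    rho: impl Fn(Vec3) -> f64,
) -> Option<Vec3> {
    let ds = 0.1;
    for _ in 0..1000 {
        // Eikonal update d/ds(n·dir) = ∇n with n = e^ψ, so ∇n = n∇ψ.
        let n = psi(pos).exp();
        let grad_n = grad_psi(pos).scale(n);
        let v = dir.scale(n).add(grad_n.scale(ds));
        dir = v.unit();
        pos = pos.add(dir.scale(ds));
        if rho(pos) > 0.5 {
            return Some(pos); // surface hit: shade with local gradient normal
        }
    }
    None // miss = sky
}

fn main() {
    // Flat field, no matter: the ray must fly straight and hit nothing.
    let zero_psi = |_: Vec3| 0.0;
    let zero_grad = |_: Vec3| Vec3 { x: 0.0, y: 0.0, z: 0.0 };
    let origin = Vec3 { x: 0.0, y: 0.0, z: 0.0 };
    let forward = Vec3 { x: 0.0, y: 0.0, z: 1.0 };
    assert!(march(origin, forward, zero_psi, zero_grad, |_| 0.0).is_none());

    // A slab of matter at z > 5, hit head-on at the expected depth.
    let slab = |p: Vec3| if p.z > 5.0 { 1.0 } else { 0.0 };
    let hit = march(origin, forward, zero_psi, zero_grad, slab).expect("slab hit");
    assert!((hit.z - 5.0).abs() < 0.2);
}
```

With a nonzero ∇ψ the same loop bends the ray, which is exactly where the gravitational lensing in the engine comes from.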
The physics is one line: vel += (c²/2)∇ψ · dt; pos += vel · dt. Newton's first law (coasting at constant velocity when no field is present) and second law (F=ma toward mass) both fall out because both are the same action principle evaluated at different scales. Inertia isn't a separate system. You can't forget to implement it.
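The same one-liner as a runnable Rust sketch (semi-implicit Euler; the engine-unit value of c² and the constant-gradient test field are stand-ins I chose for illustration):

```rust
const C2: f64 = 1.0; // c² in engine units (stand-in value)

/// One particle step: matter falls down the gradient of ψ.
/// With ∇ψ = 0 this is Newton's first law (velocity unchanged);
/// with constant ∇ψ it is uniform acceleration a = (c²/2)∇ψ.
fn step(pos: &mut [f64; 3], vel: &mut [f64; 3], grad_psi: [f64; 3], dt: f64) {
    for k in 0..3 {
        vel[k] += 0.5 * C2 * grad_psi[k] * dt; // vel += (c²/2)∇ψ·dt
        pos[k] += vel[k] * dt;                 // pos += vel·dt
    }
}

fn main() {
    // No field: coasting at constant velocity (first law).
    let (mut pos, mut vel) = ([0.0; 3], [1.0, 0.0, 0.0]);
    for _ in 0..100 { step(&mut pos, &mut vel, [0.0; 3], 0.01); }
    assert_eq!(vel, [1.0, 0.0, 0.0]);
    assert!((pos[0] - 1.0).abs() < 1e-9);

    // Constant downward gradient: uniform acceleration (second law).
    let (mut pos, mut vel) = ([0.0; 3], [0.0; 3]);
    let g = [0.0, -2.0 * 9.8 / C2, 0.0]; // ∇ψ chosen so (c²/2)∇ψ = −9.8 ŷ
    for _ in 0..100 { step(&mut pos, &mut vel, g, 0.01); }
    assert!((vel[1] + 9.8).abs() < 1e-9); // v = a·t = −9.8 after t = 1
}
```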
What I got back
Four days after the loss:
- The GPU Jacobi solver converges a 128³ nonlinear elliptic PDE to <0.1% residual in about 160 milliseconds.
- The engine reproduces four independent predictions of general relativity — far-field ψ decay, gravitational light deflection, Kepler's third law, gravitational time dilation — with errors of 1.7%, 7.4%, 3.3%, and 0.01% respectively.
- There's a playable planet. Procedural Earth-like biomes, solar corona, phased moon, build/destroy with live field reconvergence, first-person player who falls at 9.8 m/s² and walks on the surface.
- The entire engine is about 1,800 lines of Rust plus 500 lines of WGSL. The old engine was roughly 50,000 lines of Rust. The new one is more capable than the old one along almost every dimension that matters: it does real gravitational optics, world edits are nearly free, and it has no synchronization bugs because there's nothing to synchronize.
The lesson
A classical game engine is a cathedral. Each subsystem is a finely-worked stone, and the whole thing stands because of the pressure of every other stone. If you lose some of the stones, the cathedral is incomplete; you either rebuild the missing ones or the whole thing falls down.
A physics-native engine is a single principle, instantiated on a GPU. You can't lose "some of it." Either the principle is right and the instantiation is correct, or it isn't. And when it is, everything — rendering, physics, time, inertia, optics — falls out of the same data. There's nothing to lose, because there's nothing separate to keep.
Losing the old engine forced me to notice that the cathedral was the wrong architecture for the problem. I couldn't have gotten here by refactoring. Fighting classical complexity is so much steady-state work that you never have time to question whether the complexity was necessary.
It was never necessary. The universe proves that every day — it runs on one field and produces everything we see.
The new engine is the first time I've felt like the tool and the problem are made of the same stuff. Next post is about why that matters.