New Earth

A planet you can walk on. Biomes derived from latitude and noise. A sun with a real corona. A moon with phases that fall out of geometry. Build and destroy with ψ reconverging live. Three hours after gates passed.

The pipeline

Each frame of New Earth runs four GPU steps and one CPU integrator:

  1. Jacobi sweep (×8 per frame) — relax ψ on the 128³ grid to keep it converged as ρ evolves. ~1 ms total.
  2. Eikonal raymarch — one workgroup per 8×8 pixel tile at 1280×720. For each pixel, march a ray through ψ with d/ds(n·dir) = ∇n. Surface hit = ρ > threshold. Shade with biome albedo and Lambertian sun/moon lighting. ~10 ms.
  3. Blit — upscale to the 1920×1080 swapchain with linear sampling. <1 ms.
  4. Readback ψ (on edits only) — for the CPU player physics.
  5. Player integrator (CPU) — a = (c²/2) ∇ψ, semi-implicit Euler, binary-search collision against ρ.

That's the whole engine, in the order it runs.
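In sketch form, the driver loop looks like this. Everything is stubbed — the real steps 1–3 are GPU dispatches, step 4 a buffer readback, step 5 CPU math — so only the control flow and dispatch order are the point:

```python
# Sketch of the per-frame driver, mirroring the five steps above.
# The kernel bodies are stubs that just record their call order.

log = []

def jacobi_sweep(psi, rho):
    log.append("jacobi"); return psi      # real version: GPU relaxation pass

def raymarch(psi, rho, res):
    log.append("raymarch"); return "image"

def blit(image, res):
    log.append("blit"); return "frame"

def readback_psi(psi):
    log.append("readback"); return psi    # GPU -> CPU copy, edits only

def integrate_player(state, dt):
    log.append("player")                  # CPU physics

def frame(state, dt, edited):
    for _ in range(8):                                         # 1. Jacobi x8
        state["psi"] = jacobi_sweep(state["psi"], state["rho"])
    image = raymarch(state["psi"], state["rho"], (1280, 720))  # 2. eikonal march
    out = blit(image, (1920, 1080))                            # 3. upscale
    if edited:
        state["cpu_psi"] = readback_psi(state["psi"])          # 4. edits only
    integrate_player(state, dt)                                # 5. CPU step
    return out

frame({"psi": 0, "rho": 0}, 1 / 60, edited=True)
```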

The Earth surface

The classical approach is to build a terrain mesh, texture it, light it. DFD doesn't have meshes, so the surface appearance has to come from where the ray hits the ρ field. The hit position tells us latitude and local procedural noise; from those two facts we derive a biome color per pixel.

The full WGSL logic:

fn earth_albedo(hit: vec3<f32>) -> vec3<f32> {
    let r = max(length(hit), 1e-4);
    let lat = hit.y / r;         // −1 south, +1 north
    let abs_lat = abs(lat);

    // Ice caps near the poles
    if (abs_lat > 0.82) {
        let t = clamp((abs_lat - 0.82) / 0.12, 0.0, 1.0);
        return mix(tundra, snow, t);
    }

    // Continent noise from FBM on the unit sphere
    let nrm = hit / r;
    let continent = fbm(nrm * 2.2 + vec3(3.1, 7.7, 1.3));

    if (continent < 0.48) {
        // Ocean, darker toward the middle of large bodies
        let depth = (0.48 - continent) / 0.48;
        return mix(shallow, deep_ocean, depth);
    } else {
        // Land: forest → desert (by latitude) → rocky (by noise)
        let desert_band = smoothstep(0.50, 0.70, abs_lat);
        let base = mix(forest, desert, desert_band);
        let rocky = smoothstep(0.72, 0.90, continent);
        return mix(base, rock, rocky);
    }
}

No textures. No mesh UVs. No terrain generation pass. The planet's appearance is a deterministic function of the ray's hit position, computed per pixel — up to ~0.9 million times per frame at the 1280×720 internal resolution. You could rotate the planet and the continents would stay locked to their latitudes because they're derived from the 3D hit position, not from any texture coordinate.
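The determinism is easy to check in a transliteration of earth_albedo. This Python sketch follows the WGSL above, but the fbm is a cheap sine-hash stand-in (not the shader's FBM) and the palette colors are placeholders:

```python
import math

# Python transliteration of earth_albedo: same branch structure as the
# WGSL, stand-in noise, placeholder palette. Appearance is a pure
# function of the hit position.

TUNDRA, SNOW   = (0.55, 0.55, 0.50), (0.95, 0.95, 0.97)
SHALLOW, DEEP  = (0.10, 0.30, 0.45), (0.02, 0.08, 0.22)
FOREST, DESERT = (0.12, 0.30, 0.10), (0.70, 0.60, 0.35)
ROCK           = (0.40, 0.38, 0.35)

def mix(a, b, t):
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def smoothstep(e0, e1, x):
    t = min(max((x - e0) / (e1 - e0), 0.0), 1.0)
    return t * t * (3.0 - 2.0 * t)

def fbm(p):  # stand-in: 4 octaves of a sine hash, output in [0, 1]
    v, amp = 0.0, 0.5
    for o in range(4):
        s = 2.0 ** o
        v += amp * (0.5 + 0.5 * math.sin(12.9898 * p[0] * s
                                         + 78.233 * p[1] * s
                                         + 37.719 * p[2] * s))
        amp *= 0.5
    return v / 0.9375  # normalize by the geometric amplitude sum

def earth_albedo(hit):
    r = max(math.sqrt(sum(c * c for c in hit)), 1e-4)
    lat = hit[1] / r
    abs_lat = abs(lat)
    if abs_lat > 0.82:                        # ice caps near the poles
        t = min(max((abs_lat - 0.82) / 0.12, 0.0), 1.0)
        return mix(TUNDRA, SNOW, t)
    nrm = tuple(c / r for c in hit)
    continent = fbm(tuple(2.2 * c + o for c, o in zip(nrm, (3.1, 7.7, 1.3))))
    if continent < 0.48:                      # ocean, darker when deeper
        return mix(SHALLOW, DEEP, (0.48 - continent) / 0.48)
    base = mix(FOREST, DESERT, smoothstep(0.50, 0.70, abs_lat))
    return mix(base, ROCK, smoothstep(0.72, 0.90, continent))
```

Call it twice with the same hit position and you get the same color — no texture state anywhere.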

The sun's corona

A real sun has layered atmospheric structure visible from space: a bright photosphere (the "surface"), a red chromosphere at its rim, and a diffuse corona extending several solar radii. A single dot product between the ray direction and the sun direction drives all of it in one shader:

let sun_c = dot(dir, sun_dir);

// Outer corona: wide, soft, pearly
col += pearl * pow(max(sun_c, 0.0), 24.0)  * 0.35 * flicker;
// Mid corona: warmer, tighter
col += warm  * pow(max(sun_c, 0.0), 96.0)  * 0.90 * flicker;
// Chromosphere ring (reddish band just outside the disc)
let chromo = smoothstep(0.9985, 0.9992, sun_c)
           * (1.0 - smoothstep(0.9992, 0.9997, sun_c));
col += red   * chromo * 1.4;
// Photosphere disc
col += yellow * smoothstep(0.9993, 0.9998, sun_c);
// Hot core
col += white  * smoothstep(0.99985, 1.0, sun_c);

The flicker function is low-frequency sin-noise in the ray direction modulated by time, which makes the corona subtly breathe rather than sit statically in the sky. Total cost: two pow() calls and four smoothsteps per miss-pixel, which is a rounding error compared to the 384-step raymarch that got us to the miss.
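The cosine thresholds above translate directly into angular radii (θ = acos(threshold)), which is a quick way to sanity-check the layering — the hot core sits inside the photosphere disc, which sits inside the chromosphere ring:

```python
import math

# Convert the smoothstep cosine thresholds into angular radii.
# cos(theta) = threshold  =>  theta = acos(threshold).

for name, c in [("outer chromosphere edge", 0.9985),
                ("inner chromosphere edge", 0.9992),
                ("photosphere edge",        0.9993),
                ("hot core",                0.99985)]:
    print(f"{name}: {math.degrees(math.acos(c)):.2f} deg")
```

The photosphere edge lands at roughly 2.1° angular radius with the chromosphere band just outside it — a stylized sun, much larger in the sky than the real 0.27°.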

The moon

I initially rendered the moon as a single bright disc. It looked like a second sun. The fix was to treat the moon as a real lit sphere: for rays whose direction passes inside the moon's angular disc, compute the local surface normal on the sphere, shade it with Lambertian lighting from the sun, and add a touch of earthshine.

if (moon_c > moon_inner_cos) {
    // Local coordinate system on the disc
    let tangent = normalize(dir - moon_dir * moon_c);
    let r_local = off_axis / disc_radius_sin;  // 0 center, 1 edge
    let z_local = sqrt(1.0 - r_local * r_local);

    // Surface normal on the spherical moon at this pixel
    let normal = normalize(moon_dir * z_local + tangent * r_local);

    // Lambertian lighting → terminator + phases for free
    let sun_lit = max(dot(normal, sun_dir), 0.0);

    // Mare / highland texture from 3D noise on the surface
    let mare_noise = fbm(normal * 3.2 + vec3(1.7, 4.1, 9.3));
    let mare_mask = smoothstep(0.38, 0.56, mare_noise);
    let albedo = mix(highland, mare_color, mare_mask);

    let earthshine = vec3(0.035, 0.050, 0.060);
    let surface = albedo * (sun_lit * 0.85 + 0.04) + earthshine;
    let limb = mix(0.8, 1.0, z_local);  // subtle limb darkening
    col = mix(col, surface * limb, moon_mask);
}

Moon phases fall out of this with no extra code. If the sun is on the same side of the planet as the moon (dot product positive), the side facing us is in shadow — new moon. If the sun is opposite — full moon. Every angle in between gets the correct terminator because the Lambertian dot(normal, sun_dir) is computed per surface pixel. The mare patterns drift with the moon's rotation just from the geometry.
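The "phases for free" claim can be checked with a toy that keeps only the Lambert term: a unit sphere viewed from +z in plain world space (not the shader's disc frame), measuring what fraction of the visible disc passes the dot(normal, sun_dir) > 0 test:

```python
import math, random

# Toy check that phases fall out of per-pixel Lambert shading.
# The sphere is viewed from +z; visible surface points have normal.z > 0.

def visible_lit_fraction(sun_dir, samples=20000, seed=1):
    rng = random.Random(seed)
    lit = n = 0
    while n < samples:
        x, y = rng.uniform(-1.0, 1.0), rng.uniform(-1.0, 1.0)
        if x * x + y * y > 1.0:
            continue                           # outside the disc
        z = math.sqrt(1.0 - x * x - y * y)     # normal of the visible point
        n += 1
        if x * sun_dir[0] + y * sun_dir[1] + z * sun_dir[2] > 0.0:
            lit += 1
    return lit / n

full = visible_lit_fraction((0.0, 0.0, 1.0))   # sun behind the viewer
new  = visible_lit_fraction((0.0, 0.0, -1.0))  # sun behind the moon
half = visible_lit_fraction((1.0, 0.0, 0.0))   # sun at quadrature
```

Sun behind the viewer lights the whole disc, sun behind the moon lights none of it, and quadrature lights half — every intermediate angle gives the corresponding terminator.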

Orbital mechanics

The sun and moon don't sit still. They orbit with periods scaled down from the real solar system's ratios.

Over one minute of real time, you watch about three full "lunar cycles" while the sun moves a quarter of the way across the sky. The moon passes through every phase, correctly, because the geometry is right.
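Using only the ratios stated above — about three lunar cycles and a quarter of the solar arc per minute — the two directions reduce to clock-driven rotations. The 20 s and 240 s periods below are derived from those ratios, not engine constants:

```python
import math

# Clock-driven sun/moon directions in a shared orbital plane.
# Periods are back-derived from the stated ratios: ~3 lunar cycles
# per minute (20 s) and a quarter solar arc per minute (240 s).

SUN_PERIOD_S  = 240.0
MOON_PERIOD_S = 20.0

def orbit_dir(t, period):
    a = 2.0 * math.pi * t / period
    return (math.cos(a), 0.0, math.sin(a))   # unit vector in the XZ plane

def phase_cos(t):
    # dot(sun_dir, moon_dir): positive near new moon, negative near full
    s = orbit_dir(t, SUN_PERIOD_S)
    m = orbit_dir(t, MOON_PERIOD_S)
    return s[0] * m[0] + s[1] * m[1] + s[2] * m[2]
```

At t = 0 the two bodies are aligned (new moon); ten seconds later the moon has swung to the far side while the sun has barely moved, giving a nearly full moon.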

Space is space

An early version rendered blue sky even from outside the atmosphere. The fix was to compute, per ray, the closest approach of that ray to the planet center, and mix between atmosphere and space based on whether the closest approach is inside the atmosphere shell:

let t_star = max(-dot(ray_origin, dir), 0.0);
let closest = ray_origin + dir * t_star;
let r_star = length(closest);
let atmos_factor = 1.0 − smoothstep(R_planet, 1.25 * R_planet, r_star);
// mix(space_with_stars, atmosphere_blue, atmos_factor)

This is physically what scattering does: rays that skim the atmosphere pick up blue, rays that stay in vacuum don't. Stars are a procedural hash on quantized ray directions — cheap, stable, starfield-looking. Flying out from the Earth now gives you a smooth transition from blue sky to black space with a blue halo around the planet's limb, which is what astronauts actually see.
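The same test, wrapped as a self-contained function (planet at the origin, unit surface radius, 1.25 R shell — the constants from the snippet above):

```python
import math

# Closest-approach atmosphere factor: 1 for rays that dive into the
# shell, 0 for rays that stay in vacuum, smooth in between.

def smoothstep(e0, e1, x):
    t = min(max((x - e0) / (e1 - e0), 0.0), 1.0)
    return t * t * (3.0 - 2.0 * t)

def atmos_factor(ray_origin, dir, R=1.0):
    # t* minimizes |origin + t*dir|; clamp so we only look forward
    t_star = max(-sum(o * d for o, d in zip(ray_origin, dir)), 0.0)
    closest = tuple(o + d * t_star for o, d in zip(ray_origin, dir))
    r_star = math.sqrt(sum(c * c for c in closest))
    return 1.0 - smoothstep(R, 1.25 * R, r_star)
```

From three radii out, a ray aimed at the planet center returns 1 (full atmosphere color) and a ray aimed away returns 0 (stars); rays grazing the limb land in between, which is the blue halo.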

Build and destroy

Left click digs. Right click builds. Both operations modify ρ in a sphere around the aim point, re-upload the ρ buffer to the GPU (4 MB at 128³, done in microseconds), and run 400 extra Jacobi sweeps to let ψ reconverge to the new mass distribution — on top of the 8 baseline sweeps every frame that keep the field warm.

What makes this cheap and stable is that ψ's field equation is convex (paper Theorem III.7). Relaxing toward the minimum is monotonic — you can't overshoot, you can't blow up. Add mass, the field reconverges. Subtract mass, the field reconverges. No constraint solver, no stabilization hacks, no warmstarting.
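A 1D stand-in shows the behavior: relax ψ″ = ρ with Jacobi, inject mass, relax again. The residual shrinks monotonically through the 400-sweep burst — no overshoot, no blow-up. Grid size and Dirichlet boundaries here are illustration choices, not the engine's:

```python
import math

# 1D toy of the edit loop: converge psi'' = rho, add mass, reconverge.
# Convexity of the underlying minimization shows up as a monotonically
# decreasing residual.

def jacobi_sweep(psi, rho, h=1.0):
    out = psi[:]
    for i in range(1, len(psi) - 1):            # ends pinned at 0
        out[i] = 0.5 * (psi[i - 1] + psi[i + 1] - h * h * rho[i])
    return out

def residual(psi, rho, h=1.0):                  # 2-norm of psi'' - rho
    s = 0.0
    for i in range(1, len(psi) - 1):
        s += (psi[i - 1] - 2.0 * psi[i] + psi[i + 1] - h * h * rho[i]) ** 2
    return math.sqrt(s)

n = 16
psi, rho = [0.0] * n, [0.0] * n
rho[5] = 1.0                                    # initial mass blob
for _ in range(400):
    psi = jacobi_sweep(psi, rho)

rho[10] = 1.0                                   # "build": add more mass
history = []
for _ in range(400):                            # the 400-sweep edit burst
    psi = jacobi_sweep(psi, rho)
    history.append(residual(psi, rho))
# history decreases monotonically toward zero
```

Subtracting mass instead of adding it gives the same curve — relaxation toward a convex minimum doesn't care about the sign of the edit.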

What makes it cool is that the hyperbolic dynamic form of the field equation (paper Theorem III.6) propagates changes at exactly c. When you dig a crater, the gravitational effect of the missing mass ripples outward at the speed of light — meaning a distant observer sees orbits near the crater shift a fraction of a second after the edit. Causality is built into the PDE itself. In the engine, it shows up as a smooth visual ripple. Physically correct and aesthetically interesting in the same frame.
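The finite propagation speed is visible even in a 1D toy of the hyperbolic form, ψ_tt = c² ψ_xx. With c·dt = dx, the leapfrog stencil carries information exactly one cell per step — exactly c — so a kick at the center can never outrun the light cone (toy scheme for illustration, not the engine's integrator):

```python
# 1D leapfrog for psi_tt = c^2 psi_xx. With CFL number c*dt/dx = 1 the
# numerical signal speed equals c, so after `steps` steps the field's
# support is exactly [center - steps, center + steps].

n, c, dx = 201, 1.0, 1.0
dt = dx / c                              # c*dt/dx = 1
prev = [0.0] * n
curr = [0.0] * n
curr[n // 2] = 1.0                       # the "edit": kick the field

steps = 80
k2 = (c * dt / dx) ** 2                  # = 1 here
for _ in range(steps):
    nxt = [0.0] * n                      # endpoints stay pinned at 0
    for i in range(1, n - 1):
        lap = curr[i - 1] - 2.0 * curr[i] + curr[i + 1]
        nxt[i] = 2.0 * curr[i] - prev[i] + k2 * lap
    prev, curr = curr, nxt
# nothing beyond c*t = 80 cells from the center has moved
```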

The player

The player is a test particle:

a = (c²/2) · ∇ψ(player_position)
vel += a * dt
pos += vel * dt

Newton's first law — coasting when no field is present — is automatic. Newton's second law is the code you just saw. Collision with the planet is a binary search along the path for the ρ threshold crossing. When grounded, the normal component of gravity is canceled by the ground reaction force, which is modeled directly by subtracting it from the net acceleration. No bouncing, no clipping, no rigid-body solver.

Friction when walking is a tangential velocity decay applied per-frame. Jumping is an instantaneous impulse along the local "up" (which is −∇ψ normalized — it points away from mass, not necessarily along the world Y axis; if you walked to the equator you would still jump "up" relative to the planet, because the field says so).
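The same split — decay the tangential part, impulse along the field's up — in sketch form. The friction constant and jump speed are made-up illustration values:

```python
import math

# Grounded movement: local "up" is -grad(psi) normalized, friction is a
# per-frame tangential decay, a jump is an impulse along up.

FRICTION = 0.85        # tangential velocity kept per frame (assumption)
JUMP_SPEED = 5.0       # jump impulse magnitude (assumption)

def local_up(grad):
    g = math.sqrt(sum(c * c for c in grad)) or 1.0
    return tuple(-c / g for c in grad)   # away from mass, not world Y

def grounded_update(vel, grad, jump):
    up = local_up(grad)
    vn = sum(v * u for v, u in zip(vel, up))            # normal component
    tangent = tuple(v - vn * u for v, u in zip(vel, up))
    tangent = tuple(FRICTION * t for t in tangent)      # walking friction
    vel = tuple(t + vn * u for t, u in zip(tangent, up))
    if jump:
        vel = tuple(v + JUMP_SPEED * u for v, u in zip(vel, up))
    return vel
```

At an equator point the gradient points horizontally toward the planet center, so the jump impulse is horizontal in world space — "up" relative to the planet, exactly as the text describes.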

What runs at 60 fps

At 1280×720 internal (upscaled to 1920×1080 on display), the per-frame cost is the pipeline above: ~1 ms of Jacobi sweeps, ~10 ms of raymarching, and <1 ms of blit — roughly 12 ms, inside the 16.6 ms budget.

All of it on a single MacBook Pro M3. No GPU rasterization in the traditional sense — the entire frame is compute and a single blit. No mesh, no BVH, no physics engine, no lightmap.

What comes next

The engine is doing things that aren't easy to do in Unreal or Unity, but the world is still 40 meters wide. The next post is about scaling: how a following simulation bubble with an analytic far-field can take this same engine to an Earth-size planet, or a solar system, at essentially the same GPU cost per frame.

The physics is finally on the same side of the table as the engine. From here, it's all about what we build in that space.