A small blue-green planet covered in dense procedural grass floats against black space, with a lone astronaut in an orange suit standing on the curved surface surrounded by swaying blades and scattered roses.
An astronaut lands on a planet that looks like Earth but doesn't act like one. The ground responds to every step: grass parts and traces linger; flowers bloom where they shouldn't. From orbit it reads as a fuzzy blue marble. Get close and you're wading through an endless procedural meadow rendered entirely in real time, with over a million blades of grass swaying under wind computed on the GPU.
False Earth is the second chapter in a quiet ongoing narrative by Ming Jyun Hung (MJ), a creative technologist based in Tokyo. The first, Drift, followed the same astronaut lost in space, surrounded by a million GPGPU particles standing in for fragmented thought. False Earth picks up where Drift left off: the astronaut arrives somewhere, but arrival doesn't mean answers.
The rendering pipeline is worth studying. Compute shaders handle blade positioning, Voronoi-based clumping, wind simulation, terrain sampling, and character interaction. A stable, tile-free placement pattern is generated via PCG hashing combined with CPU grid indexing, and LOD adjusts per-blade segment counts by distance. Terrain uses FBM-based heightmaps with normals sampled in compute. Roses are driven by vertex animation textures, with their own LOD and compute-managed spawn cycles. Post-processing runs through a TSL pipeline: Bloom, Depth of Field, and SMAA, with a PerformanceMonitor handling adaptive DPR (device pixel ratio).
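To make the hashing and heightmap ideas concrete, here is a minimal CPU-side sketch of the two core tricks: a PCG hash that turns grid-cell indices into stable jittered blade positions (same cell, same blade, no tiling seam), and an FBM heightmap built from value noise over the same hash lattice. This is an illustration under assumed details, not the project's actual code; in False Earth these run on the GPU via TSL compute, and the function names here (`pcgHash`, `bladePosition`, `fbmHeight`) are hypothetical.

```javascript
// 32-bit PCG hash: a fast, well-distributed integer hash commonly used
// for GPU procedural placement. Deterministic: same input, same output.
function pcgHash(n) {
  let state = (Math.imul(n >>> 0, 747796405) + 2891336453) >>> 0;
  const word =
    Math.imul((state >>> ((state >>> 28) + 4)) ^ state, 277803737) >>> 0;
  return ((word >>> 22) ^ word) >>> 0;
}

// Hash a 2D grid cell (plus a seed channel) into [0, 1).
function hash2(ix, iz, seed = 0) {
  return pcgHash(pcgHash(ix + seed) + iz) / 4294967296;
}

// Stable jittered blade position inside grid cell (ix, iz) of size `cell`.
// Because the jitter depends only on the cell indices, blades never pop
// or swim as the CPU grid index window moves with the character.
function bladePosition(ix, iz, cell = 1.0) {
  return {
    x: (ix + hash2(ix, iz, 1)) * cell,
    z: (iz + hash2(ix, iz, 2)) * cell,
  };
}

// Smooth value noise over the hashed lattice (smoothstep-faded bilerp).
function valueNoise(x, z) {
  const ix = Math.floor(x), iz = Math.floor(z);
  const fx = x - ix, fz = z - iz;
  const sx = fx * fx * (3 - 2 * fx);
  const sz = fz * fz * (3 - 2 * fz);
  const lerp = (a, b, t) => a + (b - a) * t;
  const n00 = hash2(ix, iz), n10 = hash2(ix + 1, iz);
  const n01 = hash2(ix, iz + 1), n11 = hash2(ix + 1, iz + 1);
  return lerp(lerp(n00, n10, sx), lerp(n01, n11, sx), sz);
}

// FBM heightmap: sum octaves of value noise, halving amplitude and
// doubling frequency each octave. Result stays in [0, 1).
function fbmHeight(x, z, octaves = 4) {
  let height = 0, amplitude = 0.5, frequency = 1.0;
  for (let o = 0; o < octaves; o++) {
    height += amplitude * valueNoise(x * frequency, z * frequency);
    amplitude *= 0.5;
    frequency *= 2.0;
  }
  return height;
}
```

The same pattern ports almost line-for-line to a TSL or WGSL compute kernel, where each invocation hashes its own grid index to place one blade and samples `fbmHeight` to pin it to the terrain.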
Explore the live demo and try the different camera modes (follow, first-person, detached).
- Live Demo: https://false-earth.mingjyunhung.com
- Source Code: https://github.com/momentchan/false-earth
- Author: Ming Jyun Hung (X, LinkedIn, GitHub)