This week in Soil development (that's my working title, by the way) started off hot! After fixing an off-by-one bug in the weighted-averaging kernel that computes my dirt pixels' structural integrity, I got to work adding some dynamism to the world: a day-night cycle in my sky shader.
The Cycles of Life
Soil is a game about the emergent complexities of life, and so the most important question for me to answer is "How do I make an environment that is conducive to interesting simulated life?" I came to the conclusion that the basic units that drive evolutionary niches are cycles. So I began pondering and researching what forces are at work in our world to drive evolution. Our planet is made up of a vast number of different cycles, each varying in period, regularity, complexity, and ubiquity.
The Water Cycle
We learn about the water cycle in elementary school. H2O condenses out of the air and pools on the ground in sizes ranging from puddles to oceans, where organisms rely on its viscosity to gently suspend themselves and their food against the downward tug of gravity (or not so gently, in the case of waterfalls). Water seeps into the ground, where organisms suspended in the solid matrix of soil particles can suck it in and spit it out at their leisure. Eventually, water evaporates out of these mediums back into the air, where it can travel long distances in the turbulent atmosphere before falling back down to earth as rain, and the cycle begins anew.
The Nitrogen Cycle
Nitrogen is everywhere! It's among the most abundant elements in the Milky Way, and our atmosphere is 78% nitrogen. Unfortunately for most life on earth, nearly all of it comes in the form of the useless N2 molecule. While atomic nitrogen is a highly reactive and versatile element, N2 reacts with almost nothing due to the strong triple bond between the two atoms.
However, somewhere in the deep past, a prokaryotic cell stumbled upon a means of producing a molecular machine that, when fed with energy in the form of ATP, would grab an N2 and a pseudopod-ful of free protons and electrons, split the N2, and bond those free nitrogens to hydrogen, producing ammonia, NH3, a more bioavailable molecule.
It was only about 3 billion years later that a species of ape evolved the capability to use this newly available nitrogen to form the brain that named these molecules nitrogenase enzymes.
Nowadays, lots of organisms have nitrogenase in them: Cyanobacteria, which live in the ocean in big clouds of algae as well as in colonies in the roots of cycad plants; Rhizobium bacteria, which live symbiotically in the root nodules of legume plants; and Azotobacter, which just hang out in the dirt all by themselves, to name a few.
To complete the cycle, bacteria, fungi, and plants together chain molecular reactions from ammonia (NH3) through ammonium (NH4+), nitrite (NO2-), and nitrate (NO3-), eventually back to the original form of N2. Thus the cycle begins anew.
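If Soil ever simulates this chain directly, it could be modeled as a simple state machine. A hypothetical Rust sketch — none of these names exist in the game yet, this is just one way the transformations above could be encoded:

```rust
/// Hypothetical model of the nitrogen cycle as a state machine.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum NitrogenForm {
    N2,       // atmospheric dinitrogen, inert
    Ammonia,  // NH3, produced by nitrogenase fixation
    Ammonium, // NH4+
    Nitrite,  // NO2-
    Nitrate,  // NO3-
}

impl NitrogenForm {
    /// One step around the cycle: fixation, then nitrification,
    /// then denitrification back to N2.
    fn next(self) -> NitrogenForm {
        use NitrogenForm::*;
        match self {
            N2 => Ammonia,
            Ammonia => Ammonium,
            Ammonium => Nitrite,
            Nitrite => Nitrate,
            Nitrate => N2,
        }
    }
}
```

Five steps bring any form back to where it started — the cycle begins anew.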
The Day Night Cycle
It seemed a bit premature to be implementing the nitrogen cycle in a simulation that is currently operating at the granularity of dirt pixels. So I asked myself: what is the most basic cycle that I can implement now?
My answer was the light cycle. As the earth spins about its axis and orbits around the sun, the flux of the sun's radiation absorbed by the earth's atmosphere and surface rises and falls. When that flux is high we call it day; when it's low we call it night.
Building the Sky
My current rendering consists of two draw calls to the GPU. Each call covers the whole viewport with a single triangle, then renders the 2D graphics with a fragment shader.
If you're unfamiliar, a fragment shader is a block of code that the GPU executes for each pixel on the screen to determine what color it should be. If you want to learn more, check out The Book of Shaders and Shader Toy to get an idea of what they're capable of.
My first draw call renders the sky, and the second blends the simulated material pixels on top according to the material's opacity: air = 0%, water = 50%, dirt = 100%.
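In Rust terms (names are mine, not the actual shader code), that second pass boils down to a linear interpolation by opacity:

```rust
/// Linearly interpolate between two RGB colors by `t` in [0, 1].
fn mix_rgb(a: [f32; 3], b: [f32; 3], t: f32) -> [f32; 3] {
    [
        a[0] + (b[0] - a[0]) * t,
        a[1] + (b[1] - a[1]) * t,
        a[2] + (b[2] - a[2]) * t,
    ]
}

/// Blend a material pixel over the sky by its opacity:
/// air = 0.0 (sky shows through), water = 0.5, dirt = 1.0.
fn composite(sky: [f32; 3], material: [f32; 3], opacity: f32) -> [f32; 3] {
    mix_rgb(sky, material, opacity)
}
```

At opacity 0 the sky shows through untouched; at 1 the material fully covers it.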
To implement a day-night cycle, I decided to start by graphing the sun's position in the sky as a function of time. After a brief flirtation with projecting the sun's position onto the celestial sphere from the perspective of a 180-degree fisheye lens looking toward the equator from a parameterized latitude coordinate, I settled for the tried-and-true x = cos(t), y = sin(t).
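That parameterization is just a point moving around the unit circle. A CPU-side sketch (my `t` normalization — one full day per unit — is an assumption; the shader only needs an angle):

```rust
use std::f32::consts::TAU;

/// Sun position on the unit circle as a function of time-of-day.
/// `t` in [0, 1) maps one full day to one full revolution:
/// t = 0 is sunrise on the eastern horizon, t = 0.25 is noon overhead.
fn sun_position(t: f32) -> (f32, f32) {
    let angle = t * TAU;
    (angle.cos(), angle.sin())
}
```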
I'm using a signed distance function to mix between yellow and the background white-blue gradient. I also mixed in some continuous noise to the gradient to make clouds. And to hide the sun at night, all pixels below the horizon y value are set to brown.
To separate day from night, I can use the sun's height, sin(t). The range of sin(t) is -1 to 1, so values > 0 are day and values <= 0 are night. I can use that as the mixing factor between light blue and dark blue for the sky color.
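As a sketch (names are illustrative, and clamping rather than hard-stepping is my choice — it gives a smooth dawn/dusk transition instead of an instant flip):

```rust
/// Mixing factor between the night and day sky colors, derived from
/// the sun's height sin(t): 0 at or below the horizon (night),
/// ramping up to 1 as the sun climbs (day).
fn day_factor(sun_height: f32) -> f32 {
    sun_height.clamp(0.0, 1.0)
}
```

Feeding `day_factor` into the same kind of color mix used for compositing gives a sky that fades between dark blue and light blue.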
I also wanted the sky to brighten closer to where the sun rises first, so I tweaked the night-day mixing to be brighter for pixels closer to the sun, using the same distance function that colors the sun yellow but with a different threshold.
In my opinion, the two most beautiful things about the night sky are sunsets and stars. To add the sunset, I needed a value that is high when sin(t) approaches 0, which is of course cos(t)! This is perhaps intuitive: the sun reaches its horizontal maximum and minimum exactly when it crosses the horizon vertically. If we raise cos(t) to an even power, we get a function that spikes right as sin(t) crosses 0. Desmos is a great help at times like these, as is Graphtoy for graphing shader-specific functions. Now I can just mix the sun and the white part of my white-to-blue daytime sky gradient with an orange color based on cos²(t).
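The even-power trick looks like this in Rust (the exponent is a tuning knob I picked for illustration — higher powers make a narrower spike around the horizon crossings):

```rust
/// Sunset/sunrise glow strength: spikes toward 1 as the sun crosses
/// the horizon (sin(t) ≈ 0, so |cos(t)| ≈ 1) and falls off quickly
/// toward midday and midnight. An even power keeps it non-negative.
fn sunset_glow(t: f32) -> f32 {
    t.cos().powi(8)
}
```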
Tackling stars required a bit more creativity. How do you map from an x, y, and time coordinate to a rotating star field of randomly distributed white points of varying sizes? Well, I'll leave that as an exercise for the reader. I'm quite happy with my result!
The sun's position is passed in as a uniform to the shader, along with a brightness and a light direction vector. I will be able to pass the sunlight vector easily to future simulated organisms which might do photosynthesis or absorb heat. This also opens the door to lighting. I would like to explore radiance cascades because it's the hot new thing in 2D global illumination, and bioluminescence is definitely on my todo list. But that is probably a ways off.
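On the Rust side, that uniform block might look something like this — the field names and layout are my guesses, not Soil's actual uniforms. The real code would upload it with `wgpu::Queue::write_buffer`; note that WGSL aligns `vec2<f32>` to 8 bytes, so grouping the two vectors before the scalars avoids implicit padding:

```rust
/// Hypothetical CPU-side mirror of the sky shader's uniform block.
/// `#[repr(C)]` keeps the layout predictable for the GPU upload.
#[repr(C)]
#[derive(Clone, Copy)]
struct SkyUniforms {
    sun_position: [f32; 2], // (cos(t), sin(t))
    light_dir: [f32; 2],    // direction of incoming sunlight
    brightness: f32,        // overall daylight intensity
    time: f32,              // for animating clouds and stars
}
```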
Cycling Through Dependencies
So I did all that on Monday. What have I been doing Tuesday through Friday? Glad you asked. I started playing with how I was going to generate the initial patch of dirt in Rust before kicking off the first compute shader. I quickly came to the conclusion that I would very much like hot reloading, to avoid the cargo run, wait, test, close, change, cargo run cycle.
Google pointed me to Robert Krahn's blog post describing how to use the hot-lib-reloader crate. I followed the instructions, and it crashed. For some reason, wgpu internals would panic when loading the library through this crate. Not knowing much about library linking, I asked Claude Code for help. It pointed out that the issue might be how I was managing my program state, with wgpu objects littered throughout my monolithic state struct hierarchy.
Two days, several questionable decisions, and an uncomfortable confrontation with my misconceptions about the Rust borrow checker later, I had an abstraction layer around wgpu and was only loading my functional logic through hot-lib-reloader. It still crashed. Is this what they call AI-induced psychosis?
Halfway through this process I had an epiphany about what an Entity Component System is for. A few years ago I worked on a game prototype using Bevy, a Rust game engine built entirely around ECS. I found programming with Bevy to be quite delightful, but I didn't quite understand what ECS was doing for me at the time. I've found that encountering the problem a system was designed to solve is the best way to understand the point of the system. Starting with the solution and looking for the problem just doesn't click as well.
I decided to take a peek, and it turns out Bevy's latest version, 0.17, just shipped seamless hot-reloading support!
I strongly considered switching to using the engine. However, I've been truly enjoying the process of building my game without an engine. If I were building a 3D game, or a more traditional sprite based 2D game, perhaps I would switch. But the majority of my game logic will be written in compute shaders, which Bevy doesn't have first class support for yet, and my rendering pipeline is intentionally very different from how Bevy's is designed.
To give it a fair trial, I wrote a bare-bones wgpu Bevy plugin so that I could dispatch my compute and render passes to the GPU directly. It was okay, but it was a lot of work figuring out which engine features I could and couldn't use while keeping Bevy's renderer out of my dependency tree.
I concluded that if I'm gonna be reading documentation, I'd prefer to read about aligning my programs more closely with how GPU designers implemented things, not how a game engine implemented things (though reading Bevy code has been quite educational in its own right). The nail in the coffin was that the compile times were just killer.
Thankfully, Bevy is the most modular piece of code I've ever seen, and you can use bevy_ecs by itself! So now I'm in the process of rewriting my code to use ECS, allowing me to easily pass my GPU state around as Bevy Resources and get rid of the ugly struct hierarchy of my current program, all while getting hot-reloadable code for free in my homebrewed winit, egui, and wgpu engine.
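At its core, a Resource store is just a type-indexed map. Here's a stdlib-only toy sketch of the idea — not Bevy's actual implementation, which layers scheduling, change detection, and borrow checking on top:

```rust
use std::any::{Any, TypeId};
use std::collections::HashMap;

/// A toy version of what an ECS "resource" store boils down to:
/// at most one value per type, looked up by TypeId.
#[derive(Default)]
struct Resources {
    map: HashMap<TypeId, Box<dyn Any>>,
}

impl Resources {
    /// Store a value, replacing any previous value of the same type.
    fn insert<T: 'static>(&mut self, value: T) {
        self.map.insert(TypeId::of::<T>(), Box::new(value));
    }

    /// Fetch a value by its type, if one was inserted.
    fn get<T: 'static>(&self) -> Option<&T> {
        self.map
            .get(&TypeId::of::<T>())
            .and_then(|boxed| boxed.downcast_ref::<T>())
    }
}
```

This is why Resources are such a good fit for GPU state: the device, queue, and pipelines each live in exactly one well-known slot, instead of being threaded through a struct hierarchy by hand.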
Playable Demo
This only works in desktop Chrome and Firefox browsers.
Use the brushes with right click. Left click and scroll are the zoom and pan controls. You can change the brush size, shape, and material. Dirt has a green overlay representing its structural integrity.