Welcome back to Jaakan’s not-so-weekly update! Last time @bigsylvain taught us how to make sprinklesets, which allow us to give the impression that parts of the terrain are covered with rocks, or grass, for example.
This week was very productive for Jaakan: we’ve gotten a lot done, and there’s no chance we can cover it all in a single article. Let’s concentrate on the most polished aspect we’ve implemented: the lighting system.
Let’s go first-person for this, as it’s easier to write this way.
@fasterthanlime We’ve wanted to add shadows to Jaakan for a while, and even in the earliest version (the Ludum Dare one), we had some, faked through alpha-blended black (for shadows) and white (for light sources) textures:
@fasterthanlime Although it looks convincing enough in a screenshot, in action it’s very static. It gave a kind of “painting” feel to the various scenes, which worked well until things started scrolling. As everything in the Jaakan world becomes livelier, and since we’re on track for a very atmospheric game (cf. the old soundtrack), it only seemed logical to try to improve the visual side as well.
@bigsylvain We realized this week we’ve been making games for nearly 15 years! Our first game-making “club”, Luxors, was founded in 2001, back when you were eleven.
@fasterthanlime True! We were mostly using The Games Factory back then. I’ve come a long way since; my short time as project manager of Xith3D really helped me get into 3D stuff. Everybody was trying to figure out the maths on euclideanspace back then.
@bigsylvain And it took me a few years to try out UV mapping and skeletal animations, but now it feels like second nature!
@fasterthanlime Anyway, since then I’ve done a few Ludum Dares, with and without @bigsylvain. One of them was John Q. Adaman, which I kept developing for a bit, and that was, I think, the first time I used a shader for real-time lighting:
@bigsylvain That’s a vignette effect, isn’t it?
@fasterthanlime Correct. Although it does modulate the scene, it’s very fixed too; it could just as well have been a big texture. The technique described in the article I linked above is much more sophisticated, and involves multiple render passes. Let’s take a simple scene, without lighting, that looks like this:
@fasterthanlime What we need first is a texture of all the things that are going to block light. That’s called the occluders map, and it looks like this:
@fasterthanlime In this demo, all occluders happen to be black, but that’s just a coincidence. They could be any color at all; the important part is that transparent pixels (under 0.75 alpha) let light through, while the rest block the rays of light.
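To make that rule concrete, here’s a tiny Python sketch of the occluder test (the RGBA-floats-in-0..1 pixel representation is an assumption for the example; the 0.75 threshold is the one mentioned above):

```python
# Sketch of the occluder rule: pixels under 0.75 alpha let light through,
# anything at or above that blocks it. The pixel representation (an RGBA
# tuple of floats in 0..1) is an assumption for illustration.
ALPHA_THRESHOLD = 0.75

def blocks_light(pixel):
    """pixel is an (r, g, b, a) tuple; only the alpha channel matters."""
    r, g, b, a = pixel
    return a >= ALPHA_THRESHOLD
```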
The next step is a little trickier. In the case of spotlights, again described in the aforementioned article, we want to cast a few hundred rays of light in every direction, and find out how deep they run before they hit an occluder.
To do that, we convert polar coordinates (angle + radius) to cartesian coordinates (x, y, aka rectangular coordinates) and sample the occluders texture. Once we hit an occluder, we store the distance from the center of the spotlight to the point where the light is blocked in a 1-dimensional texture called the shadow map:
(The shadowmap is shown here on an orange background so it’s easier to visualize).
@fasterthanlime The horizontal dimension is the angle, ranging from 0 (left) to 2 PI (right). The whiter it is, the further the light rays go. Conversely, darker parts mean the light was occluded close to the light source.
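For readers who prefer code to prose, here’s a rough CPU-side Python sketch of that pass. The real thing runs in a fragment shader; the `occluder` callback, sizes, and names are illustrative, not our actual code:

```python
import math

# CPU-side sketch of the shadow-map pass. On the GPU this runs in a
# fragment shader; here `occluder(x, y) -> bool` says whether that pixel
# blocks light. Sizes and names are made up for the example.
def build_shadow_map(occluder, center, radius, width=256):
    """For each angle column, store the normalized distance (0..1) at
    which light is first blocked, or 1.0 if the ray is never blocked."""
    cx, cy = center
    shadow_map = []
    for col in range(width):
        # horizontal axis of the 1-D texture = angle, from 0 to 2 PI
        angle = col / width * 2.0 * math.pi
        hit = 1.0  # whiter = the light rays go further
        for step in range(radius):
            # polar (angle, step) -> cartesian (x, y), sample occluders
            x = cx + math.cos(angle) * step
            y = cy + math.sin(angle) * step
            if occluder(int(x), int(y)):
                hit = step / radius  # distance to the center, normalized
                break
        shadow_map.append(hit)
    return shadow_map
```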
The shadowmap computation happens on the GPU, and GPUs are massively parallel computing beasts, so it’s reasonably fast. But this technique requires a lot of what we call “dependent texture reads”, which stall the graphics pipeline and slow us down. Not enough to make this impractical for real-time use, but enough to make us careful about the number of light sources and their size.
Once we have the shadow map, we can compute the light map, which goes like this: for every point in the area of influence of the spotlight (e.g. 512x512 for a spotlight with a radius of 256), find the angle of our light vector, sample (i.e. read) our shadow map at the right place, then compare the distance to the center with the value encoded in the shadow map. If it’s greater, we’re in shadow; otherwise, the point is lit.
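Again as a hedged sketch rather than our actual shader, that lookup could go like this in Python (a plain list stands in for the 1-D shadow-map texture; all names are illustrative):

```python
import math

# Sketch of the light-map lookup: for a pixel inside the spotlight's
# square of influence, recover the angle, sample the 1-D shadow map at
# that angle, and compare distances.
def is_lit(px, py, center, radius, shadow_map):
    cx, cy = center
    dx, dy = px - cx, py - cy
    dist = math.hypot(dx, dy) / radius  # normalized distance to center
    if dist > 1.0:
        return False  # outside the spotlight's area of influence
    angle = math.atan2(dy, dx) % (2.0 * math.pi)
    col = int(angle / (2.0 * math.pi) * len(shadow_map)) % len(shadow_map)
    # lit if we're closer to the light than the first occluder on this ray
    return dist <= shadow_map[col]
```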
The full light map for this demo (including the sun) looks like this:
@fasterthanlime And so, by combining ambient lighting with our lightmap, we get the fully lit scene. Of course, ambient lighting (the amount of color passed on unadulterated from the unlit scene) and diffuse lighting (color visible because it’s illuminated by dynamic lights) can be adjusted. It doesn’t even have to add up to 1. In this demo, the total is higher, which makes the sunlit parts seem warmer, burnt almost.
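A minimal sketch of that combine step, with made-up coefficients that deliberately sum to more than 1:

```python
# Sketch of the final combine: ambient passes part of the unlit color
# through unchanged, and diffuse adds color where the light map says the
# pixel is lit. The coefficients are invented for the example; note they
# sum to more than 1, giving fully lit pixels that warm, almost "burnt"
# look mentioned above.
AMBIENT = 0.4
DIFFUSE = 0.8

def shade(unlit_color, light_intensity):
    """unlit_color: (r, g, b) in 0..1; light_intensity: light-map value, 0..1."""
    factor = AMBIENT + DIFFUSE * light_intensity
    return tuple(min(1.0, c * factor) for c in unlit_color)
```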
@bigsylvain So that’s how spotlights work. If you don’t understand it at first glance, don’t worry — we spent a few hours debugging the whole system and polar/cartesian coordinate conversions are hard to wrap your head around. Should we talk about the sun?
@fasterthanlime Almost! We’re not done with spotlights quite yet. First off, we haven’t talked about the radial falloff. Spots emit light that is attenuated the further it gets from the emission point. That’s a pretty simple computation: just multiply the light contribution by a factor that varies from 1 to 0 as you get further away from the light’s position.
@bigsylvain Ah yes, but if you do it that way then you get an ugly, sphere-like, too-regular/artificial halo. That’s why I spent some time adding slight variations that depend on time (basically, a combination of sines and cosines) so that spot lights don’t look too perfect:
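Here’s an illustrative Python version of that idea: a plain radial falloff plus a small sine/cosine wobble. The frequencies and amplitudes are invented for the example, not the actual values used in the game:

```python
import math

# Illustrative falloff: a base factor going from 1 at the center to 0 at
# the edge, perturbed by a small combination of sines and cosines so the
# halo isn't a perfect, too-regular sphere. All constants are made up.
def falloff(dist, radius, angle, t):
    base = max(0.0, 1.0 - dist / radius)
    # subtle angular + temporal wobble, scaled by base so the edge stays at 0
    wobble = (0.05 * math.sin(3.0 * angle + 2.0 * t)
              + 0.03 * math.cos(7.0 * angle - 1.3 * t))
    return max(0.0, min(1.0, base + wobble * base))
```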
@fasterthanlime Even that version looks a bit artificial, and there are visible patterns that give away the “procedurally generated” part of it, but diffuse lighting is exaggerated in that gif to make the effect stand out. Besides, we could use any formula at all, depending on the light sources: something more scintillating, more rapid variations, variations that depend on something other than time (screenshake for example, with an upset/cooldown mechanism for light sources). The sky is (quite literally) the limit!
@bigsylvain Can we talk about the sun now?
@fasterthanlime Ah yes, the sun. It sounded easy at first - same shenanigans, generate a shadow map, sample it, and we’re good to go right? But the sun is a fickle creature that won’t be tamed so easily.
@bigsylvain I suppose we have to start with the limits? Maybe some readers are wondering why we don’t simply handle the sun as a huge spotlight (almost like in the real world). The reason is simple: see the distance between your room light and your floor? Now see the distance between the sun and the earth?
Now imagine the distance from your room light to your floor (about 2.5m, if we’re being optimistic) is 1024px on-screen. The distance in pixels from the sun to your room would be about…
@fasterthanlime That’s a LOT of pixels.
@bigsylvain Possibly more pixels than were ever drawn in the history of pixels!
@fasterthanlime At any rate, no current graphics card can handle that…
@bigsylvain Maybe by the time the game is out?
@fasterthanlime So the “realistic” / “physics-based simulation” approach is out. But we can cheat, as usual. The sun rays that do reach earth are almost parallel, yes? So let’s just assume that instead of a point emitting light rays in every direction, we just have a line above the screen that emits rays pointing down.
@fasterthanlime Although this model is technically close to correct, it looks wrong. Something just feels off about a scene with perfectly parallel rays of sunlight. Adding a small spread to the rays makes the scene look less artificial and overall friendlier:
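As an illustration of the “emitter line with a small spread” idea, here’s a sketch that marches a single sun ray down the screen. The spread formula and the per-pixel march are assumptions for the example, not our shader:

```python
import math

# Sketch of the "emitter line above the screen" sun: one ray marched
# downward from (start_x, 0), deviating slightly from vertical so rays
# aren't perfectly parallel. `occluder(x, y) -> bool` as before; the
# spread formula is an invented, deterministic per-ray deviation.
def cast_sun_ray(occluder, start_x, height, spread=0.05, seed=0.0):
    """Return the y where the ray is first blocked, or `height` if it
    reaches the bottom of the screen."""
    # small deterministic deviation from straight down (PI/2)
    angle = math.pi / 2 + spread * math.sin(12.9898 * start_x + seed)
    dx, dy = math.cos(angle), math.sin(angle)
    x, y = float(start_x), 0.0
    while y < height:
        if occluder(int(x), int(y)):
            return y
        x += dx
        y += dy
    return float(height)
```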
@bigsylvain Just like the spotlights, the sun has a falloff, but it’s linear (on the Y axis). It’s not physically accurate, but it does bring a bit more variety to the scene, getting us further away from the plain colors of the unlit scene.
@fasterthanlime And the final touch that brings it all together is a very discreet grain filter applied to the whole scene - although it’s modulated by a light response curve. It helps with avoiding banding and makes the scene look even more natural.
@bigsylvain We’re using simplex noise to simulate grain: it’s a good compromise between faster (but uglier) algorithms like Perlin, and near-photorealistic grain which would take a huge toll on our GPU budget.
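To show the shape of that pass, here’s a small Python sketch. Note that the noise function below is a cheap hash-based stand-in for simplex, and the parabolic response curve is invented for illustration:

```python
import math

# Sketch of the grain pass: noise modulated by a light-response curve so
# grain shows up mostly in the midtones, not in pure blacks or blown-out
# highlights. The hash-based noise is a stand-in for simplex, and the
# parabolic response curve is made up for the example.
def noise(x, y, t):
    """Deterministic pseudo-noise in roughly -1..1."""
    n = math.sin(x * 12.9898 + y * 78.233 + t * 37.719) * 43758.5453
    return 2.0 * (n - math.floor(n)) - 1.0

def light_response(luma):
    """Grain strength peaks in the midtones: 0 at black and at white."""
    return 4.0 * luma * (1.0 - luma)

def apply_grain(luma, x, y, t, strength=0.03):
    grained = luma + strength * light_response(luma) * noise(x, y, t)
    return max(0.0, min(1.0, grained))
```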
@fasterthanlime And that concludes this update. We started the week with discussions around game design, and we’ve made progress on that front too, but all in all, I don’t regret spending a few days making the game look a lot more attractive.
@bigsylvain It’s always easier to find motivation to work on something that’s already pretty.
@fasterthanlime Last bit of news: we’ve disabled new purchases on our itch.io page. We used to do Early Access, but it wasn’t a good fit for the game: it didn’t have a solid basis we could just add content to. Instead, we went through 4 or 5 rewrites of the engine, changed languages, storage formats… and gameplay!
@bigsylvain However, previous buyers of Lestac will, of course, get a full copy of Jaakan when it’s out (it’s the same store page, we just renamed it).
@fasterthanlime As for the soundtrack, there’ll be a new one for the finished game, which might take some inspiration from the Lestac soundtrack, but will hopefully have a better production value. It’ll be sold on Bandcamp as well when the game is eventually released.
@bigsylvain Thanks for following us, and see you next update! Here’s a very old and very rough music draft as a parting gift: