Procedural Chinese Landscape Painting with Geometry Nodes in Blender 3D
Creating a procedural Chinese landscape painting in Blender 3D using geometry nodes and Voronoi noise.
As usual, you can find the nodes under the MIT license for free use on my GitHub: https://github.com/IRCSS/Procedural-Chinese-Landscape-Painting-Blender-3D
You can also find a colored example of a model the system generates on my Sketchfab, as well as a black and white version.
I have already covered some of the basic geometry nodes workflow in one of my previous posts. Here, I will cover some of the more advanced concepts used to generate the scene you see in the video. Feel free to reach out on Twitter if you have any questions about the things I skipped.
First of all, here is an example of the style I would like to imitate, adjusted a bit toward a more modern aesthetic. I tried this in Houdini ages ago, and while I was happy with those results, I wanted something a bit different in geometry nodes.
The main thing to take care of first is figuring out a way to generate the mountains. One typical way is to draw a height map and apply it to a dense plane. The problem with that is that I don't want to do it by hand.
My main idea was to generate a Voronoi noise, decide to add a mountain to some of the cells, generate a random height for each cell, and pull those cells out. Then deform the cells along their height to get the curvy look in the reference.
First, the Voronoi. Generating Voronoi noise is very easy, and best shown in the animation below:
What you are basically doing is measuring, per pixel, the distance to the closest red dot. The halfway point between two cell centers is where a pixel (or vertex, if you are doing displacement) switches from belonging to one cell to the other.
Blender has a node for generating Voronoi, both in shaders and in geometry nodes. However, I wouldn't recommend using it, for a simple reason: the node uses a GPU-friendly parallel algorithm with a limited search area, which means you get very similar-looking cells.
If you think about it, there is an easier way to do the above. If you spawn a bunch of points on the surface of a subdivided plane and then use the Geometry Proximity node, you get exactly what I described above. You can store the distance to the closest point and the position of that point as attributes, to do things like pulling out the mountain or generating a different random number per cell.
At the moment, the result we are getting is the same as the Voronoi node itself. But if we change the distribution from Poisson Disk to Random and reduce the density, you will see that we start getting irregular cells.
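To make that concrete, here is a minimal NumPy sketch of what the point-spawning plus proximity setup computes. This is an illustration of the math, not the node tree itself; the grid resolution, point count, and variable names are my own choices.

```python
import numpy as np

rng = np.random.default_rng(7)

# A subdivided plane: 128 x 128 vertices on the unit square (XY only).
u = np.linspace(0.0, 1.0, 128)
grid = np.stack(np.meshgrid(u, u), axis=-1).reshape(-1, 2)

# The randomly spawned points, i.e. the Voronoi cell centers.
centers = rng.random((40, 2))

# Per vertex: distance to every center, then pick the closest one.
dist = np.linalg.norm(grid[:, None, :] - centers[None, :, :], axis=-1)
closest = dist.argmin(axis=1)

# The two per-vertex "attributes" described above.
dist_to_center = dist[np.arange(len(grid)), closest]  # like the Distance output
cell_center = centers[closest]                        # like the Position output
```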
Now we have our cells, defined by the positions of the points we spawned. Next is to decide which of these cells will contain a mountain. The trick used here is the oldest in the book. If you have a cell center represented by a position, you can take the dot product between that position and some arbitrary vector to convert the center to a float. Then pass this float to a random number generator, which generates a number between 0 and 1. If the generated number is smaller than a variable you define, let's call it the probability, the cell is a mountain. If not, it is a flat area. Blender has a node for this, the Random Value node, which can generate a boolean. You can pass the float you generated from the cell center to the ID input to get a unique 0 or 1 per cell. Multiply this 0 or 1 output with the other part of the calculation and you will be “culling” some mountains.
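Expressed in the same NumPy style, the culling step looks roughly like this. The hash constants and the name `probability` are my own stand-ins; in the node tree the Random Value node does the hashing.

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in for the per-vertex cell-center attribute from the previous sketch.
cell_center = rng.random((16384, 2))

# A dot product with an arbitrary vector turns each cell center into one float.
arbitrary_vector = np.array([12.9898, 78.233])
cell_float = cell_center @ arbitrary_vector

# A shader-style hash mapping that float to a pseudo-random value in [0, 1);
# in the node tree the Random Value node does this part.
rand01 = np.abs(np.sin(cell_float) * 43758.5453) % 1.0

probability = 0.6                        # user-defined chance of a mountain
is_mountain = rand01 < probability       # one 0/1 per cell, multiplied in later
```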
In the image above, you also see me using the UVs of the subdivided plane to gradually bring down the mountains as they get closer to the plane boundary. This avoids a noticeable end to the scene.
There is also the part where you actually create the mountain-looking shape. For this we take the distance to the cell center: the closer the vertex is to the cell center, the more offset we add along the normal of the plane (or the up direction). A very handy tool here is the Float Curve node. If you pass this distance through it, you can control the shape of the peak interactively using the curve widget.
The last part is simply putting all these components together by multiplying them with one another. The three components were: the displacement creating the peak, the boolean culling some of the cells, and the float flattening the mountains close to the boundary of the plane.
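A rough sketch of how those three components multiply together. The smoothstep here only stands in for whatever shape you draw in the Float Curve node, and the cell radius, falloff width, and maximum height are arbitrary values I picked.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 16384

# Stand-ins for the attributes computed in the previous steps.
dist_to_center = rng.random(n) * 0.2   # distance to the owning cell center
is_mountain = rng.random(n) < 0.6      # boolean mask from the culling step
uv = rng.random((n, 2))                # UVs of the subdivided plane

def smoothstep(x):
    x = np.clip(x, 0.0, 1.0)
    return x * x * (3.0 - 2.0 * x)

# Peak shape: tall at the cell center, zero at the cell radius.
cell_radius = 0.2
peak = smoothstep(1.0 - dist_to_center / cell_radius)

# Fade to zero near the plane boundary so the scene has no hard edge.
edge = np.minimum(np.minimum(uv[:, 0], 1.0 - uv[:, 0]),
                  np.minimum(uv[:, 1], 1.0 - uv[:, 1]))
boundary_falloff = smoothstep(edge / 0.15)

max_height = 2.0
offset_along_normal = max_height * peak * is_mountain * boundary_falloff
```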
What comes after this is hours of playing around with noises and shaping functions to get the exact look you want. One of the first things I always try is adding some noise on the surface, to make it look more natural and get closer to the wavy look of the reference.
Adding Shore Lines and Riverbanks
Now we have our mountains. The next step is to somehow add a shore line that acts as a river bank. You could of course just generate some random Perlin-noise height map to get the terrain, but the patterns generated that way wouldn't feel natural. Rivers don't just end at a mountain; they flow between mountains. Even for lakes, there is typically a gradual increase in height between the boundary of the lake and the base of the mountain.
Doing stuff like this is already a lot trickier than simply creating mountains in geometry nodes.
What is it that we actually want? We want the ground around the mountain to be slightly higher than the part where the river flows. This is an ideal scenario for a signed distance field. This field tells us, per pixel, the distance to the base of the mountain. So once again the Geometry Proximity node comes to the rescue.
We duplicate the mesh we have generated so far and delete everything in it that is not a mountain.
Then, on the original mesh, we calculate for each vertex the distance to the closest mountain. If the distance is smaller than a certain amount, we raise the ground. Again using a Float Curve, we control exactly how the transition between the mountain, the river bank, and the river itself works.
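In the same spirit, a sketch of the river-bank raise: measure the distance from every vertex to the nearest mountain vertex, then remap that distance into a small height offset. The brute-force distance loop stands in for the Geometry Proximity node, the smoothstep stands in for the Float Curve, and the width and height values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(11)

verts = rng.random((4096, 2))           # all vertices of the plane (XY only)
mountain_verts = rng.random((300, 2))   # duplicated mesh with only mountains kept

# Distance from every vertex to the closest mountain vertex (the "SDF").
dist = np.linalg.norm(verts[:, None, :] - mountain_verts[None, :, :],
                      axis=-1).min(axis=1)

def smoothstep(x):
    x = np.clip(x, 0.0, 1.0)
    return x * x * (3.0 - 2.0 * x)

bank_width = 0.1     # how far the raised bank reaches out from the mountains
bank_height = 0.3    # how much the ground is raised right at the mountain base
bank_offset = bank_height * (1.0 - smoothstep(dist / bank_width))
```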
I realize I am skipping a bunch of material and procedural texturing stuff. In one screenshot the scene looks like a debug view, in the next the mountains are fully colored. But that is another topic for another day; you can also find tons of tutorials in the Substance Designer community on creating insane stuff using just noises and maps.
Of course, river banks are not this clean and perfect. Once again we apply noise to the calculation above, to get a more natural-looking river bank:
Now we have our mountains and we have a river bank. Time to spawn some houses.
Some Housing Tricks
The houses are also all procedurally generated. I won't have the time to cover in depth how that is done, especially because the workflows will probably get better in the future. This type of procedural modeling is one of the most underdeveloped areas of geometry nodes compared to Houdini.
All the house generation lives in one node group; you can tab into it and go over it if you would like to know more.
One very useful trick I would like to cover, however, is rotating or scaling instances that have already been realized and modified. It is very typical to spawn instances and then have to realize them in order to modify them individually, especially in procedural house modeling. It is also very useful to keep everything aligned with the global axes, so that procedural modeling stays manageable.
Imagine you want to create a Chinese-style roof, as I have done here:
If you were modeling this by hand, you would probably take a cube, add an edge loop, then take the top of that edge loop and pull it up. You might add some more edge loops to move away from a triangular roof toward a more curved one. How would you do this in geometry nodes? If you keep your houses axis-aligned, you can save the pivot of each house as a per-vertex attribute. This information can be used to reconstruct the “local” space of each house, and with it you can mask certain areas in local space to select only the vertices you need for raising the roof, as in the sketch below.
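As a toy example of that local-space masking, assuming per-vertex positions and a stored pivot attribute. The attribute layout, the thresholds, and the ridge selection are all my own simplifications; a real roof needs more care.

```python
import numpy as np

rng = np.random.default_rng(2)

# 20 realized houses, each 100 vertices inside a unit cube sitting on its pivot.
pivots = rng.random((20, 3)) * 10.0
pivots[:, 2] = 0.0                               # houses sit on the ground
house = rng.random((100, 3))                     # stand-in vertices of one house
verts = (pivots[:, None, :] + house[None, :, :]).reshape(-1, 3)
pivot = np.repeat(pivots, 100, axis=0)           # per-vertex pivot attribute

# Reconstruct the local space of each house from the stored pivot.
local = verts - pivot

# Mask the vertices near the top of the house and near its centre line in
# local X; these are the ones we would pull upward to form the roof ridge.
ridge = (local[:, 2] > 0.8) & (np.abs(local[:, 0] - 0.5) < 0.1)
verts[ridge, 2] += 0.4
```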
In order to adjust each house individually and with variation, you have to realize the instances. But if you then want to rotate the houses, you have the problem that they are all one huge mesh. The trick here is to use the house pivot you saved as an attribute: transform all houses first so that each pivot sits at the scene origin (vector zero), rotate, and then offset them back. Here is how that goes:
By subtracting the pivot, you move all the houses so they sit on the scene origin. Then you rotate them:
This is a simple 2D rotation using matrices in 3D space. The last step is to offset the houses back to their original locations:
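Written out as math, the whole move-rotate-move-back dance is just the following sketch: a 2D rotation around the Z axis applied in the pivot's local space. The random vertex data and the per-house yaw angles are placeholders.

```python
import numpy as np

rng = np.random.default_rng(4)

# Realized house vertices plus the per-vertex pivot attribute, as above.
verts = rng.random((2000, 3)) * 10.0
pivot = np.repeat(rng.random((20, 3)) * 10.0, 100, axis=0)

# One random yaw angle per house, repeated so there is one value per vertex.
angle = np.repeat(rng.uniform(0.0, 2.0 * np.pi, 20), 100)

# 1) move each house to the origin, 2) rotate around Z, 3) move it back.
local = verts - pivot
c, s = np.cos(angle), np.sin(angle)
rotated = np.empty_like(local)
rotated[:, 0] = c * local[:, 0] - s * local[:, 1]
rotated[:, 1] = s * local[:, 0] + c * local[:, 1]
rotated[:, 2] = local[:, 2]
verts = rotated + pivot
```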
The above workflow is way uglier than it needs to be. If, for example, Blender exposed transforms as a data type and added math library support for matrices with a bunch of utility functions, all of the above could be condensed into one or two nodes.
Waterfalls
Waterfalls are harder still, but there is a cool new node in town we can use. The core idea with waterfalls is this: you want the waterfall to start somewhere up in the mountains, then flow down to the river. Waterfalls join rivers, which sometimes join together to flow into a bigger river. How would you go about coding that?
The easy part of the puzzle is that once you have a spline going from the source to the river, you can simply mesh it to get the waterfall. The harder part is finding a meaningful path for the water to flow along. Consider a naive approach: take a random point at the top of the mountain and connect it to a random point in the river, then use the Raycast node to shrinkwrap this spline to the surface of the model.
We would have a major problem, however: our two random points can be anywhere, so our waterfall and river could potentially go up and down several mountains. What we need to know is, first, what the closest path is between the mountain top and the river, and second, which path conforms the most with how running water behaves.
There is a new node that does that for us, called Shortest Edge Paths, which is used together with Edge Paths to Curves.
What this node does is take a series of start vertices and find the shortest paths to a set of end vertices. It actually does everything we need. For example, several rivers might join together if their shortest paths align, and streams won't flow to the other side of the model but instead find the closest path to the main river.
Even cooler is that the node allows us to provide our own edge cost. The edge cost is what defines what the “shortest” path even means. In a typical situation, if you have an edge between vertices V1 and V2, the cost is simply the distance between V1 and V2. In our case, however, there is a secondary condition: we don't want the river to flow back up again. So our cost is a combination of the distance and whether V2 is lower than V1; an edge is more costly to traverse if you have to move up along it. I also add some noise to the cost to make the water behave a bit more randomly, to account for all the cool nature stuff we can't calculate.
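For intuition, here is a small Dijkstra-style sketch with that kind of custom edge cost: plain Euclidean length, plus a penalty for edges that climb, plus a little noise. The graph, the weights, and the function names are all made up; in the node tree the Shortest Edge Paths node hides this behind its Edge Cost input.

```python
import heapq
import numpy as np

rng = np.random.default_rng(9)

# A tiny toy mesh: vertex positions and an edge list (pairs of vertex indices).
positions = rng.random((60, 3))
edges = [(i, j) for i in range(60) for j in range(i + 1, 60)
         if np.linalg.norm(positions[i] - positions[j]) < 0.25]

# Directed edge cost: Euclidean length, a penalty for climbing, and some noise.
cost_of, adjacency = {}, {i: [] for i in range(len(positions))}
for a, b in edges:
    for u, v in ((a, b), (b, a)):
        length = np.linalg.norm(positions[u] - positions[v])
        uphill = max(positions[v, 2] - positions[u, 2], 0.0)
        cost_of[(u, v)] = length + 5.0 * uphill + 0.05 * rng.random()
        adjacency[u].append(v)

def cheapest_path_cost(start, ends):
    """Dijkstra from a start vertex to the cheapest of several end vertices."""
    ends, best = set(ends), {start: 0.0}
    queue = [(0.0, start)]
    while queue:
        cost, v = heapq.heappop(queue)
        if v in ends:
            return cost
        if cost > best.get(v, float("inf")):
            continue
        for n in adjacency[v]:
            c = cost + cost_of[(v, n)]
            if c < best.get(n, float("inf")):
                best[n] = c
                heapq.heappush(queue, (c, n))
    return float("inf")

# e.g. from a random "mountain top" vertex to a few "river" vertices.
print(cheapest_path_cost(0, [10, 20, 30]))
```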
As you can see, the algorithm connects a series of random points at the top to the points at the bottom.
Bridges
This one was actually quite hard to figure out. I had quite a few moments of “is this even possible?”. Bridges connect the banks of the river. What makes them so hard is that the LTS version of Blender I am on still doesn't have looping functionality. With loops you could simply iterate over the border vertices between river and land, calculate for each one the closest vertex that is not connected to the same land mass, and finally run a series of criteria to determine whether a bridge can pop up there. You can't do any of that in Blender yet. At least not in that way.
If you have spent a bit of time writing programs for graphics cards, you will know that there is always a way to take a single-threaded algorithm and convert it into one that is friendly to the single instruction, multiple data architecture that GPUs (and geometry nodes) use.
The first step is to determine the border between land and water. This is actually quite easy: we know that when we go from land to water, we go from a certain height down to the height of the water. Using the Store Named Attribute node, we can first go over all the edges connected to a vertex and find the one that descends the most, in other words the direction of the river relative to that vertex. Knowing this, we can check whether the two vertices making up that edge conform to our criteria for where border vertices sit along the up axis. I displace the vertices that pass the test here for visualization purposes.
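A simplified sketch of that border test, under my own simplification: per vertex, find the most-descending connected edge and flag the vertex as a border vertex if it sits above the water level while that edge drops to or below it. The toy adjacency data and the water level are placeholders.

```python
import numpy as np

rng = np.random.default_rng(13)

# Toy data: per-vertex heights and an adjacency list of connected vertices.
height = rng.random(200) * 2.0
neighbors = {i: list(rng.choice(200, size=4, replace=False)) for i in range(200)}

water_level = 0.5
is_border = np.zeros(200, dtype=bool)
for v, ns in neighbors.items():
    # The connected edge that descends the most: the local flow direction.
    steepest = min(ns, key=lambda n: height[n] - height[v])
    # Border criterion: this vertex is on land, the steepest edge reaches water.
    if height[v] > water_level and height[steepest] <= water_level:
        is_border[v] = True
```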
My first idea was to spawn a bunch of lines and randomly connect these border vertices with each other.
Then, for each line, ray cast from the start to the end point and check whether there is a mountain in the way. After all, no bridge should pop up that goes straight through a mountain.
Then run some other tests, such as killing bridges that are too short or too long, and hopefully be left with a few good bridges.
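A rough sketch of that filtering pass. The occlusion test here just samples a stand-in height field along the segment instead of an actual raycast, and the length limits, deck height, and sample counts are arbitrary values of mine.

```python
import numpy as np

rng = np.random.default_rng(17)

# Toy inputs: positions of border vertices and a height field on a grid.
border = rng.random((80, 3))
heights = rng.random((64, 64)) * 0.4          # stand-in terrain height lookup

def terrain_height(xy):
    """Sample the stand-in height field at a 2D location in the unit square."""
    i = np.clip((xy * 63).astype(int), 0, 63)
    return heights[i[1], i[0]]

def bridge_is_valid(a, b, min_len=0.05, max_len=0.4, deck_height=0.5):
    length = np.linalg.norm(b[:2] - a[:2])
    if not (min_len < length < max_len):
        return False
    # "Raycast" substitute: sample the terrain along the span and reject the
    # bridge if a mountain pokes above the deck anywhere along the way.
    for t in np.linspace(0.0, 1.0, 16):
        p = a[:2] * (1.0 - t) + b[:2] * t
        if terrain_height(p) > deck_height:
            return False
    return True

# Randomly pair up border vertices and keep only the pairs that pass the tests.
pairs = rng.choice(len(border), size=(40, 2), replace=True)
bridges = [(a, b) for a, b in pairs
           if a != b and bridge_is_valid(border[a], border[b])]
```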
This is actually not half bad, but there is a problem: some bridges are aligned tangent to the shore line where they start. This never happens in the real world. In reality, bridges are usually placed so that they are perpendicular to the bank of the river. But the way I am doing this, I have no way of knowing whether a bridge is aligned with the landscape or not.
Guess which node comes to the rescue? Geometry Proximity!
We actually do know the topology of the landscape the bridge should sit on, because the river banks always extend from the base of the mountains. So if we do a geometry proximity to the mountains again (or reuse the calculation we already did) and take the vector which goes from the river bank to the closest point on the mountain base, we have a vector that is perpendicular to the river bank.
We can take this vector and check whether there is a river bank along that direction on the other side, again using the Geometry Proximity node. We place our end point at random distances away from our river bank, along the direction that points away from the land, then check where the closest spot on the opposite river bank is. Combined with the other tests we were already doing, such as culling bridges that go through mountains or are too long, this by itself gives a very good result. There is one last thing missing: just because the beginning of the bridge is aligned with the shore doesn't mean the end is too.
So we do the same process from the other side: we take the end point of the bridge, do a geometry proximity to the base of the mountain to get the vector that represents the outward direction of the landscape, and if this vector is misaligned with the bridge as a whole, we delete the bridge.
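The alignment test at both ends boils down to a couple of dot products. In this sketch the outward direction is the vector from the nearest mountain-base point toward the bridge end (a stand-in for the Geometry Proximity result), and the 0.5 threshold is an arbitrary choice.

```python
import numpy as np

def outward_direction(p, mountain_base_points):
    """Direction pointing away from the land: from the nearest mountain-base
    point toward p."""
    d = p - mountain_base_points
    nearest = mountain_base_points[np.argmin(np.linalg.norm(d, axis=1))]
    v = p - nearest
    return v / np.linalg.norm(v)

def bridge_is_aligned(start, end, mountain_base_points, threshold=0.5):
    """Keep the bridge only if it points roughly outward at both of its ends."""
    direction = (end - start) / np.linalg.norm(end - start)
    ok_start = np.dot(direction, outward_direction(start, mountain_base_points)) > threshold
    ok_end = np.dot(-direction, outward_direction(end, mountain_base_points)) > threshold
    return ok_start and ok_end

# Toy usage with random data.
rng = np.random.default_rng(21)
base = rng.random((200, 2))
print(bridge_is_aligned(rng.random(2), rng.random(2), base))
```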
Now we are left with only valid bridges. Or at least almost: there are still bridges that cross each other, or bridges that pop up too close to each other. You could fix these too, but I decided to stop here. When Blender adds looping functionality in future releases, you can simply loop over these bridges and delete the ones that don't make much sense.
As usual, thanks for reading; this got longer than intended. If you have any questions, you can find me on any of my socials listed on my website: https://ircss.github.io/