Modify Realized Instances and Mesh Islands Individually In Blender Geometry Nodes

Shahriar Shahrabi
12 min read · Dec 5, 2023

In this post I will cover how to transform (move, rotate or scale) meshes you have made in Geometry Nodes after you have realized their instances. At the end of the article, I will also cover moving, rotating and scaling mesh segments/islands individually.

Here is the demo scene that demonstrates the points below. You can find the final mesh on my Sketchfab:

Let’s say you want to spawn a bunch of grass or stones in geometry nodes and distribute them in your scene. A typical workflow would be to spawn a bunch of instances on your mesh. What if you want each instance to have variation? Or maybe you want to animate each individually?

You can always transform the instances at any time, as long as they are instances. But you can’t really modify the vertices of these meshes individually. Since these are instances, whatever modification you apply to one will carry over to all the others. So you can’t add unique deformation to each blade of grass.

To do that, you would have to realize the instances. Once that is done, Blender packs all those instances into one huge mesh. After realizing the instances, there is no difference between Grass A and Grass B. It is all just one mesh, with one coordinate system.

As long as you only need a simple modification, this is not a problem. For example, you can apply a noise globally on all the blades of grass, and this ensures that each blade looks different.

However, what if you needed something more specialized?

For example, in this scene, I would like to spawn a bunch of sea grass. My first step is to spawn a bunch of lines, convert them to cylinders and scale them in a way so that they look like blades of grass.

Now I would like to scale the top of the grass along the x axis, so that it gets thinner and thinner towards the top. I could do this in the original mesh I am copying around, but then I would have to spawn a mesh, and I can’t work with curves. For various reasons, I would like to stick to curves, so I need a way of scaling only the top part after I have already spawned the curves as instances and realized them.

My aim is to apply a transformation like this:

position.x *= smoothstep(1.0, 0.0, position.z);

The smoothstep, or Map Range node in Blender, ensures that my blade of grass has a thickness of 0 at the top and maximum thickness at the base.
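To make the taper concrete, here is a minimal Python sketch. The `smoothstep` helper mirrors the GLSL function, and `height` is an assumed blade height of 1.0:

```python
def smoothstep(edge0, edge1, x):
    # Clamped Hermite interpolation, same behavior as GLSL's smoothstep().
    t = max(0.0, min(1.0, (x - edge0) / (edge1 - edge0)))
    return t * t * (3.0 - 2.0 * t)

def taper(position, height=1.0):
    # Thin the blade towards the top: full thickness at z = 0,
    # zero thickness at z = height. `height` is an assumed value.
    x, y, z = position
    return (x * smoothstep(height, 0.0, z), y, z)
```

Running `taper` on a vertex at the base leaves it untouched, while a vertex at the top collapses onto the x = 0 plane.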

Let’s just apply the above and see what happens.

Ok, that is strange! The reason this happens is that when we multiply our position by our scale, we are always scaling around the pivot. For an instance, this pivot sits at the origin of the mesh. But after realizing the instances, the pivot is somewhere else. That is why we get this weird shape.

So what we need is to first translate each mesh so that it sits at zero, the origin, then multiply by the scale, then translate it back to its original position. I have already covered this in a different article, so I will go over it quickly.
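The move–scale–move-back sequence can be sketched in a few lines of Python (`pivot` stands for the stored instance pivot, `scale` for the per-axis scale factor):

```python
def scale_around_pivot(position, pivot, scale):
    # Translate so the pivot sits at the origin, apply the scale,
    # then translate back to the original location.
    return tuple((p - q) * s + q for p, q, s in zip(position, pivot, scale))
```

For example, scaling the point (3, 0, 0) by 2 along x around the pivot (2, 0, 0) yields (4, 0, 0), whereas scaling around the world origin would have given (6, 0, 0).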

We need to save where the pivot was for each instance, so that we can access this information per vertex after realizing them. That way we know where the base of each blade of grass was, and can move each individually to the origin.

You can store the attribute either before the Instance on Points node, on the points (vertices), or afterwards on the instances. The Position node in the instance context gives you the instance pivot.

In the image above, on the left side we store the pivot of each instance per vertex, then use this information to move each blade of grass so that its base sits at the origin. Now if we multiply the x axis by our Map Range, we get the correct scaling. Once we are done, we add the pivot back to the vertex positions to move the grass back to where it was.

If the above is too complicated, consider reading my Matrices for Tech Artists Cheat Sheet post.

Now for something new, what if you have rotation on your instances? If your instances are rotated, this transformation alone is not enough.

With rotation, you get skewing instead of a nice looking blade of grass. Why is that? As we scale the top of the blade of grass along the x axis, we are scaling along the global x axis. However, if the instance was rotated, its local x axis is no longer aligned with the global one. We need to first align the two axes.

In other words, we need to unrotate the mesh, apply our transformation, then rotate the mesh back to where it was. This is similar to how we moved the mesh to origin, but for rotation.
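The unrotate–modify–rotate-back idea can be sketched with plain quaternion math (w, x, y, z order, unit quaternions assumed; the helper names are mine, not Blender's):

```python
def quat_mul(a, b):
    # Hamilton product of two quaternions in (w, x, y, z) order.
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def quat_conjugate(q):
    # For a unit quaternion, the conjugate is the inverse rotation.
    w, x, y, z = q
    return (w, -x, -y, -z)

def rotate(q, v):
    # Rotate vector v by unit quaternion q: q * v * q^-1.
    p = (0.0, v[0], v[1], v[2])
    w, x, y, z = quat_mul(quat_mul(q, p), quat_conjugate(q))
    return (x, y, z)

def transform_in_local_space(position, instance_rot, modify):
    # Unrotate into the instance's local frame, apply the
    # modification there, then rotate back to world orientation.
    local = rotate(quat_conjugate(instance_rot), position)
    return rotate(instance_rot, modify(local))
```

With `instance_rot` being the saved instance rotation, any axis-aligned modification (like the x-axis taper above) now happens in the blade's own frame.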

First we save the rotation of the instance for use further down.

The Instance Rotation node gives you Euler angles. You can save those as a vector and work with them later, though I decided to work with quaternions instead.

A small rant on rotations in Blender.

In the version of Blender I am using right now, rotation can be a bit confusing in GN. The main problem is that GN and the material editor don’t have a vector 4 type, only vector 3. This is very inconvenient for various reasons. I hope it changes in the future. If you try to store the quaternion as a Named Attribute, the constructor takes a vector 3, but that is just because the node editor doesn’t display vector 4. Quaternions are a float 4 data type.

Rotation in Geometry Nodes can mean two different things: Euler angles or quaternions. These are simply two different ways of representing a rotation. That on its own is not a problem; I actually hope they will expose a third representation, rotation matrices. The problem is that the naming and visualisation can be very confusing. You have nodes like Rotation to Quaternion and Quaternion to Rotation, which make it seem like Rotation is its own data type, different from quaternions. In reality those two nodes should be called MakeQuaternion and SplitQuaternion.

The point of the rant is: things can get confusing, but at the moment, whenever you see a purple input for rotation, it is a vector 3 and it is Euler. If you see a pink input, it is a vector 4 and you need a quaternion. There are helper nodes to convert between them, or to construct them in various ways.
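As a quick reference, one of those construction paths is axis plus angle. Here is a sketch of the math behind that conversion (the function name is mine; Blender exposes this as a node):

```python
import math

def axis_angle_to_quaternion(axis, angle):
    # Unit quaternion (w, x, y, z) for a rotation of `angle` radians
    # around `axis`. The axis is normalized first, so any length works.
    length = math.sqrt(sum(a * a for a in axis))
    s = math.sin(angle / 2.0) / length
    return (math.cos(angle / 2.0), axis[0] * s, axis[1] * s, axis[2] * s)
```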

After we have translated our mesh to the origin, we can unrotate it. That means applying the inverse of whatever rotation our mesh had, so that it becomes axis aligned again. We then reapply the rotation once we are done with our transformation.

And this works! We move our mesh to the origin, axis aligned, do whatever modification we want, then move it back to its original position. If you want to, you can of course also save the instance scale information and do the same with scale.

Last but not least, what if you wanted a unique ID for each instance of grass, to modify each in a unique way? This is also simply done. You already have the rotation and location of the instance; you can run them through a noise function to get a unique ID.

Alternatively, you can store the instance ID the same way you saved the pivot location and rotation. Then, when you are procedurally modifying your blades of grass, you can use that as a seed to ensure that each blade is modified in a different way.

That would look like this:

First saving the Instance ID as an attribute.

Then, whenever we need to add variation to our transformation, we use the instance ID as a seed to create random procedural effects. In this case I am displacing the grass around using a sine wave, adding the instance ID as a phase offset so that each blade of grass is deformed differently.
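The sine displacement boils down to something like this sketch. `amplitude` and `frequency` are illustrative values, not taken from the scene:

```python
import math

def sway_offset(z, time, instance_id, amplitude=0.1, frequency=2.0):
    # X displacement of a vertex at height z: a sine wave over time, with
    # the instance ID as a phase offset so every blade sways out of sync.
    # Multiplying by z keeps the base fixed and moves the tip the most.
    return amplitude * math.sin(time * frequency + instance_id) * z
```

Two blades with different instance IDs sample the wave at different phases, so they never move in lockstep.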

Transforming Mesh Segments Individually

So what do you do when you never had the sub-segments of the mesh as instances? Let’s say you get a mesh that looks like this:

You have one mesh with all these different sub-parts. You didn’t spawn them in Geometry Nodes, so you don’t know where their origins are. How would you go about rotating these in “local space”?

First, consider how you would solve this problem without Geometry Nodes. You would probably go to Edit Mode and use the “Separate by Loose Parts” functionality. This splits the mesh so that each interconnected part becomes its own object. Then you select all the objects and set their origins to the center of their volumes. You can now rotate the fish around the centers of their bodies.

In Geometry Nodes, you can do the same thing. There are different ways of doing it, but I will cover a simple one.

First we need a way to iterate over each sub-segment of the mesh and isolate the different parts. For this, we can use the newly added for loop (the Repeat Zone) and the Mesh Island node.

The Mesh Island node gives you two pieces of information: first, how many mesh islands your geometry has; second, which mesh island the vertex/edge/face you are currently processing belongs to. The latter is an index pointing to the mesh island.

Let’s set up a simple system where we iterate through all islands and delete all vertices that don’t belong to the current island.

The above does that, though there are some complications. What we want is to go over each island and delete the vertices that don’t belong to that island’s index. This basically means we can now isolate specific islands as a selection. You might notice, though, that I am using two extra nodes: one to save the Mesh Island node’s Island Index as a named attribute, the other to run the Island Count through an Attribute Statistic.

Theoretically, whether we access the island index per vertex inside a Repeat Zone or not shouldn’t matter, so we shouldn’t need to capture it as a named attribute. But if you don’t store it as an attribute, it doesn’t work. The reason is that in this setup, as you delete mesh islands inside the Repeat Zone, the indices shift around. By capturing the attribute before the Repeat Zone, you cache this information and don’t re-evaluate it during the loop. As an added bonus, a named attribute makes things easier to debug in the spreadsheet.

As for why we need to run the Island Count through an Attribute Statistic, I am not quite sure. Similar to the island index, if we delete things inside the Repeat Zone we can get problems, so we need to cache the total number of islands before we start the loop. The core problem, though, is that the Repeat Zone expects a single clear number for how many iterations it needs to do. This number needs to be the same for all the vertices, edges or faces you are acting on. For some reason the Island Count is a field, which means there is a number per vertex/edge/face. Either I am misunderstanding what that number does, or this is a bug. Either way, running it through an Attribute Statistic makes a single number out of this field and also caches it. You can simply take the biggest number in the field; it shouldn’t matter, because all the entries are probably identical.

Now we need to figure out where the center of each island is. You need to decide how to calculate that, because there is no “true” center for a geometry. One way is to take the center of the volume, which would be the average/mean of all the vertex positions. To get that, you can use the Attribute Statistic node again.
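The mean-of-positions pivot per island is easy to sketch in Python. Positions and island indices are flat per-vertex lists, mirroring what you would see in the spreadsheet:

```python
def island_centers(positions, island_index):
    # positions: list of (x, y, z) vertex positions.
    # island_index: per-vertex island index, as the Mesh Island node reports.
    # Returns a dict mapping each island index to the mean of its
    # vertex positions -- the same value Attribute Statistic (Mean)
    # over Position gives for that island's selection.
    sums, counts = {}, {}
    for pos, idx in zip(positions, island_index):
        s = sums.setdefault(idx, [0.0, 0.0, 0.0])
        for i in range(3):
            s[i] += pos[i]
        counts[idx] = counts.get(idx, 0) + 1
    return {idx: tuple(c / counts[idx] for c in s) for idx, s in sums.items()}
```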

The Attribute Statistic of the Position node gives you the center of that mesh island. This is the pivot you want. Now just like before, you can apply that pivot as an offset to move everything to the origin, do some transformation like rotation, then move it back to where it was.

Keep in mind to apply your selection mask to whatever node you are using. This includes the Attribute Statistic and the Set Position nodes.

So far we have only been acting on a single mesh island. But what if you want to act on all of them? That is rather easy. With this setup, instead of comparing against zero, we compare against the index of whatever iteration the for loop is currently in. We have to count this index ourselves. Here is how you would do it.

Add an integer which you keep counting up. Then use it as the input to the Equal node, and that is all you need to do. Now each step of the Repeat Zone acts on a specific mesh island. You can use that index as a unique ID to apply different random modifications to each island.
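One iteration of that loop can be sketched like this: the selection isolates one island, and the rotation happens around that island's own mean center. The Z axis and the rotation itself are illustrative choices:

```python
import math

def rotate_island(positions, selection, angle):
    # positions: list of (x, y, z); selection: per-vertex booleans marking
    # one mesh island (the Equal-node comparison against the loop index).
    # Rotates the selected vertices about the Z axis around their own mean
    # center: move to origin, rotate, move back.
    sel = [p for p, chosen in zip(positions, selection) if chosen]
    cx = sum(p[0] for p in sel) / len(sel)
    cy = sum(p[1] for p in sel) / len(sel)
    c, s = math.cos(angle), math.sin(angle)
    out = []
    for p, chosen in zip(positions, selection):
        if not chosen:
            out.append(p)  # untouched: vertex belongs to another island
            continue
        x, y = p[0] - cx, p[1] - cy
        out.append((x * c - y * s + cx, x * s + y * c + cy, p[2]))
    return out
```

Feeding the loop index into `angle` through any hash or noise gives every island its own random rotation.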

Taken further, you can do many things with this idea. For example, all the fish and grass animation you see in the video is done in Geometry Nodes using this method.

Lastly, what if you want to use these pivots in a game engine? There is no reason why you can’t do these animations in a vertex shader. This is called pivot baking, and it is a very popular method, as it allows you to render many small animations in a single draw call. Your main problem is getting the attribute information from the FBX into the game engine. One way is to write a custom export Python script which exports all the mesh attributes as a CSV; this is similar to what Houdini does. Then, on the engine side, you would need to read that info and populate your vertex buffer with the pivot information. If you would like to use standard file formats like FBX and nothing else, you can also bake the pivots into UVs.

For pivot baking in UVs, store the pivots as two 2D vectors on the face corners, then apply the modifier. You will see that you have two new UV maps holding your pivots. This can be exported as FBX and imported into Unreal, Unity or Godot. There is a small gotcha with this: engines sometimes clamp or repeat UVs so that they remain between 0 and 1. If your mesh pivots were outside that range, you might have to normalize them first, then denormalize them in the vertex shader in the engine. You can read more about this here.
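The normalize/denormalize round trip might look like this sketch. The bounds you normalize against are an assumption here; you would export them alongside the mesh (as material parameters, for example):

```python
def pack_pivot_to_uvs(pivot, bounds_min, bounds_max):
    # Normalize a 3D pivot into [0, 1] so it survives engines that clamp
    # or wrap UVs, then spread it over two 2D UV channels:
    # x and y in the first UV map, z (plus a spare slot) in the second.
    n = tuple((p - lo) / (hi - lo)
              for p, lo, hi in zip(pivot, bounds_min, bounds_max))
    return (n[0], n[1]), (n[2], 0.0)

def unpack_pivot_from_uvs(uv1, uv2, bounds_min, bounds_max):
    # The inverse mapping, done in the vertex shader on the engine side.
    n = (uv1[0], uv1[1], uv2[0])
    return tuple(v * (hi - lo) + lo
                 for v, lo, hi in zip(n, bounds_min, bounds_max))
```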

This was a short post, meant as documentation of a workflow. Thanks for reading, as usual. You can find me on various socials: