Fixing normals after Mesh to Curve - Curve to Mesh

So, I’m still working on building my procedural city and have hit a roadblock in creating block generators.

I started out with a building generator I made following this tutorial: https://www.youtube.com/watch?v=59PeIGmZQdY

It takes user inputs for depth, height and width, creates a single building with those dimensions, and instances a wall mesh from a collection onto the vertices. It’s all based on generating a mesh cube, dividing the vertices, and instancing on the cube’s vertices.


Here’s a screenshot of the full (messy) node tree.

The tree uses the generated cube’s normals to determine the instance index and rotation, find corners and edges, etc., and place all the carefully organised elements in the correct places.

So, to set about building a block, I created a new tree which instances cubes along mesh lines that are scaled based on the plane the node group lives on, with the lengths divided by a constant 3 m to ensure the cubes always fit the dimensions of the wall meshes to be instanced.

I calculate the X and Y vertex counts from the side lengths divided by 3.



These vertices are derived from the original instanced cube.
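
(Roughly the idea in plain Python rather than the actual nodes; the 21 m × 12 m plane is just an example:)

```python
# Wall module width in metres; the same 3 m constant used in the node tree.
MODULE = 3.0

def vertex_count(side_length: float) -> int:
    # One vertex every 3 m along the side, plus one to close the far end,
    # so a side that is an exact multiple of 3 m fits whole wall meshes.
    return int(side_length // MODULE) + 1

# Example plane dimensions, for illustration only.
plane_x, plane_y = 21.0, 12.0
print(vertex_count(plane_x), vertex_count(plane_y))  # -> 8 5
```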

However, I wanted all my cubes to have random, variable heights. As the cubes are instanced, the only way to do that was with a Scale Instances node placed after instancing the cubes.

This was all fine: I created a random value that scales in multiples of 3 m, to get an evenly spaced scale.
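
(A plain-Python sketch of that snapping idea, not the actual nodes; the 4–10 floor range is just made up:)

```python
import random

MODULE = 3.0  # floor height in metres, matching the 3 m wall module

def random_height(min_floors: int = 4, max_floors: int = 10) -> float:
    # A whole number of floors times 3 m, so the scaled cube height
    # is always an exact multiple of the wall module.
    return random.randint(min_floors, max_floors) * MODULE

print([random_height() for _ in range(4)])  # e.g. [12.0, 27.0, 18.0, 21.0]
```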

However, because the height is randomised after the cubes are instanced, I had to leave the Z vertex count at 2 (top and bottom only), because the vertices would not be spread evenly when the vertical edges of each cube are different lengths.

Plugging the realised output into my building generator worked as expected, but it only generated the ground floor and roof, as there were no vertices down the vertical edges. Still, it instanced everything in the right place: corners, faces, rotated instances for the sides.

So, in order to subdivide my vertical edges evenly, I realised the instances, selected only the vertical edges, converted the mesh to curves, and resampled them at a 3 m length.

This gave me exactly the sort of vertex structure I needed to instance my walls.

(Ico spheres instanced on the points.)
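
(Conceptually, the resample just places a point every 3 m up each vertical edge. A rough plain-Python equivalent, assuming heights that are already exact multiples of 3 m:)

```python
MODULE = 3.0

def resample_edge(height: float) -> list[float]:
    # Z positions of the resampled points along one vertical edge,
    # one point per floor plus the top.
    steps = int(round(height / MODULE))
    return [i * MODULE for i in range(steps + 1)]

print(resample_edge(12.0))  # [0.0, 3.0, 6.0, 9.0, 12.0]
```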

Then I converted the resampled curves back to a mesh, joined them with the original realised mesh, and merged by distance to make sure they became a single mesh island.

However, when I plugged it into the building generator, things didn’t quite work as expected.

The ground floor and roof still work, as their normals come from the originally realised cubes. However, all the middle floors, which come from the resampled curves, are facing the same direction.

What I think has happened is that converting the mesh to curves and then back makes all the normals face the same direction, so the calculations my node group uses to determine instance index, rotation and masking aren’t working, because the normals are not facing the correct way relative to the cube object.

This would make sense, as once the curves are converted back to a mesh they have no faces.

I’ve tried Fill Curve, but as my curves are not flat on the Z axis it doesn’t work, and using profile curves creates an unwanted edge jutting out of the edge of my cube/building.

Does anyone know how I can ‘fix’ the normals of the edges once they’ve been converted to curves and back?

My blend file is here: https://www.dropbox.com/scl/fi/28hluxgvccggsgmu0z4z6/Procedural-Block.blend?rlkey=dqvhv0c4iphmuxvf0zu1p1r4e&st=2z4trnzn&dl=0

Have you tried using the Capture Attribute node to transfer over the normal data, maybe even from the original faces? The docs have a very similar example.

(Also the rest of the project is looking very good!)

Hi, thanks! I’ve been working on this for a while now and I feel I am so close to getting it to do what I want it to do!

I tried Capture Attribute and can get the face info from the original cubes. However, I’m not sure exactly how to feed it back in. I’ve tried an approach using Sample Nearest Surface to get the normals, which recommends putting the output into an Align Euler to Vector node and feeding that into the rotation of the Instance on Points node.

However, the Instance on Points rotation is where the instance rotations are applied in my building generator. These are all calculated from the cube’s (correct) normals, so plugging these in messes things up further, unfortunately.

I also tried plugging the output of the Capture Attribute into a separate Instances to Points node, and then tried to convert those instances to points to drive the building generator, but alas, that didn’t work either.
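
For reference, my understanding of what Sample Nearest Surface does with the normals is roughly this: each point from the resampled curves borrows the normal of the nearest original face. A plain-Python sketch, not Blender code, with made-up face centres and normals:

```python
from math import dist

# Each original cube face as (centre, outward normal); values are illustrative.
original_faces = [
    ((1.5, 0.0, 6.0), ( 0.0, -1.0, 0.0)),  # front
    ((1.5, 3.0, 6.0), ( 0.0,  1.0, 0.0)),  # back
    ((0.0, 1.5, 6.0), (-1.0,  0.0, 0.0)),  # left
    ((3.0, 1.5, 6.0), ( 1.0,  0.0, 0.0)),  # right
]

def nearest_face_normal(point):
    # Take the normal of whichever original face centre is closest to the point.
    centre, normal = min(original_faces, key=lambda face: dist(face[0], point))
    return normal

print(nearest_face_normal((0.0, 0.5, 9.0)))  # -> (-1.0, 0.0, 0.0), the left face
```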

Hello!
If I’ve understood correctly, trying to rotate the instances to face the “captured” normals messes up the rotation of the correct roof/ground-floor assets. Would it be possible to completely separate the two geometries, each with its own correct rotation and only the points it needs, and then just use a simple Join Geometry node?
Also, points, unlike vertices, don’t have normal data, so converting anything to points deletes the normal information.
(I’m away from my main machine at the moment so I can’t post screenshots, will try to add them later)


So I managed to find a solution, and now it works! Just all the fine-tuning to do!

In the end, instead of instancing cubes, randomising the heights and then dividing the lengths, I instanced vertical curve lines and randomised their heights. Then, after instancing, I could use a fixed-length resample and instance my cubes on the points of the instanced curve lines, essentially arraying the original cube.

It’s a bit like your suggestion. I realise the arrayed cubes, merge by distance, and then delete all the interior geometry, leaving me with a perfectly divided cube, albeit with no top or bottom.

So I generated the cubes again in the way I had done previously and deleted their sides, so I had one geometry with only the sides and another with only the top and bottom, merged those together, and it all worked perfectly!
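
In case it helps anyone else, here is a rough bpy sketch of the side-wall half of the new setup. Node and socket names are from memory on a recent Blender, the 12 m height and 3 m cube are example values, and the footprint instancing, interior-face deletion and top/bottom join are left out:

```python
import bpy

tree = bpy.data.node_groups.new("Block Sides Sketch", 'GeometryNodeTree')
nodes, links = tree.nodes, tree.links

# One vertical curve line per footprint point (footprint instancing omitted);
# its height would be randomised per instance in multiples of 3 m.
line = nodes.new('GeometryNodeCurvePrimitiveLine')
line.inputs['End'].default_value = (0.0, 0.0, 12.0)  # example height

# Resample the line every 3 m so there is one point per floor.
resample = nodes.new('GeometryNodeResampleCurve')
resample.mode = 'LENGTH'
resample.inputs['Length'].default_value = 3.0
links.new(line.outputs['Curve'], resample.inputs['Curve'])

# Instance the original cube on those points, effectively arraying it vertically.
cube = nodes.new('GeometryNodeMeshCube')
cube.inputs['Size'].default_value = (3.0, 3.0, 3.0)
instance = nodes.new('GeometryNodeInstanceOnPoints')
links.new(resample.outputs['Curve'], instance.inputs['Points'])
links.new(cube.outputs['Mesh'], instance.inputs['Instance'])

# Realise and weld the stacked cubes into a single mesh island; after this,
# delete the interior faces and join with the top/bottom geometry generated
# the old way before feeding the result into the building generator.
realize = nodes.new('GeometryNodeRealizeInstances')
merge = nodes.new('GeometryNodeMergeByDistance')
links.new(instance.outputs['Instances'], realize.inputs['Geometry'])
links.new(realize.outputs['Geometry'], merge.inputs['Geometry'])
```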

