Continuation from the Geometry Nodes topic by xrg
Point Distribute seems to work fine in my testing. To avoid intersections, you have to set the distribution type to Poisson disk and set a certain minimum distance.
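Blender's Poisson-disk mode handles the spacing internally, but if you're curious about the underlying idea, here's a minimal dart-throwing sketch in plain Python (the function name and parameters are my own, not anything from Blender's API): accept a random candidate point only if it is at least the minimum distance from every point kept so far.

```python
import math
import random

def poisson_disk_points(n_tries, min_dist, size=10.0, seed=0):
    """Naive dart-throwing Poisson-disk sampling: accept a random
    candidate only if it is at least min_dist away from every
    previously accepted point."""
    rng = random.Random(seed)
    points = []
    for _ in range(n_tries):
        candidate = (rng.uniform(0, size), rng.uniform(0, size))
        if all(math.dist(candidate, p) >= min_dist for p in points):
            points.append(candidate)
    return points

pts = poisson_disk_points(2000, min_dist=1.0)
```

Production samplers use spatial acceleration structures to avoid the O(n²) distance checks, but the accept/reject rule that guarantees the spacing is the same.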
You can add the rotation nodes at the start instead of after every Point Separate.
There is a node randomizing the scale with min and max both set to 0.1, is that intentional?
Buildings2.blend (1.2 MB)
Thanks to your help, I’ve been able to get total control on my city generation. Thank you once again.
For some reason, the rotation doesn’t work on the tall buildings. If you change the radians node value, you can see that 90% of the tall buildings are not affected. So I just placed the rotation nodes at the end of the script instead, per collection, and it’s working great.
To answer one of your previous questions: I did put a tiny amount in the scale attribute because my buildings were enormous once distributed.

I also discovered that the geo nodes take the object’s transform data (including the delta) into account too, and it’s very annoying. I don’t know if it’s a bug or a feature. If I make a building in Blender and export it as OBJ to texture it in Substance, when it comes back in it has a 90° rotation on it, and all my buildings end up rotated when I use them in my script. If I apply the transformations, the rotation moves to the delta. So even if my building rotations are all at 0, the buildings are still rotated (because the delta is considered too). The only way to fix it is to go into edit mode and rotate the polygons themselves.

I really wish Blender supported working Y-up, because we constantly have to deal with rotation issues when going through other software like Maya, Houdini and Nuke, which all work Y-up.
Idk if it can help, but when you export your models in OBJ, FBX, COLLADA, etc… you can specify which axis is “up” in the export dialogue.
You can then play with the export settings in Blender AND the other software in the pipeline until you get the right setup.
Cartesian system since the 17th century → Z
Einstein → Z
3d Print → Z
Cryengine → Z
Unreal → Z
Blender → Z
Maya → Y
3DS Max → Z
Modo → Y
ZBrush → Y
Sketchup → Z
In fact, even Autodesk is not consistent with the up axis across its products, and game engines / low-level APIs use different conventions too.
Exporting Y-up is not a problem. The problem is when you import back: your model will be in exactly the same place but will have a 90-degree rotation in X. So if you have an expression on that object, it’s not going to work. If you apply the transformation, that 90° now appears in the delta, and if you use geometry nodes, you still have the same rotation problem.
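Since Blender is Z-up and Maya/OBJ-style pipelines are Y-up, that 90° is just a rotation about X. A tiny sketch of the coordinate remap in plain Python (the function name is mine, for illustration only):

```python
def y_up_to_z_up(p):
    """Rotate +90 degrees about X: a Y-up point (x, y, z) becomes
    (x, -z, y), so the Y-up 'up' axis (0, 1, 0) lands exactly on
    Blender's Z axis (0, 0, 1)."""
    x, y, z = p
    return (x, -z, y)

# the Y-up vertical maps onto Blender's Z axis
assert y_up_to_z_up((0.0, 1.0, 0.0)) == (0.0, 0.0, 1.0)
```

Baking this remap into the vertex positions themselves (which is effectively what rotating in edit mode does) is what leaves both the object transform and the delta transform clean.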
Maya has the option to use Z-up or Y-up. In my case, I work in the film industry and we are all Y-up. Gaming industry: Z-up; VFX: Y-up. In compositing, we use zDepth and zDefocus. There’s no such thing as a yDepth or yDefocus. And it’s not just geometry: the normal maps, motion vectors and point position passes won’t work in Nuke because they come out Z-up. And in the film industry, it’s 99.9% Nuke. I did a clip on my YouTube channel about that.
Glad to help! Your city looks great.
I noticed something weird with the rotation of the tall buildings but thought it was just because they were rarer.
I don’t generally work with other software, so I’m not the right person for that problem.
Bob, first of all I love your tutorials.
Secondly, YES, my main gripe with Blender is that it doesn’t have a preference for Y-up. Coming from Houdini, this is a silly little thing that drives me insane (that and the fact that the Delete key doesn’t delete!!!).
Can you share your building set up blend file (sans buildings of course)? I’m actually getting ready to do a short Houdini/Blender presentation on my channel about the new Geo Nodes and I would love to show my Houdini peeps how to properly set up a city like you have.
In my tests, the Blender instancing is nowhere near as responsive as in Houdini. In Houdini I can easily instance thousands of heavy geometry objects and the Viewport is very responsive. In Blender it seems like just a few dozens will make it feel sluggish. Are you running into a similar issue on your end?
Thanks for the comments! You can remap the Delete key if you want. My entire keyboard has been remapped because the default settings were driving me nuts (23 years of Maya muscle memory…). For the file, just look a few posts up; you will find Buildings2.blend. I’ve instanced thousands of buildings in my city (the image doesn’t show it all but it’s much bigger) and had no issues.
Thanks. It looks even better after compositing. I will make a clip on my YouTube channel about it but it may take a while because I have to wait to get the permission. And we are still working on it.
Cool…so you haven’t made additional tweaks to the Blender file already posted (I mean besides adding in your own geometry)?
I just put the rotations at the end (per building size, so copied three times) instead of at the beginning.
If you are using an older version of Blender (most probably 2.83 LTS), the Delete key does delete… in the viewport, but not in the outliner.
Newer versions of Blender don’t have this “inconvenience” and the Delete key works as “expected”.
Ok, so it turns out that I completely forgot about the “attribute compare” node, which allows “greater than” comparisons in 2.92. Also, the tall building rotation seems to be fixed somehow.
Buildings2.blend (1.2 MB)
Ok…if any of you would like to continue to play, here’s a challenge that I’m curious to see if there is a solution.
Let’s say that you want to limit the general region of where the tall buildings are. In a city for instance, there is bound to be a higher concentration of skyscrapers in the middle section, and then the buildings around it and on the periphery would get shorter as office buildings and high-rises give way to smaller office buildings and warehouses to eventually single family homes and shopping strips.
Is there a way to define a center of the plane, and then procedurally create a vertex map or gradient, and further bias tall buildings to primarily populate the central region, mid level buildings to primarily populate the area directly around it, and low buildings to primarily populate the periphery?
You should be able to generate a gradient using the Attribute Proximity node and use that for the distribution.
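In node terms you would feed that gradient into the density or selection for each building class. The equivalent logic, sketched in plain Python with arbitrary thresholds of my own choosing (not anything from Blender):

```python
import math

def building_category(point, center=(0.0, 0.0), radius=10.0):
    """Classify a point by its normalized distance from the city
    center: tall buildings near the middle, mid-rise around them,
    low buildings on the periphery."""
    t = min(math.dist(point, center) / radius, 1.0)  # 0 at center, 1 at edge
    if t < 0.3:
        return "tall"
    if t < 0.65:
        return "mid"
    return "low"
```

In Geometry Nodes, the same falloff could come from an Attribute Proximity against an empty placed at the city center, compared against thresholds to drive each Point Separate.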
Paint a weight map on your object and set the name of your weight map in the density field.
Oh Bob, but that is so non-procedural. I’m a Houdini guy!
Love your most recent tutorials. Keep it up!