Geometry Nodes

I do agree it was a harsh bump in the road, but sometimes you do not know about the fundamental user issues of a design until you actually begin to put it into practice (i.e. the original design being overly technical compared to how people expect node trees to work in Blender).

While being able to follow tutorials several years later has an appeal, there is a risk in a development model that forces a design to be ‘locked in’ for up to 10 years once the initial commit is made (which would force the devs to start applying band-aids and workarounds once a limit of the design is reached).

With fields, the node workflow is fairly consistent with what we have for Cycles and the Compositor, which I think has proven worth the change by now.

3 Likes

Unfortunately it’s a one-way thing (GN → shaders)… the best you can do is transfer the W and scale values, reuse coordinates as attributes, and try to match the generators.

Kinda like this (using the bounding box to generate “object coordinates”):
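In plain numbers, that coordinate part is just a min/max normalization. Here is a minimal numpy sketch of the math (one way to wire it in GN is Bounding Box Min/Max into a vector Map Range node; the helper name is mine):

```python
import numpy as np

def bbox_object_coords(positions):
    # Hypothetical helper: normalize positions into 0..1 "object
    # coordinates" from the bounding box -- the same math as feeding
    # Bounding Box Min/Max into a Map Range (Vector) node.
    lo = positions.min(axis=0)
    hi = positions.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)  # avoid divide-by-zero on flat axes
    return (positions - lo) / span

pts = np.array([[0., 0., 0.], [2., 1., 0.], [4., 2., 0.]])
print(bbox_object_coords(pts))  # each axis lands in 0..1
```

Stored as a named attribute, both the GN and shader copies of the texture can then sample the same coordinate space.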

Would be cool if you could have reusable “texture” node groups that you could use in both the shader editor and the GN editor.

3 Likes

Hmm, so these textures can’t be fed into the geo node tree? Those are the ones I’m talking about, not shaders.

That’s for the deprecated “Sample Texture” node, which no longer exists… I haven’t managed to find any node that accepts texture inputs in Blender 3.0… :man_shrugging:
“Textures” have been marked as deprecated for a couple of years now, so it will be interesting to see how things go.

1 Like

You can make it work with legacy nodes enabled

2 Likes

I don’t know why I was under the impression it was possible to use the same proc textures for GN, modifiers and shaders…

Well, we need that functionality: there should be a way to create a procedural texture once and use it in any part of the program that accepts textures as input.

6 Likes

Agreed. My biggest gripe from 2.8 onward was this dangling “deprecated” textures thing, which only works for some things, and then the massive disconnect with “materials”: you could no longer use them for displacement, you had to use textures, yet the material displacement settings had nothing to do with those… huh?? The gripe list goes on.

Consolidating the two things seems really important to me.

Edit: Even though I would like to see something “better” replace textures, I just want to say I’m already overcoming a lot of my gripes with GN in version 3.0, so I don’t want to sound ungrateful for all the progress that’s already happened. :+1:

3 Likes

Yes, this has been planned for some time. I can’t find a recent task on d.b.o though… I’m not sure whether this is supposed to become a separate “texture nodes” environment or not.

1 Like

Ok, for my current project I’ll just generate the texture somewhere else and bring it into Blender as an image sequence (though having to do this every time I need to use the same generated textures in different places will become very annoying very quickly :sweat_smile:)

One more question: can I delay, or otherwise control, the animation of instances from GN? Let’s say I have an object with an animated shapekey, and I need that animation to play only when the instance reaches a certain scale, for example.

What is missing is someone making playlists of tutorials that are valid for a given version of Blender; that would ease the learning curve.

1 Like

This is the frustration that people are talking about.

Realtime compositor nodes should be able to generate textures, I bet.

(by rendering multiple cameras / sampling depth etc. you could even dump out normals)

1 Like

The setup I shared before can be animated:
[animTexture animation]
Here I’m keyframing the W value… provided the coordinates, generator scale, and Ws match, it is doable… the issue is just that you’ll have duplicate “texture” nodes in the GN editor and the shader editor, which is a pain.
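If keeping the duplicated nodes in sync by hand gets tedious, a small script can keyframe both copies at once. A rough sketch, assuming hypothetical group/material/node names and a noise texture set to 4D:

```python
import bpy

# Hypothetical names: the same noise texture duplicated in a GN group
# and in a material's node tree.
trees = (bpy.data.node_groups["TextureGN"],
         bpy.data.materials["TextureMat"].node_tree)

for tree in trees:
    # The "W" input only exists when the noise texture is set to 4D.
    w = tree.nodes["Noise Texture"].inputs["W"]
    w.default_value = 0.0
    w.keyframe_insert("default_value", frame=1)
    w.default_value = 5.0
    w.keyframe_insert("default_value", frame=120)
```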

2 Likes

Right now Blender lacks solvers, so you can’t have something play when something else happens. The best you can do is map two things together, value to value, regardless of what you could call “events”. It’s indeed a big limitation, but work is underway if I understand correctly. :slight_smile:

Also, I don’t think you can control shapekeys from inside geonodes, but what you can do is copy your shapekeyed object, bring it into the geonodes tree (Object Info node), capture its position attribute, and pipe that attribute into a Set Position node on your main (identical) object. From there you can modulate the position vector (vector multiply) to control the shapekey strength, and drive it from whatever place you like in the geonodes tree. This is how I would go about it.
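In plain math, that setup boils down to a scale-driven blend between the two sets of positions. A minimal sketch of the idea, with made-up scale thresholds:

```python
import numpy as np

def driven_shapekey(base_pos, shaped_pos, instance_scale,
                    scale_min=1.0, scale_max=2.0):
    # Blend from the base positions toward the shapekeyed positions as
    # the instance scale sweeps [scale_min, scale_max] -- the math behind
    # the Object Info -> capture position -> Set Position setup above.
    # scale_min/scale_max are hypothetical trigger values.
    s = np.clip((instance_scale - scale_min) / (scale_max - scale_min),
                0.0, 1.0)
    return base_pos + s * (shaped_pos - base_pos)
```

In the node tree, the factor s would come from wherever you read the instance scale, mapped through a Map Range node.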

6 Likes

I wonder if one could use Python to translate a shader node group to geometry nodes, and update it every time a change is made…

I don’t have much experience with scripting. Would this be possible?

EDIT:
Just thought this through a bit, and it doesn’t make much sense. Certain texture coordinates would be broken, because some of them have no geometry nodes equivalent. How would you go about getting a specific UV map, since they are acquired differently in the two systems?
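For what it’s worth, a lot of texture/math nodes share the same bl_idname between the two editors, so a stop-gap script could copy just those and skip the rest. A rough sketch, with made-up group names; nodes with no GN counterpart (Texture Coordinate being the obvious one) simply get skipped, which is exactly the limitation above:

```python
import bpy

src = bpy.data.node_groups["MyShaderTexture"]  # hypothetical source group
dst = bpy.data.node_groups.new(src.name + "_GN", 'GeometryNodeTree')

# Shader nodes that Geometry Nodes can also evaluate (shared bl_idnames).
SHARED = {'ShaderNodeMath', 'ShaderNodeVectorMath', 'ShaderNodeMapRange',
          'ShaderNodeMixRGB', 'ShaderNodeValToRGB', 'ShaderNodeTexNoise',
          'ShaderNodeTexVoronoi', 'ShaderNodeTexWave', 'ShaderNodeTexGradient'}

remap = {}
for node in src.nodes:
    if node.bl_idname not in SHARED:
        print("no GN equivalent, skipping:", node.name)  # e.g. Texture Coordinate
        continue
    copy = dst.nodes.new(node.bl_idname)
    copy.location = node.location
    for attr in ("operation", "blend_type", "interpolation_type"):
        if hasattr(node, attr):             # copy enum settings where present
            setattr(copy, attr, getattr(node, attr))
    for i, sock in enumerate(node.inputs):  # copy unlinked input values
        if hasattr(sock, "default_value"):
            copy.inputs[i].default_value = sock.default_value
    remap[node] = copy

# Re-create links where both endpoints were copied.
for link in src.links:
    a, b = link.from_node, link.to_node
    if a in remap and b in remap:
        i_out = list(a.outputs).index(link.from_socket)
        i_in = list(b.inputs).index(link.to_socket)
        dst.links.new(remap[a].outputs[i_out], remap[b].inputs[i_in])
```

Group inputs/outputs, color-ramp stops and the coordinate sources would still need manual mapping; this only shows the skeleton.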

2 Likes

I think it’s a good idea and it would make a good stop-gap.

Similar to the setup I posted above: I built the coordinate system in GN using the bounding box, then transferred it to the shader to use it there… something similar can be done with the curve parameter to map UVs… so provided you have a clear way of linking coordinate systems, the translation between systems should work…

I was wondering if you could separate out the concept of generator nodes that you can reuse everywhere… something that acts like a “texture” node group containing a subset of GN and shader nodes, which takes in a 4D coordinate plus custom inputs and outputs… and is then designed to be reused in future grease-pencil, compositing, or whatever nodes.

1 Like

As far as I know they are in the design stage right now

Sorry, I know it’s like 2 weeks later and there is a fix for that particular case, but in going through the torture of supporting cyclic multi-splines in my own GN groups I’ve had an insight into why this bug existed in the first place, why I think it could be fixed even further, and why that may be tricky:

It all has to do with the curve parameter.

The suggested general fix would be an option/node that gives any curve marked as cyclic a generated end point that is “linked/aliased” to the start point of the curve. The curve parameter would then end where it starts, and subsequent extrusions etc. would face in a consistent direction, but discontinuities in tilt, radius, etc. would become more apparent.

Things like Curve to Mesh, Sample Curve, Set Position, end-point selectors, etc. would need to know that the curve is “linked-cyclic” and deal with it in their own way, so that adds complexity to the solution… plus I’m sure there is more system impact I’m not aware of.

I currently have enough working parts to attempt a GN-group workaround for cyclic curve parameters, but I don’t expect it to be a general fix… more likely it will be clumsy to work with and have limited application outside of my own GN-group internals.

Hopefully this gave someone else some insight into some other cyclic-curve-related glitch or bug they may be experiencing.
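To make the parameter issue concrete, here is a small numpy sketch of the factor on a made-up cyclic poly spline (the four corners of a unit square) and what the aliased end point would change:

```python
import numpy as np

# A made-up cyclic poly spline: the four corners of a unit square.
points = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]], float)

closed = np.vstack([points, points[:1]])  # the implicit closing segment
seg = np.linalg.norm(np.diff(closed, axis=0), axis=1)

# Curve parameter (factor) at each control point.
factor = np.concatenate([[0.0], np.cumsum(seg)[:-1]]) / seg.sum()
print(factor)  # [0.   0.25 0.5  0.75] -- the last point never reaches 1.0

# With an aliased end point appended, the parameter would run all the way
# to 1.0 and end exactly where it starts, keeping extrusion directions
# consistent across the seam (at the cost of exposing tilt/radius jumps).
```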

Edit: Hmmm, seems like the curve factor already does what I suggested, even in 3.0… just the tilt doing weird things then?

3 Likes

Hey, thanks for going into detail. Curves still confuse me. A lot of times I get into this hacky territory where I don’t know if that’s how it is supposed to behave or if it’s just me not seeing the obvious.
Hope this will be somewhat straightened out in the long run.

1 Like

How do I use face normals to align instances on points?

I’m using Dual Mesh to get a point in the middle of each face to put the instances on, but I also want to align the instances to the normals of the original faces.