Everything nodes

The problem is that Blender handles objects pretty badly if you go over a certain number. If you want plants and the like, or in fact any kind of object, to go into the millions, you'll have to use something like particles or dupliverts.
Unfortunately dupliverts are not very flexible and Blender's particles are terrible.
For some things you simply have to write a script that writes particles to cache files. Or you can write a script that creates an object with vertices or faces at specific locations/rotations and then use that object as an emitter.
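As a rough illustration of that second approach, here is a minimal bpy sketch; the point locations and the object/mesh names are just placeholders for whatever your own distribution logic produces:

```python
import bpy

# Hypothetical list of scatter point locations; in practice these would
# come from wherever your placement logic lives.
locations = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 2.0, 0.0)]

# Build a mesh that contains only loose vertices at those locations.
mesh = bpy.data.meshes.new("ScatterPoints")
mesh.from_pydata(locations, [], [])
mesh.update()

# Wrap it in an object and link it into the current collection.
# This object can then be used as a particle emitter or for vert instancing.
emitter = bpy.data.objects.new("ScatterEmitter", mesh)
bpy.context.collection.objects.link(emitter)
```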

1 Like

The problem is that Blender handles objects pretty badly if you go over a certain number.

I frequently ran into problems; this might explain it.
Show me I'm wrong, but it seems we are stuck with particles for creating a large-scale forest scene.

I hoped “everything nodes” would give us a more flexible tool to create and maintain nature scenes.

3 Likes

I think that is correct. However, for large-scale forests I find particles to be OK.
It is more problematic if you want large scenes with lots of objects that have to be arranged in a specific way. Blender's particle systems don't give you any tools - or at least not many useful ones - for arranging particles systematically.

3 Likes

For large scenes we also have collection instances. They can tame a lot of detail and complexity.

I do not know of any way to add a Particle Instance modifier to a collection instance. And using a collection instance inside another object's particle system does not work either.

But instancing by verts/faces does allow collection instances. Could Animation Nodes give some variation to these dupli-instances, to make them more useful for large-scale nature scenes?
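For what it's worth, this is roughly what that setup looks like in bpy; "Trees" and "ScatterEmitter" are placeholder names for an existing collection and a vertex-only emitter mesh:

```python
import bpy

# Placeholder names: an existing collection and a vertex-only mesh object.
trees = bpy.data.collections["Trees"]
emitter = bpy.data.objects["ScatterEmitter"]

# An empty that instances the whole collection.
instance = bpy.data.objects.new("TreesInstance", None)
instance.instance_type = 'COLLECTION'
instance.instance_collection = trees
bpy.context.collection.objects.link(instance)

# Parent the collection instance to the emitter and duplicate it on
# every vertex (use 'FACES' instead if the emitter has faces).
instance.parent = emitter
emitter.instance_type = 'VERTS'
```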

2 Likes

Yes, collection instances, especially combined with linking from other files, are great, as they allow you to render nearly unlimited geometry.
But unless I am missing something, they don't give you a way to easily distribute large amounts of objects/geometry.
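For reference, linking a collection from another file and dropping a single instance of it into the scene can be scripted like this; the library path and collection name are placeholders:

```python
import bpy

# Placeholders for whatever library file and collection you actually use.
library_path = "//assets/trees.blend"
collection_name = "Trees"

# Link (not append) the collection so the heavy geometry stays in the library file.
with bpy.data.libraries.load(library_path, link=True) as (data_from, data_to):
    data_to.collections = [collection_name]

# After the with-block, data_to.collections holds the linked datablocks.
linked = data_to.collections[0]

# Add one lightweight empty that instances the linked collection.
instance = bpy.data.objects.new(collection_name + "_instance", None)
instance.instance_type = 'COLLECTION'
instance.instance_collection = linked
bpy.context.collection.objects.link(instance)
```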

Now, using Animation Nodes to create an object which is then used as an emitter or for instancing by verts/faces sounds like an interesting approach worth trying.

One infuriating thing with particles is that it is actually possible to very precisely control which object is rendered on which particle. You do this by setting the system to render a group/collection and then activating “count”. You can then enter precisely which particles are to use which object. However, this is not easy to automate. You can write a Python script to do it, but due to the way Blender works internally it gets prohibitively slow when used with large amounts of particles.
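For completeness, a sketch of the kind of script meant here, assuming the 2.8x RNA names (`render_type='COLLECTION'`, `instance_collection`, `use_collection_count`, `instance_weights`) and placeholder object/collection names; as said above, this does not scale well once the particle counts get large:

```python
import bpy

# Placeholder emitter and collection; attribute names are the 2.8x RNA
# names as far as I know and may differ in other versions.
emitter = bpy.data.objects["ScatterEmitter"]
settings = emitter.particle_systems[0].settings

settings.render_type = 'COLLECTION'
settings.instance_collection = bpy.data.collections["Trees"]
settings.use_collection_count = True

# Each entry in instance_weights corresponds to one object of the
# collection; its count controls how many particles use that object.
for index, weight in enumerate(settings.instance_weights):
    weight.count = 10 if index == 0 else 1
```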

1 Like

I love Animation Nodes, and I have high confidence in Jacques and the BF team to deliver something great.

I really like an artist-friendly UI feature from Archimatix, where handles are provided in 3D space to manipulate parametric parameters. For example, when curves drive the creation of 3D shapes (through extrusions or deforms), selecting any instance of that shape renders “curve parameter handles” aligned to where they affect it, and lets you edit them directly in the same 3D space as the shape they affect.

There are lots of demos of this effect. Here are two examples, showing both curve changes and geometric changes.

Contrast this with Blender's extrusion modifiers, where the input curve is often dangling in a random place, shoved off to the side to keep it out of the way. Or with Houdini, where you might edit the curve in an isolated workspace that doesn't show the result.

I’m also looking forward to a day when scene-level nodes like AN are not hamstrung by the object-instancer limitations. I’d like to be able to instance collections, parent-child trees, and the object results of other node programs. Then geometry-producing node programs can become truly reusable blocks for 3D parametric generation, VSE titles, whatever.

4 Likes

Sounds like it will have basically the same workflow as Houdini; I’m even more hyped about this project :smiley:

5 Likes

We already have different mesh data structures in Blender. Afaik, they do not fulfill the requirements I have for a data structure that can be used successfully in a node-based context.

Nevertheless, it would be good, if a conversion could be avoided as long as possible.

This one is a bit sad:

In practice that would mean that the user would have to create and manage separate Simulation Worlds. […] Every simulation world has a simulation description. […]
An active object might be modified by the simulation, while a passive object will never be modified. Every object can only be active in at most one simulation world.

This is a bit of a pity, for it means (should things be implemented according to Jacques' document as is) there won't be any inter-simulation interaction (for lack of a better term), as I understand it.
E.g., no rigid bodies being pushed around or receiving buoyancy forces from a fluid sim.

greetings, Kologe

1 Like

That would depend on the exact sort of solvers that are to be implemented rather than the design of everything nodes, as far as I understand.

Well, I guess a bit of interaction is possible this way; however, it's a one-way interaction with some severe limitations.

For example, if you knew the surface of a fluid (active in the fluid simulation), you could attach a rigid body spring force to it in the next simulation world (where the surface would be passive). An object with mass can get pushed around in this rigid body world, yet it will not influence anything within the fluid sim. So the fluid is not pushed back by the rigid body object.
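Just to make the idea concrete, this is roughly what the equivalent one-way setup looks like with today's rigid body constraints in bpy, assuming the fluid surface has already been meshed into a proxy object; all object names and the spring settings are placeholders:

```python
import bpy

# Placeholder objects: a meshed fluid-surface proxy and a floating body.
surface = bpy.data.objects["FluidSurfaceProxy"]
floater = bpy.data.objects["Buoy"]

# The proxy only pushes and is never pushed back (passive),
# while the floater is a normal active rigid body.
for obj, body_type in ((surface, 'PASSIVE'), (floater, 'ACTIVE')):
    bpy.context.view_layer.objects.active = obj
    bpy.ops.rigidbody.object_add()
    obj.rigid_body.type = body_type

# An empty carrying a generic spring constraint between the two.
bpy.ops.object.empty_add(location=floater.location)
anchor = bpy.context.active_object
bpy.ops.rigidbody.constraint_add(type='GENERIC_SPRING')
constraint = anchor.rigid_body_constraint
constraint.object1 = surface
constraint.object2 = floater
constraint.use_spring_z = True
constraint.spring_stiffness_z = 50.0  # arbitrary example value
```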

Can Maya or Houdini do this? Do they have unified physics (rigid + soft bodies + fluid + cloth + …)?

This is not what the document is saying. Objects that do interact have to be in the same simulation world. Objects that are in different worlds can only have unidirectional dependencies. To support complex interactions between e.g. rigid bodies and fluids, we “just” have to find the right solver. However, that is not the topic of the proposal.

Also see https://devtalk.blender.org/t/regarding-the-simulation-architecture-proposal/6926/3.

Right, until there are more powerful solvers in Blender, these kinds of workarounds are necessary.

Afaik even Houdini TDs do this. Either the interacting object is big or ‘hero’ enough to warrant keyframe animation, or it’s just a ‘slave’ of the fluid simulation, optionally with some re-sim on top of the existing fluid to account for splashes the object may cause, etc. Take that with a grain of salt though, because I’m no Houdini TD. As far as I understand, unified solvers are still pretty uncommon. I mean, there’s Vellum in Houdini, but it’s very new; not sure what came before that…? And it only handles grains, cloth, soft bodies… I don’t think it has anything to do with fluids.

Yes, Houdini has that kind of interaction. With the new Vellum you can hit a wall with a bag filled with sand hanging from a rope… Whether it also works with fluids, I don’t know.

2 Likes

Yes it can: https://vimeo.com/295779592

1 Like

If you use only particles, you can simulate everything together. But converting a whole scene to particles would be quite the performance hit :stuck_out_tongue:

Only simulate parent particles and leave child particles to act procedurally where applicable? A little fakery can go a long way.
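Something along these lines, as a minimal sketch; the emitter name is a placeholder and the child-particle attribute names (`child_type`, `child_nbr`, `rendered_child_count`) are the 2.8x RNA names as far as I know:

```python
import bpy

# Placeholder emitter; attribute names may differ between Blender versions.
settings = bpy.data.objects["ScatterEmitter"].particle_systems[0].settings

# Keep the number of simulated (parent) particles low...
settings.count = 1000

# ...and let interpolated children fill in the density procedurally.
settings.child_type = 'INTERPOLATED'
settings.child_nbr = 10              # children per parent in the viewport
settings.rendered_child_count = 100  # children per parent at render time
```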

It’s a lot like the visual blocks we discussed in this same thread long ago :grin: