Will the ENTIRE WORKFLOW of Blender change after Everything Nodes? (about which I know 0, btw).
What I mean to say is, if I’m just starting to learn Blender, should I Not? Will all the current tuts be rendered obsolete/invalid after it hits?
Thx.
I am not totally certain whether they have worked out the UX completely yet, but my guess is that it will remain integrated.
Think of it this way. When you set up materials in Cycles, you are accessing certain channels. Try this.
Open a scene and switch to Cycles. In the Material panel for the default cube, click "Use Nodes". If you open a Node Editor window, you will see it added two basic nodes. Now you can add something like a texture input in the Material panel (not in the Node Editor window), and the node will automatically appear and be connected. You can keep doing this to a degree, but to get more control and do more with the material, you go into the Node Editor and start building more advanced node trees for the tricky things.
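For what it's worth, you can watch the same thing happen from the Python console. Here is a minimal sketch, assuming a default 2.8x scene where enabling nodes creates a Principled BSDF and a Material Output node (the node and socket names below are the 2.8x defaults):

```python
import bpy

# Grab the default cube's material (create one if it has none).
obj = bpy.data.objects["Cube"]
mat = obj.active_material or bpy.data.materials.new("CubeMaterial")
if obj.active_material is None:
    obj.data.materials.append(mat)

# This is what the "Use Nodes" button does: it enables the node tree,
# and Blender adds a shader node plus a Material Output node.
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links
print([n.bl_idname for n in nodes])  # e.g. Principled BSDF + Material Output

# Adding a texture from the Material panel effectively does this behind
# the scenes: create an Image Texture node and wire it into the shader.
tex = nodes.new("ShaderNodeTexImage")
bsdf = nodes.get("Principled BSDF")
if bsdf is not None:
    links.new(tex.outputs["Color"], bsdf.inputs["Base Color"])
```

Either way you end up with the same node tree; the panel is just a front end for it.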
Maya works like this for everything, depending on whether you have History on or off. Basically, everything you do on the front end in the interface and panels is reflected in nodes. This means that you can keep working as you would, using normal panels and menus, and the nodes get quietly connected and routed in the background. Then, for more control and advanced routing, you can dig into the node editor.
For this reason, Maya has had quite advanced "procedural" modeling capability for a number of years; it's just that people mostly did not know about it or use it.
I am going to assume it will work something like this.
Of course you should continue learning it. Let's say you're using a lot of modifiers in your workflow: with Everything Nodes the workflow will be the same, but you'll have a different UX/UI (connecting nodes instead of having a list), and that will improve your workflow in many ways.
You should probably familiarize yourself with nodes if you’re not already.
Right. INCREDIBLE, AMAZING stuff! (I've never used Materials in B btw.) But - does this mean (it's called EVERYTHING nodes) that now all of modelling, texturing, animating will be done "Node-y"? THAT was the intent of my question, and what it sounds like to me, just… from the name.
Yeah, procedural stuff, to me, is the COOLEST shit in the world. Even better than "node-y" would be node-y COMBINED with raw code like Python or whatever Maya's got (MEL, right?). ABSOLUTELY UNBELIEVABLE!! So… will B end up outdoing even Maya?
…and AFTER that, do we even bother with the mouse in the viewport any more? This is what I want to know. It's never really felt "artistic" or like drawing to me… you know, as intuitive as pen or pencil on paper. Otherwise they wouldn't need degree courses to teach this stuff.
At the end of it all, I would assume though, that conventional modelling Will, Still be needed - Everything Nodes will not Do Away With It ALL? Can anyone tell me?
Seems people are really getting ahead of themselves. Everything Nodes will require figuring out proper data structures and how to connect them, which may mean changing some fundamentals inside Blender. That can take a long time.
What we currently have is particle nodes, as far as I’m aware. That’s a long way from Houdini style mesh and volume nodes.
What was mentioned was: "Probably to have things really usable, we would need a way to have access to some mesh and curve data."
That will be a really long process. That is a WIP that will take years.
So, if your intention is to use Blender next year, modifier nodes will probably not be there yet.
But that does not mean that everything you have learned will disappear when nodes are there.
You will have to learn how to arrange bricks. But you will not need to relearn the meaning of those bricks.
Anyway, the 2.8 series brings a new UI design that is partially accomplished and still being confirmed. Several things have been completely remodeled between 2.80 and 2.81, like the File Browser, and that will continue over several releases.
So, if you decide to test 2.81 now, you have been warned that some areas will be a lot different in 2.82.
Yes. You will still need to create vertex groups, face maps, make selections and hide/reveal them.
When the mesh is an organic, incredibly complex character created from an import, there is little chance that you can figure out how to modify it using only nodes and numerical values.
Developers are creating gizmos for lots of tools, to make edit mode tools usable with tablets. It should be closer to using a pen now than it was in 2.79.
Houdini evolved to integrate gizmos and traditional ways of calling polygonal modeling tools.
Poly modeling and procedural modeling are complementary like sculpting and poly modeling.
If you take the time to search for procedural modeling add-ons, you can already find some for doing procedural modeling in Blender.
It will take a while for them to get to it, for sure, and to work it all out. What defines Everything Nodes is that they have to avoid making it a set of separate systems. To do that they will have to standardize all of the systems, and in doing so it would be logical to look at Maya, and even better, Houdini.
In Houdini you can model and do all of your work through the familiar interface, but underneath, nodes are created. So at any time you have two ways to access and edit your assets; it just depends on how much fine control and customization you want.
It would be best at this point to answer the OP simply: it will not likely change how you work with Blender in a fundamental way from what you are familiar with. So yes, keep learning.
If that changes, so be it. But I’d say it is a good bet it won’t - too drastically.
The idea of using nodes is flexibility and more logic, so that the systems can be controlled in a smart, customized way. Considering that some functionality is currently completely linear or a black box (working in a WYSIWYG fashion), getting nodes gives you a lot of power.
For example, modifiers are currently totally linear and thus very limited in how they can be used. Particles are also quite a black box, because the only thing you can do is change the simulation properties; you cannot get inside the particle system and make it do whatever you want.
As of now, one example I can think of is having a Subdivision Surface modifier on a mesh (or, say, on 500 meshes across the entire scene), but with the modifier set up cleverly so that the level of detail depends on the camera distance: only models close enough to the camera get high detail, while those further away can be set to 0.
This example can be written today in Python as an operator, but the entire point is not to use any custom coding at all and to rely on the node system instead.
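For reference, here is a rough sketch of what such an operator could look like today. The distance thresholds, the subdivision levels, and the decision to touch every Subdivision Surface modifier in the scene are my own assumptions:

```python
import bpy

class OBJECT_OT_subsurf_lod(bpy.types.Operator):
    """Set Subdivision Surface levels from the distance to the scene camera."""
    bl_idname = "object.subsurf_lod"
    bl_label = "Subsurf LOD by Camera Distance"

    def execute(self, context):
        cam = context.scene.camera
        if cam is None:
            self.report({'ERROR'}, "Scene has no camera")
            return {'CANCELLED'}
        cam_loc = cam.matrix_world.translation
        for obj in context.scene.objects:
            dist = (obj.matrix_world.translation - cam_loc).length
            for mod in obj.modifiers:
                if mod.type != 'SUBSURF':
                    continue
                # Arbitrary thresholds: close objects get detail, far ones get none.
                if dist < 10.0:
                    mod.levels = 3
                elif dist < 30.0:
                    mod.levels = 1
                else:
                    mod.levels = 0
        return {'FINISHED'}

def register():
    bpy.utils.register_class(OBJECT_OT_subsurf_lod)

if __name__ == "__main__":
    register()
```

With Everything Nodes, the hope is that this kind of logic lives in a node tree that updates on its own, instead of an operator you have to remember to run.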
No, not really. Everything Nodes will expose the existing functionality as nodes. The existing functionality of Blender will have to be redesigned to make it easier to access from nodes, but that will only happen to an extent.
Nodes are a terrible UI for handling complex structures, which is why they have fallen from grace as a visual coding paradigm, but they are still a very good choice for much simpler structures.
It's doubtful that modeling through nodes is a wise strategy; modeling tends to create massive, complex structures. But for animation systems and physics, nodes are ideal for dynamic creation of assets. I saw a recent presentation of Houdini, and even though the VFX side was very impressive, the modeling workflow with nodes was a very sad experience. But it is always a good choice to have an alternative.
Nodes cannot replace a well-designed UI, scripting, and block-based visual coding, but they are ideal for semi-complex structures as long as you can avoid the spaghetti monster.
So no, it won't change the workflow radically; also, this is a huge project, so any change will be very gradual. AFAIK the focus right now is just particle systems.
So don't throw your existing modeling tutorials into the garbage bin just yet.
Fun fact: the correct terminology is "flow graphs", and they are called that because back in the old days coders would use flow charts/graphs/diagrams to design the flow of execution on paper; computer time was so expensive that being well prepared was paramount. The technique goes as far back as the 1920s, and it was during the 1960s, before the invention of many modern textual programming languages, that visual programming languages saw a rapid rise in popularity once early GUIs became a possibility. However, maintaining visual coding languages proved much more of a challenge because of the inherent complexities of GUI design, so textual programming languages quickly became the norm. Modern visual coding languages that follow the lego-block paradigm have nonetheless seen unprecedented growth, the notable example being Scratch, a visual block programming language for kids with tens of millions of users worldwide, making it by a vast margin the only visual coding language able to compete with the big boys of textual coding.
I just hope the transition will make it possible to code custom modifiers, while still giving the option to use them as a "stack" (like actual modifiers) too.
And imagine using a single modifier made from a combo of other modifiers, which can be shared with other users like we already do with custom material node groups.
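That is already how material node groups work, and it is easy to imagine modifier node groups getting the same treatment. As a point of comparison, here is a small sketch of how a shareable node group is built through Python today; the names are made up and the interface API shown is the 2.8x one:

```python
import bpy

# A reusable material node group is just a named datablock that can be
# appended or linked from another .blend file. A hypothetical "compound
# modifier" group could be shared the same way.
group = bpy.data.node_groups.new("MyReusableGroup", "ShaderNodeTree")
group_in = group.nodes.new("NodeGroupInput")
group_out = group.nodes.new("NodeGroupOutput")
group.inputs.new("NodeSocketColor", "Color In")
group.outputs.new("NodeSocketColor", "Color Out")
# A simple pass-through link, just to make the group usable.
group.links.new(group_in.outputs["Color In"], group_out.inputs["Color Out"])

# Drop it into any material's node tree as a single node.
mat = bpy.data.materials.new("SharedGroupDemo")
mat.use_nodes = True
node = mat.node_tree.nodes.new("ShaderNodeGroup")
node.node_tree = group
```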
Modifiers are tricky because mesh data can be rather huge, and Python does not do well with heavy computation.
I imagine some accessibility with nodes will be provided by breaking modifiers down into simpler ones, but you still won't be able to make a true modifier in nodes or Python any more than you can make a true shader that way. In those cases, because of the very heavy computation, it may be better to use C. GPUs do offer geometry modification through shaders, which may offer a quick way to get things done in C without messing with the Blender source, but I seriously doubt Python and nodes will be replacing hardcore advanced modifiers like Subdivision Surface any time soon.
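As an aside on the performance point: the usual workaround today is to batch mesh access through foreach_get/foreach_set and numpy rather than looping over vertices in Python. A toy sketch, assuming the active object is a mesh:

```python
import bpy
import numpy as np

# Pull all vertex coordinates out in one call, do the math in numpy,
# and push them back in one call. This is far faster than a per-vertex
# Python loop, but still nowhere near what a C modifier does internally.
mesh = bpy.context.active_object.data
coords = np.empty(len(mesh.vertices) * 3, dtype=np.float32)
mesh.vertices.foreach_get("co", coords)
coords = coords.reshape(-1, 3)

coords[:, 2] += 0.1 * np.sin(coords[:, 0])  # toy displacement along Z

mesh.vertices.foreach_set("co", coords.ravel())
mesh.update()
```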
Of course, the story may be very different on low-poly models. In any case, don't expect anything close to that before 2021; the focus right now for Blender is physics, which needs an overhaul and is also an area that Everything Nodes will be focusing on.
This would only apply if Everything Nodes tries to be nothing more than programming in a visual form (as opposed to something like Cycles nodes, which according to Brecht was designed for artists and was never intended to try to emulate such).
In my opinion, node UIs work best when they don't try to be programming and work as a mid- to high-level interface (even though there could be some lower-level nodes in the mix). I do recall William saying he wants the base of Everything Nodes to be low-level, but with a wide selection of higher-level group nodes on offer.
You mention a Scratch-like UI; the reason that works is because, by the looks of things, it's just scripting with colorful graphics, easier visualization, higher-level functions, and better safeguards against mistakes. I've only seen images, so it might be more than that.
I’m yet again going to make use of the good ol’ Houdini example, because it is a damn good one. Houdini has a language that you can use to manipulate mesh data (and more actually, as it’s being expanded in each version to handle other data types) and performance is really good. I believe it is compiled on the fly, but don’t quote me on this. In any case it is as usual the example to follow.
Mostly because this approaches modeling as a "system" where you produce the result from rules rather than pumping vertices manually, wherever possible. Indeed, certain things can be done procedurally (especially when it comes to scaling up to large scenes) and others manually (creating fine-tuned, custom content when needed).