Hard Mesh addon for Maya

New addon to generate excellent booleans for Maya
Check this link. Impressive.

I think the only thing Blender really needs to get close to this (out of the box) is the ability to auto-merge vertices that would otherwise create overlaps when a bevel modifier is used. Blender already makes good bevels, but it just needs that extra push in terms of functionality.
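As a minimal sketch of what that cleanup looks like done by hand today (assuming the 2.8+ Python API; the object name "Cube" and the merge distance are placeholders, not anything official):

```python
import bpy
import bmesh

# Merge near-coincident vertices left behind by a boolean so a
# subsequent Bevel modifier does not produce overlapping geometry.
obj = bpy.data.objects["Cube"]  # placeholder object name

bm = bmesh.new()
bm.from_mesh(obj.data)
bmesh.ops.remove_doubles(bm, verts=bm.verts, dist=0.001)  # arbitrary threshold
bm.to_mesh(obj.data)
bm.free()

# Bevel on top of the cleaned-up mesh.
bevel = obj.modifiers.new(name="Bevel", type='BEVEL')
bevel.width = 0.02
bevel.segments = 3
```

An auto-merge option would essentially fold that `remove_doubles` step into the modifier stack itself.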

Of course, if we then start talking about dozens of objects, that may not be possible in a convenient and quick way until the modifiers go node-based.

Indeed, I miss a good bevel so much.

This is a great development. Is there anyone working on things like this and MeshFusion for Blender? Having to plan for operations like this is one of the main reasons I still find modeling to be quite stiff when dealing with more complex shapes.

Speaking of MeshFusion, this is similar, but different: https://www.youtube.com/watch?v=iHd5Tl6AnPg

I agree that Blender should have WELDING as a modifier. It is very useful after Booleans (it removes artifacts), after the Screw modifier (e.g. creating a dynamic cylinder with Screw and welding the loose parts), as well as in some other cases.
Thanks to help and guidance from Campbell and Luca, I slapped together something that under normal circumstances I would be ashamed to show: http://cgstrive.com/SS/2_3222017__general__08a8.jpg It’s a hijacked Build modifier, as I did not want to create the overhead of working with 10 files and then maintaining them. It’s not pretty, but it is functional. Perhaps it is useful to someone. I hope to see a native, elegant solution integrated, one that merges to the center rather than to the nearest point.
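For anyone who wants to try the Screw case mentioned above by hand, a rough sketch (assuming the 2.8+ API; the object name "Profile" is a placeholder, and the exact names of the Screw modifier's merge properties are an assumption on my part):

```python
import bpy

# Sketch: spin a profile edge into a cylinder with the Screw modifier,
# then merge the duplicated seam vertices ("weld the loose parts").
obj = bpy.data.objects["Profile"]  # placeholder object name

screw = obj.modifiers.new(name="Screw", type='SCREW')
screw.angle = 6.283185  # full revolution, in radians
screw.steps = 32

# The Screw modifier's own merge option stands in here for the missing
# standalone weld modifier (property names assumed from the RNA docs).
screw.use_merge_vertices = True
screw.merge_threshold = 0.0001
```

A standalone weld modifier would make the same cleanup available after any modifier, not just Screw.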

I mentioned this plugin recently in the thread below, where I posted a sample showing the results such tools produce (the sample, a sci-fi spaceship, was made in Groboto, a previous standalone app by the authors of MeshFusion). As I also said in that thread, such tools will very soon become necessary for any serious 3D modelling app, and the Blender developers need to focus on the matter. The great obstacle to achieving satisfying results with booleans is the problem of their seams (they cannot be bevelled or chamfered easily). If that problem is solved, booleans will without doubt become a tool of central significance for hard surface modelling.

Very nice example!

I agree that tools like these should be a primary focus for modeling tools in the future. Not only do they speed things up, but I’d argue that they open up a whole new shape language in 3DCG-work (as your own model demonstrates).

I would really like to see this type of tool before some of the other things the devs consider important for GSoC.

The MeshFusion solution is really simple in concept, though not in the implementation: they remesh all the objects so that every mesh has the same density, then unify everything with a “bevel mesh” and do a rough merge between them. It would be great to see some kind of solution for this in Blender.
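Purely to illustrate the concept with stock Blender modifiers (this is not how MeshFusion actually does it), a rough sketch; the object names "Base" and "Cutter" and all values are placeholders:

```python
import bpy

base = bpy.data.objects["Base"]      # placeholder names
cutter = bpy.data.objects["Cutter"]

# 1. Remesh both objects toward a similar surface density.
for obj in (base, cutter):
    remesh = obj.modifiers.new(name="Remesh", type='REMESH')
    remesh.mode = 'SMOOTH'
    remesh.octree_depth = 6

# 2. Unify them with a boolean.
boolean = base.modifiers.new(name="Boolean", type='BOOLEAN')
boolean.object = cutter
boolean.operation = 'UNION'

# 3. Round off the seam with an angle-limited bevel, standing in for
#    MeshFusion's dedicated bevel-mesh strips.
bevel = base.modifiers.new(name="Bevel", type='BEVEL')
bevel.limit_method = 'ANGLE'
bevel.angle_limit = 0.785  # ~45 degrees, in radians
bevel.segments = 4
```

The hard part that this sketch glosses over is exactly the one discussed in this thread: making the bevel behave well along the boolean seam.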

Everyone I talk to wants to learn Modo just for this feature.

Let me mention here that MeshFusion has some very important new features compared to Groboto. In Groboto you could not assign different thicknesses to the various boolean bevels/chamfers of the same object, while in MeshFusion you can, with impressively precise results. That constitutes essential progress in the modelling abilities of the tool.

Btw, Yamanote… thanks for the good comments regarding the model… : - )

DcVertice… the thoughts of the people you talk to are very understandable: such a tool opens a whole new dimension in modelling. I too have thought about learning Modo (solely for the same reason). But it is a very expensive app (for my budget, at least), so for now I try to do the work with Groboto. : - )

The main problem with Blender’s modelling tools is the fact that some of them are highly destructive. I hope this gets sorted out once they work on making “everything nodal” as planned. In my opinion, every modelling operation should be able to become a node that you can add or remove at any given time, always outputting the final model in real time in a non-destructive workflow.

I’m sorry, but I cannot agree with that.

“Everything nodal” is a good idea for some kinds of work: procedural stuff, some mechanical pieces… But as we can see in Houdini, it is a really hard way to create assets, and that is the common work of Blender users. In Houdini you don’t make art in a “natural” way like in Blender or ZBrush; you work as if in a bureaucracy. Destructive workflows are a small problem compared with working in a program like Houdini, which few people use to make assets and which has a very clear target audience: technical artists. And that happens for a reason: it is hard to work that way, and you slow down all of your production just to improve the workflow in 1% of the situations.

I don’t believe that the Blender Foundation, or any company, could mix a comprehensible, natural asset-modelling pipeline with an all-node pipeline, because they are really different. If they could mix these two worlds it would be perfect, but I don’t think so, because it would be a real change in the paradigm of the program, and it would betray all the users they have now.

An example of that is PixaFlux: it is a good idea for making textures, but it cannot be Krita or Photoshop. You cannot mix both worlds in one. For this reason, few people use Houdini.

The idea of nodes can work alongside traditional tools if it is done in a way that makes their usage optional (and only for certain systems), rather than having your whole scene be based on them.

So you have nodes for:

  • materials
  • particles/hair/instancing
  • modifiers
  • animation constraints

But then have the traditional tools still in place for almost everything else (and even for the four areas mentioned, it wouldn’t be impossible to have a legacy interface option that people can use instead).

So, I’m actually a Modo user (MeshFusion is what this plugin is emulating), and I have to say, what you guys seem to be missing is the complex fillet edges and auto-magic blending that gets generated. They move with the Boolean shapes across the surface. MeshFusion, which was the first to do this trick, is unfortunately much more than just live Booleans. It’s not at all impossible, though, it’s just more complex than merely merging verts.

Yes, this is very true. And again, like Modo, the nodes are optional. However, Modo’s node graph is pure hell.

The main issue (if we are having a discussion about nodes) is that there is no attention to the flow of data. This is very similar to the node editor in Maya, where you have wires going to and from other nodes, and sometimes to and from the node itself!

In a good node-based workflow, you need a clear path of modification: “this node makes the geometry, then this node modifies it, and then this node modifies it more”, and so on. The noodles/wires go from one node to the next, showing a clear path of data modification, and branching where there is any.

This is not at all the case with Modo and Maya. They are all over the place. And a lot of it is because they are showing a visual representation of the code, which doesn’t always make sense visually. For instance, with Modo and Maya, deformers are stored in a list in the main geometry node. The list can be reordered, but it’s just a list. The nodes that modify the geometry just have wires going in and out of the entries of that list. So, looking at the deformers, there’s no visual representation of what order they are in. They just float out in space, isolated, and all connect to the main geometry node without any indication of their relation to each other.
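To make the contrast concrete (this is illustrative Python only, not Maya’s or Modo’s actual API), compare a deformer list hanging off a geometry node with an explicit chain:

```python
# Toy "deformers" that just record their own name.
def bend(mesh):    return mesh + ["bend"]
def twist(mesh):   return mesh + ["twist"]
def lattice(mesh): return mesh + ["lattice"]

# Maya/Modo-style: the deformers sit in an ordered list on the geometry
# node, so the node graph shows connections but not their order.
geometry_node = {"mesh": [], "deformers": [bend, twist, lattice]}
result_list_style = geometry_node["mesh"]
for deformer in geometry_node["deformers"]:
    result_list_style = deformer(result_list_style)

# Chain-style: each node feeds the next, so the wiring itself spells
# out the order of modification.
result_chain_style = lattice(twist(bend([])))

assert result_list_style == result_chain_style  # same data, far clearer graph
```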

It’s even worse for particle nodes in Modo, where there is absolutely no thought behind how to represent things visually so that they reflect the flow of data. It’s all willy-nilly.

A good way to nodify other areas of Blender would be to model it after the way Cycles does things.

In a sense, you have a core set of components with its own special input/output socket type (like shader nodes in Cycles). For modifiers, the core of any node tree would be a string of modifier nodes with their own socket type (alongside a new node that would allow you to combine the results and an input node type that represents the base mesh). Surrounding that would be your texture nodes, math nodes, and vector nodes that you use to manipulate the values in the modifier nodes.

The same goes for particles: you have a core that contains a ‘particle’ input/output socket, and likewise for constraints. You have very clear rules on how to string the sockets together and where the connections ultimately need to go.
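A hypothetical sketch of that socket-typing idea in plain Python (none of these classes exist in Blender; they only mirror the rules described above: a base-mesh input, a chain of modifier nodes sharing one socket type, and a combine node):

```python
from dataclasses import dataclass, field

@dataclass
class Mesh:
    verts: list = field(default_factory=list)
    faces: list = field(default_factory=list)

class ModifierNode:
    """One 'modifier' input socket, one 'modifier' output socket."""
    def __init__(self, upstream=None):
        self.upstream = upstream          # node plugged into our input socket

    def evaluate(self, mesh: Mesh) -> Mesh:
        if self.upstream is not None:     # pull data through the chain
            mesh = self.upstream.evaluate(mesh)
        return self.modify(mesh)

    def modify(self, mesh: Mesh) -> Mesh:
        return mesh                       # base-mesh input node: pass-through

class BevelNode(ModifierNode):
    def __init__(self, upstream, width=0.02):
        super().__init__(upstream)
        self.width = width

    def modify(self, mesh: Mesh) -> Mesh:
        return mesh                       # placeholder for a real bevel

class CombineNode:
    """Merges the results of several modifier branches into one mesh."""
    def __init__(self, *branches):
        self.branches = branches

    def evaluate(self, mesh: Mesh) -> Mesh:
        out = Mesh()
        for branch in self.branches:
            result = branch.evaluate(mesh)
            out.verts += result.verts
            out.faces += result.faces
        return out

# Base mesh -> bevel -> combine: the connections form one ordered path,
# and only 'modifier' sockets can be wired together.
base = ModifierNode()
tree = CombineNode(BevelNode(base, width=0.05))
final = tree.evaluate(Mesh())
```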

If the BF has trouble implementing the new layers system, I don’t want to know what hell it must be to implement the nodes you are talking about.

Anyway, this thread is not about “everything nodes”, it is about hard-surface tools like MeshFusion. And I want to ask a question: will we see a complete implementation of the OpenSubdiv code? Right now it is only something shown in renders; you cannot collapse or export a mesh with the result. I don’t see the reason for that.

As far as I know, MeshFusion works with nodes too. Imagine booleans that work with real-time feedback and auto-filleting, where you can switch every object participating in the process to whatever boolean condition you want (just by changing the boolean operation type on the node map)!

Such a thing would be a real modeling revolution in Blender, without the slightest doubt! : - )

But that is different from a node-based program. And the critical part of the tool, of this technology, is the boolean/fillet/subd/remesh system, not the way the meshes are organized.