Adobe acquires Allegorithmic

You’re talking about matching Zbrush in terms of performance; I’m talking about matching it in terms of output. It doesn’t matter that Zbrush can handle a face sculpt down to every lovingly crafted skin pore when you can just paint those skin pores on with a texture just as well. It doesn’t matter that Zbrush has advanced tools to sculpt mechanical-looking objects, when you can just model those same objects to look just as good at vastly lower polycounts.

You’re talking about future upgrade potential; I’m talking about the current state. Even here, I think you’re vastly underestimating the complexity of the task at hand. One of Substance’s most important selling points is its ability to generate textures on the fly from a tiny definition file. Just adding some nodes would indeed give Blender users something to work with, but to truly match Substance, a full clean-room compositor rewrite is necessary so that it can be spun off into a high-performance Apache-style-licensed library (like Cycles) that could subsequently be integrated into proprietary software. For this purpose, the current compositor is useless, as it has too many contributors to be relicensed.

Actually I’m not talking about performance only; rather, the performance aspect was just one of the components that sets ZBrush apart. Due to the pixols, the displacement of a surface results in far finer detail and form manipulation. To say “we can just paint those details on” means it’s not really about the actual mesh itself, but the texturing. If it’s just about output by textures, even Blender can do what Substance is doing, though perhaps not with ease or procedural generation.

For sculpting alone, not texturing, Blender will find it harder to match ZBrush than it will Substance in that regard. Matching 3D-Coat and Mudbox would not be as difficult.

If we are going by output, which you alluded to with ZBrush, then to fully match Substance output, users simply need the procedurals and the nodes to mix and manipulate them. I believe nodes can be created to match the most-used ones found in Designer. The greater challenge, imo, is the workflow as it relates to substances. For painting, all you need to do is be able to paint the output of those materials onto a surface. The most clever use of substances I have seen so far is how they are set up in game engines, with sliders for various effects such as snow, dirt, etc., and these are generally linked to the range of a noise pattern or the intensity of a texture.

Of course I am focusing only on the nodes as it relates to materials themselves. A smart material just needs to be the final output of a material node.
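The slider setup described above (a “snow” slider remapping the range of a noise pattern) can be sketched in plain Python. This is purely illustrative pseudologic, not any real Substance or Blender API; all names here are made up for the sketch:

```python
# Hypothetical sketch: how a "snow" slider might drive a procedural mix,
# the way Substance-style smart materials expose parameters in game engines.

def remap(value, lo, hi):
    """Clamp value into [lo, hi] and normalize to the 0..1 range."""
    return min(max((value - lo) / (hi - lo), 0.0), 1.0)

def snow_mask(noise_value, snow_amount):
    """A slider (0..1) widens the band of the noise that reads as snow."""
    threshold = 1.0 - snow_amount          # more snow -> lower threshold
    return remap(noise_value, threshold, threshold + 0.2)

def shade(base_color, snow_color, noise_value, snow_amount):
    """Lerp the base material toward snow by the mask."""
    m = snow_mask(noise_value, snow_amount)
    return tuple(b * (1 - m) + s * m for b, s in zip(base_color, snow_color))

rock = (0.35, 0.3, 0.28)
snow = (0.95, 0.95, 0.97)
print(shade(rock, snow, noise_value=0.9, snow_amount=0.5))
```

The point is that the “smart” part is just a parameter remapped into a mask; in Blender terms that would be a Map Range node feeding a Mix node, with the slider exposed on a group input.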

At the end of the day, the means of achieving both are not worth arguing about, since it’s just a difference of perception and opinion. Both areas need improvements regardless.

I mean, yes, you’re not wrong, Zbrush can handle more and finer geometry, but what’s the point if you can’t ship it in a product, or directly rig it for animation? All that detail Zbrush handles always needs to get baked down to various types of textures before you ship it or integrate it into a shot.

The video you linked shows off an impressive shader to be sure, but I’d never use that as it performs like molasses at render time. You can’t really beat a simple texture lookup. More importantly, while the Cycles shading system is very cool, gather and scatter ops like blurring are just not viable with it for performance reasons. A finite resolution texture compositor is necessary.
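To make the gather-op point concrete: a blur has to read many neighbouring texels per output texel, which is cheap over a finite-resolution image but ruinous in a shading system where every sample re-evaluates the upstream node graph. A minimal sketch of such a gather op, in plain Python for illustration (a real compositor would work on tiles or the GPU):

```python
# A horizontal box blur over a row-major grayscale image: each output texel
# averages its horizontal neighbours, with clamped edge addressing.

def box_blur_h(pixels, width, height, radius):
    out = [0.0] * (width * height)
    for y in range(height):
        for x in range(width):
            total, count = 0.0, 0
            for dx in range(-radius, radius + 1):
                sx = min(max(x + dx, 0), width - 1)   # clamp at the edges
                total += pixels[y * width + sx]
                count += 1
            out[y * width + x] = total / count
    return out

# A 4x1 "texture" with a single bright texel spreads into its neighbours:
img = [0.0, 1.0, 0.0, 0.0]
print(box_blur_h(img, 4, 1, 1))
```

In a finite-resolution compositor each of those neighbour reads is one array lookup; in a procedural shading graph each one would mean re-running every node upstream of the blur.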


Depends on what it is you are doing. Sometimes clients even like to have the high-res models on hand, which has later resulted in 3D models getting printed out. The sculpt detail matters just as much as being able to pump out detailed textures.

One of the nice things about ZBrush is that you can take the high detail (which, being able to achieve it, makes sculpting feel a lot more natural) and remesh it with very little loss of that detail while also achieving a much lower polycount in the process. We simply cannot give up on achieving that kind of detail simply because it might later get baked down into an image map.
=)

Blender models are just fine for printing, and will be for a very long time. Multires (which is finally fixed now, yay!) will happily handle a mesh with multiple tens of millions of polys, and with a couple addons you can auto-generate retopo meshes just as fast, that are just fine for sculpting (though unfortunately not rigging).

There’s probably no printer on the planet that can preserve the kind of detail zbrush handles :stuck_out_tongue:

Well I better tell all the people I know who work in the film industry to give up on using zbrush, to tell their clients the high detail sculpts don’t matter. I’m sure they will listen. They can drop everything and just go with textures! :wink:

Of course I am being sarcastic. Once you work with ZBrush, and do the work that requires that attention to detail and the ease with which it can be achieved, then you will know how important it is.

Anyways, I don’t want to turn this into a zbrush thread rather than a substance/adobe/blender thread. We can agree to disagree as to what is more achievable with Blender.

You’re not listening. I’m not saying high poly sculpts don’t matter for workflow. They are most certainly great for artists. I’m just saying they don’t matter much for final output - i.e., the final pixels on the consumer’s screen.

Having a way to generate textures on the fly with an integrated compositor on the other hand does matter, because you can’t very well ship hundreds of gigabytes of textures with your game (and even if you could, you shouldn’t). It doesn’t matter for animation/movies, but you’d still want it there for interoperability.

No we haven’t. Unity just runs a headless blender and exports the contents of the blend using fbx.


I was referring to what they say in their own manual: “native imports” of .blend files. Not saying you are wrong, you aren’t, but that’s how they have worded it. I’d be curious to find out if the GPL licensing would be a roadblock as it relates to actual native support.

Importing objects from Blender
Unity natively imports Blender files, supporting the following:
All nodes with position, rotation and scale; pivot points and names are also imported
Meshes with vertices, polygons, triangles, UVs, and normals
Bones
Skinned Meshes
Animations

Behind the import process

When Unity imports a Blender file, it launches Blender in the background. Unity then communicates with Blender to convert the .blend file into a format Unity can read.
The first time you import a .blend file into Unity, Blender has to launch in a command line process. This can take a while, but subsequent imports are very quick.
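The conversion step the manual describes boils down to invoking Blender headless and running its FBX exporter. A rough sketch of the kind of command involved; the paths are illustrative, and Unity’s actual conversion script ships with the editor, so this is an assumption about the shape of the call, not its exact contents:

```python
# Build a headless-Blender FBX export command.
# -b / --background: run without the UI; --python-expr runs the export
# expression after the .blend file is loaded.
import subprocess

def build_export_cmd(blend_path, fbx_path, blender_exe="blender"):
    expr = f"import bpy; bpy.ops.export_scene.fbx(filepath={fbx_path!r})"
    return [blender_exe, "-b", blend_path, "--python-expr", expr]

cmd = build_export_cmd("scene.blend", "scene.fbx")
# subprocess.run(cmd, check=True)  # would require Blender on the PATH
print(cmd)
```

`bpy.ops.export_scene.fbx` is Blender’s own FBX export operator, which is why this route sidesteps the .blend format entirely: Blender itself does the reading.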

Because .blend files are written as Blender’s in-memory data structures dumped to disk with metadata, the source code is practically the only documentation of the .blend file format. They’d have to clean-room, black-box reverse engineer the format, possibly for every Blender release, to avoid running afoul of the GPL.

the gpl isn’t the problem. there wouldn’t be the need for reverse engineering. just looking at the code is allowed with the gpl. :slight_smile:

the problem is that more complex .blend files depend on a lot of blender functionality (like modifier evaluation,…). so it would be a huge amount of work to correctly interpret blender files. it also is a moving target.

it’s the same with other programs like maya. there is no way some importer could correctly handle all the stuff in it.

you’re overthinking it. I can look at the Blender source to see how it works with the blend file and write my own source that looks nothing like the original code, and resemblance to the original is a prerequisite for proving that the GPL was breached. The GPL also covers only code, not data.

Getting around the GPL, especially in this case, is a walk in the park if one wants to. At one point, for example, code embedded in the blend file (Python scripts, that is) was considered data and thus exempt from the GPL; that was later changed to being covered by the GPL, because even embedded, it was still code. Nonetheless, the data itself is not covered by the GPL, or else no one would be able to use Blender professionally without releasing all their assets as well.

Codewise, I think the GPL covers reverse engineering too, but I am not sure which version. Or maybe I am wrong on that, I am not sure. It’s kinda meaningless anyway when one has access to the source. Reverse engineering is a huge pain.

Plus, why would the BF stop anyone from importing blend files? That makes no sense at all.

See the current Oracle vs. Google Java case for why it’s such a slippery slope. Unity just wouldn’t be willing to risk it.

Also, if you’re a programmer studying GPL code in order to replicate some functionality that’s risky behavior because you may copy code and not even realize it. You’d still be potentially “infecting” your private, closed code with GPL and put your company at risk of GPL non-compliance.

See https://docs.unity3d.com/560/Documentation/Manual/HOWTO-ImportObjectBlender.html

Unity natively imports Blender files. This works under the hood by using the Blender FBX exporter.

I agree that “native imports” is indeed an unfortunate choice of words.


Just use their wiki page on the .blend file format; no source code involved at all. Beyond the basic header and file blocks, the data contained in the blend file is self-defining, handing the blend format over on a silver platter.
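Indeed, even the fixed part of the format is tiny. Per the published .blend format description, the file opens with the magic `BLENDER`, a pointer-size character (`_` = 4-byte, `-` = 8-byte), an endianness character (`v` = little, `V` = big), and a 3-digit version like `280`; everything after that is the self-defining block data. A minimal header parse, assuming that documented layout:

```python
# Parse the 12-byte fixed header of a .blend file.
def parse_blend_header(data: bytes):
    if data[:7] != b"BLENDER":
        raise ValueError("not a .blend file")
    pointer_size = 4 if data[7:8] == b"_" else 8
    endianness = "little" if data[8:9] == b"v" else "big"
    version = data[9:12].decode("ascii")
    return pointer_size, endianness, version

print(parse_blend_header(b"BLENDER-v280"))
```

The pointer size and endianness recovered here are exactly what a reader needs to walk the file blocks and the DNA structure definitions that follow.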

As I understand it (IANAL and all that), just looking at code whose function you want to replicate makes it a breach of IP. This was the case at least with duplicating the IBM BIOS back in the day. You had to hire one group of people to write a spec of what the code does, then a completely separate team who wrote code to that spec. If the programmers see the code at any point, it’s legally Game Over.

Foremost, it’s dishonest to take GPL open source and use it as knowledge to create closed source. The problem is that not only can the license not stop you, even copyright can’t, because ideas cannot be copyrighted. Obviously we are talking here about code that looks very different from the original; if it does not, it’s a moot point.

I do not think the GPL is practically against this, even on ideological grounds. The idea here is to keep the source code open, not its ideas. Source is something specific and directly usable in some way. An idea needs a lot of work to manifest, and we do not copyright ideas because differentiating between ideas is a nightmare and the free movement of ideas is essential human activity.

Now, sorry, I am not buying the “copying without realizing it” scenario. Your code may look similar, true, but that is not enough. If you actually copy code, then you are being careless; not only do you violate the license, you also violate copyright, and you are in double deep trouble.

I’m actually getting a fair bit of traction using cube and sphere projection to apply a texture to the object and then bake it to the UV map, plus some artful use of control maps, as well as using empties on the Texture Coordinate node to speed up how the textures are applied.
After that it is just faking Photoshop layers with some Python scripting. Granted, I know it will never replace Substance, but Blender does have almost all of the core tools we need to do this built in. Right now the most stable flavor I’m using for texturing is an early Eevee alpha that still mostly feels like 2.7, but that is due mostly to crash avoidance. 2.8 gets REAL PISSY about texture projection for some reason.
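For what it’s worth, the “faking Photoshop layers” part reduces to straight-alpha “over” compositing per channel. A minimal sketch of a layer stack flattening, with made-up layer names for illustration:

```python
# Straight-alpha 'over' compositing: foreground over background, per channel.
def over(fg_rgb, fg_alpha, bg_rgb):
    return tuple(f * fg_alpha + b * (1.0 - fg_alpha)
                 for f, b in zip(fg_rgb, bg_rgb))

def flatten(layers, base):
    """Apply layers bottom-to-top onto a base colour."""
    result = base
    for rgb, alpha in layers:
        result = over(rgb, alpha, result)
    return result

base = (0.2, 0.2, 0.2)
layers = [((1.0, 0.0, 0.0), 0.5),   # half-opaque red "dirt" layer
          ((1.0, 1.0, 1.0), 0.25)]  # faint white "dust" layer
print(flatten(layers, base))
```

Blend modes beyond “normal” (multiply, screen, overlay, etc.) only change the per-channel formula; the stacking logic stays the same, which is why a Python script over baked textures can mimic a layer workflow at all.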

People forget that this is an extremely niche industry. Just because it’s in our circle of interest doesn’t mean that it generates tons of cash. Autodesk makes most of its money on AutoCAD, Flame, etc. As much as I hate them for killing XSI, I knew it was unsustainable. Autodesk bought Alias for what amounts to pocket change compared to what Facebook paid for Instagram or WhatsApp; the same goes for Softimage. In general the industry runs on tight margins.


XSI was sustainable enough in my opinion and there was a lot of growth potential for the software as well. Autodesk just didn’t like having too many development teams, their ultimate goal was to start moving everything towards Maya, which means they can get away with only paying for one barebones development team. The attraction of XSI was to get a few developers out of it, while also killing off the competition.
Ton pointed out long ago that only a small margin of revenue goes back into R&D at Autodesk. The less competition there is on the market, the more secure they feel in keeping that revenue chain, however minimal it may seem, going.

That said, you highlight the reason why Adobe, Autodesk, Foundry, etc. are so desperate to turn a product into a service. As a service, they don’t even need to improve the software or invest time and money into it; the user has to keep paying monthly if they wish to continue using it.

This is where Blender really shines.
