Why Blender Isn't Industry Standard

Yeah, for me that's really the crucial part of the puzzle I'm looking forward to seeing Blender solve.
As of now it still isn't pipeline viable. Every company that adopts Blender will have to develop in-house tools, as they have with other software they've adopted.
At present I just feel like there are a few basic features still missing from Blender.

With the increased funding however, I see this being remedied in the near future!


Ton has said many things over the years that should not be taken at face value. For example, at one Blender Conference he told the audience that he has no say in what direction Blender takes, that he doesn't have any sway. That was clearly untrue, because he has on many occasions continued to dictate from above. Case in point: the whole ordeal with colored wireframes some time back. He effectively shut down that patch because he did not like it. Just write off what Ton says as Ton being Ton. Don't always take it at face value; sometimes it's just for the sake of image.

His "Blender is for Blender users" line was also a response used to shut down those wanting left-click select as the default. Look at where we are now: left-click select, an industry-standard keymap option, and an updated GUI, followed by corporate sponsors such as Nvidia, AMD, and Epic, to name a few.

Without looking at what non-Blender users would be interested in, we would not have a lot of the changes that have made it popular. The left-click select and industry-standard keymap options alone have done wonders for Blender being adopted on a much larger scale.

In the games sector, people have already begun to use Blender as their secondary app at work. That's partly because of Autodesk's occasionally ambiguous relationship with the CG-entertainment slice of its market, and partly because of the subscription 'business' practices Adobe until quite recently clamped onto Allegorithmic's Substance suite. Mainly, though, it is due to feature-rich development, increasingly intuitive functionality, and, importantly, adherence to standards, which together have driven a steady rate of adoption. One only has to hop over to Polycount and trawl the boards there to gauge how much traction Blender has gained among industry pros, whether at AAA or mid-tier shops.

This ^… My guess is that Blender is used in a lot of studios, big and small, not as a replacement for other software but as another tool in the toolbox.

</Topic Off/> When you go into an interview with a stellar portfolio, I doubt they will disqualify you when they find out you did everything in Blender. Talent is talent and that’s what will get you hired, not the brand of paintbrush you use. </Topic On/>


It would be interesting to know how many studios are currently exploring/using it. It seems like every week lately I hear about another one. Since most don’t really need to announce that they’re using it, I assume there must be more. Then again, Blender users are known for letting everyone know that they use Blender. :grinning:

I saw this one the other day. Edit: Oops, I just noticed this was linked earlier in the thread and I missed it. Oh well.

This is quite tricky; people have different needs, different levels of experience, and work in different fields…
I think the way it works now is more coherent: the BF gets feedback from a few experienced, long-time users. It's not perfect either, but at least it's easier to reach goals that are OK for artists and doable for developers.

What you're witnessing now is just, as Zeauro said, the 2.8 project needing some polish; all that stuff moving around will settle after a few releases.
Because development is fast and we see all the new stuff coming, we get a bit overwhelmed by it. If development were hidden and we only saw the releases, all this would seem much quieter.

It's also debatable whether 2.8 should have been postponed until everything was ironed out according to the initial specs (which would have meant waiting an extra year), or whether they made the right choice to release something usable but not yet polished…


Can anyone give an example of a pipeline? Is it a fairly standard thing or can it be so widely varied you pretty much have to roll your own?

I have some idea of what it could be but would like to hear some real world examples.

At my studio we may deal with a few different outputs, but the basic pipeline is the same. It evolved over time, though, and is something I still tweak constantly. Clients' needs drive a lot of it, too. And if I am looking to perfect things or improve with new software, that can require refactoring many things.

Rather than look at software, it is probably best to think about what any pipeline needs.

Some way to concept based on the client's needs, usually 2D.

Some way to realize the design with no constraints from technology. This can be sculpting or high-definition polygon modeling, or a way to read and import CAD models.

Some way to optimize the higher-definition model: retopology.

Some way to prep it for transferring details from the high-definition model, usually UV unwrapping.

Some way to bake those details into image maps.

Painting on the model to add more detail.

For characters or other animation assets, a rigging system.

An animation system.

A Dynamics system.

A rendering solution. VFX, Film or Game.

For film and VFX some kind of post processing.

It is quite common that many software solutions are used to get this done, which is very much true at my studio. That also means dealing with all of the interchange issues between apps. For that, we stick with FBX, OBJ, or Alembic mostly.
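To make the idea of "a pipeline" concrete, here's a minimal, hypothetical sketch in Python of the hand-off checking an in-house tool might do. The stage names and interchange formats are illustrative (loosely following the list above); nothing here is tied to any particular application:

```python
from dataclasses import dataclass

# Hypothetical sketch: each pipeline stage declares which interchange
# formats it consumes and which one it produces, so hand-offs between
# departments can be checked before anyone exports anything.
@dataclass
class Stage:
    name: str
    accepts: set    # formats this stage can read
    produces: str   # format this stage writes out

def validate_pipeline(stages):
    """Return a list of hand-off problems between consecutive stages."""
    problems = []
    for upstream, downstream in zip(stages, stages[1:]):
        if upstream.produces not in downstream.accepts:
            problems.append(
                f"{upstream.name} writes {upstream.produces}, "
                f"but {downstream.name} only reads {sorted(downstream.accepts)}"
            )
    return problems

# Stages roughly mirroring the list above; formats are made up for the example.
pipeline = [
    Stage("sculpt",   {"obj"},        "obj"),
    Stage("retopo",   {"obj"},        "fbx"),
    Stage("uv/bake",  {"fbx"},        "fbx"),
    Stage("rig/anim", {"fbx"},        "abc"),   # Alembic cache out
    Stage("lighting", {"abc", "fbx"}, "exr"),
]

print(validate_pipeline(pipeline))  # prints [] when every hand-off lines up
```

Real pipeline tools do far more (versioning, naming conventions, publishing), but even this tiny check is the kind of glue studios end up writing in-house.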

I can give you an example from a few TV shows I worked on.
The character department would make a CG character in ZBrush. This would get reviewed until it was approved. Texturing was usually done in ZBrush or Substance Painter, or in 3ds Max / V-Ray, which was the rendering package we used.
The character was then rigged and animated in Maya.
Why Maya if we were using 3ds Max? The entire animation department threatened to quit en masse if they were forced to animate multiple characters in 3ds Max. Guess it was that bad on performance.
The character animation was saved out as an MDD (Motion Designer Document), basically a point cache file, and the camera was exported to 3ds Max. Except for all-CG shots, the background plates were shot with a Red camera.
The shots were edited and then sent to us. Sometimes they were late, which added a bit of stress. The comp department would track the shots and export a 3D camera that animation could use for plate shots. They would also do roto if needed.
VFX (fire, smoke, explosions, etc.) were done in 3ds Max / FumeFX / Krakatoa. The VFX department would usually render out their own elements for the shot.
I was in lighting and rendering, so my job was to load the background plates or CG shots from animation, then import the animated / tracked 3D camera to make sure everything was lined up and tracking, and for lighting reference. Then import any HDRIs that were shot on set, load up the character model, import the MDD file, and apply it to the character. Also run a script that created any VFX trails, special character effects, etc. Set up the lights and render passes for all the elements (diffuse, reflection, spec, element passes, and so on), and then check all the renders once done.
Then comp would add everything together and add noise and filters so you couldn't see the shot anymore. The producers would review the shot and make changes, and everything would start all over.
Keep in mind all of this was usually going on at the same time. The character modelers would be finishing up the model while the animators were working with a proxy file, which was then switched out for the final model. While the comp guys were tracking the shot, I would be setting up the shot with temp proxy files.
I forgot to mention we had custom scripts and tools written in-house for doing a lot of this, especially the import/export parts and repetitive tasks like making trails and certain VFX elements.
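The MDD hand-off above works because a point cache is conceptually very simple: one position per vertex per frame, no rig required on the receiving end. Here's a toy sketch of that idea in Python. To be clear, this is not a real MDD parser (the binary format details are omitted); the function and data are hypothetical:

```python
# Toy sketch of the point-cache idea behind MDD: the cache stores one
# (x, y, z) position per vertex per frame, and "applying" it just swaps
# the mesh's vertex positions for the cached ones. NOT a real MDD reader.

def apply_point_cache(mesh_verts, cache, frame):
    """Replace every vertex position with the cached positions for `frame`."""
    frame_positions = cache[frame]
    if len(frame_positions) != len(mesh_verts):
        raise ValueError("cache vertex count does not match the mesh")
    return list(frame_positions)

# A two-frame cache for a three-vertex mesh (illustrative numbers).
cache = {
    0: [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)],
    1: [(0.0, 0.5, 0.0), (1.0, 0.5, 0.0), (0.0, 1.5, 0.0)],
}
rest_verts = cache[0]
print(apply_point_cache(rest_verts, cache, 1))
# prints [(0.0, 0.5, 0.0), (1.0, 0.5, 0.0), (0.0, 1.5, 0.0)]
```

This is also why a point cache can be rendered in a completely different app than the one it was animated in, as long as the vertex count and order match, which is why the proxy-to-final model swap has to preserve topology.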


Nice outline.

I definitely think that if you are doing fx for live action it enters an entirely different level of complexity to the entire process.

One thing I did not mention at all was the concept of proxy stand-in models.

And to expand on that, another aspect of our pipeline is working on assets in parallel. Just in general, not necessarily only with stand-ins, but any asset types that can be worked on in parallel for efficiency.

So I guess I was focusing on the straight pipeline and not its organization and management, which I see as more or less separate. But of course all of these things are taken into consideration when designing a pipeline, so it's great that you mentioned it.

Epic Games supports the Blender Foundation with 1.2 million bucks. Ubisoft joins the Blender Development Fund. Studio Khara switches to Blender as its primary 3D CG tool. I guess there are probably more examples.

Personally, I really don't care how many big studios use Blender in their pipeline. What I do care about is a readily accessible, dynamically developing package that works, with a good-sized community online. I think those are the reasons for the above headlines. I first opened up Blender 10 years ago to take a peek and was pleasantly surprised. A long-time Max user, I now use Blender as my 3D app and have five students coming up for Blender instruction: four packaging artists and one engineer. In short, professional people from large companies. The software itself is open source, but I am not. I gotta eat.

An example of Blender's flexibility: Unreal developed the Datasmith plugin for most popular 3D applications, but not Blender. No problem: a quick web search and a download from someone who developed an addon that works.

Even if large studios are not using Blender in their pipe yet, there are a lot of professional areas taking notice. No small thanks to its active user base.

You guys rock.
