New from Tangent Animation: Next Gen. Is this Blender?

Exciting times! Can’t wait to watch Next Gen on Netflix!
By the way - what’s your next project?

When I made that comment I was unaware that Stefan Werner and Luca Rood had already added proper support for VDB voxel data and improved the Alembic support.

So the solution is already there and we just need to wait a little longer.

Guys … Stefan Werner already confirmed this at the last Blender Conference … see it at 1:44.

3 Likes

Indeed I implemented support for loading OpenVDB caches in Blender, and Stefan added support for a bunch of rendering features required for the production (e.g. volume motion blur). Stefan also implemented OpenVDB rendering directly from Cycles (bypassing the Blender OpenVDB importer).

I joined the team at Tangent at a late stage of production, and OpenVDB rendering was a requirement that had to be fulfilled in a reasonable time frame; therefore the implementation is relatively poor, as there was no design stage prior to implementation. Because of this time restriction, the importer was simply implemented as a wrapper around the smoke modifier, which reads from the OpenVDB cache. This is not without its quirks, and is highly inefficient.
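For anyone curious what such a cache looks like from the Python side, here is a minimal sketch of inspecting a .vdb file with the pyopenvdb bindings that ship with OpenVDB - illustration only, not our importer code, and the file path and grid name are made up:

```python
import pyopenvdb as vdb

cache_path = "/tmp/smoke_frame_0042.vdb"  # hypothetical cache file

# Read only the grid metadata first; this avoids pulling voxel data into memory.
for grid in vdb.readAllGridMetadata(cache_path):
    print(grid.name, grid.valueTypeName, grid.metadata)

# Load a single grid (e.g. a density field) only when it is actually needed.
density = vdb.read(cache_path, "density")
print("active voxels:", density.activeVoxelCount())
```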

Stefan’s direct implementation was then done to fulfil the high rendering demands, but he will have to fill in the details regarding the code’s production readiness.

So, unfortunately, I will not be submitting my OpenVDB code as a patch to Blender, as it is not fit for the real world and would be unsustainable. However, the code is of course available, and you can make use of it if you want (https://github.com/tangent-animation/blender278).

8 Likes

Well, again thanks for your hard work, making Blender better.
But the fact that it is an unsustainable, inefficient importer makes me a sad panda. :cry:
Based on your and Stefan’s work, how much is there still to do to make it a proper implementation, fit for the real world (patched into official Blender)?
And how big are the chances that Tangent Animation will push it further in the near future (maybe for the next big project)?
I am unfortunately unable to code and am dependent on people who can (I haven’t even built a Blender version myself).

Basic OpenVDB integration would be pretty straightforward. The fact that @skw already made an implementation of direct OpenVDB rendering in Cycles is definitely a big plus for this to be further implemented in Blender (though, again, I don’t know how production-ready that code is, as it was a last-minute thing. Stefan?). As far as I can see, the biggest thing to be done at this point would be to implement the viewport display of the cache without building a dense grid in memory.
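To make the dense-grid concern concrete, here is a rough pyopenvdb sketch (illustration only; the path and grid name are made up) of the difference between densifying a grid for display and walking only its active voxels:

```python
import numpy as np
import pyopenvdb as vdb

density = vdb.read("/tmp/smoke_frame_0042.vdb", "density")  # hypothetical cache

# Dense approach: allocate the whole active bounding box, even if mostly empty.
bbox_min, bbox_max = density.evalActiveVoxelBoundingBox()
shape = tuple(hi - lo + 1 for lo, hi in zip(bbox_min, bbox_max))
dense = np.zeros(shape, dtype=np.float32)
density.copyToArray(dense, ijk=bbox_min)  # memory grows with the box volume

# Sparse approach: visit only the voxels and tiles that actually hold data.
for item in density.citerOnValues():
    if item.count == 1:
        pass  # a single active voxel at item.min with value item.value
    else:
        pass  # a constant tile of item.count voxels spanning item.min..item.max
```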

I’m not aware of anyone currently invested in this OpenVDB integration though.

Btw, I just realised that all the latest posts here are about OpenVDB. Perhaps we are going a bit off topic…

3 Likes

It’s kinda funny to me to still hear people talking in terms of “when will the industry recognize Blender?” Most certainly, they already do. There is another professional tool being used today for professional work, and that tool is Blender. No, it’s never going to displace other tools in shops that have standardized on them, because that doesn’t make business sense. But a lot of shops out there have standardized their pipelines around Blender, and with very good reason. They’re getting results that are indistinguishable from those produced by other products, as in this case, and they’re reaping the benefits of “cooperative software development,” which is driving Blender forward at an incredible pace.

3 Likes

The Teaser trailer is on Netflix:
https://www.netflix.com/title/80988892

10 Likes

Damn cool… this looks a lot more promising now than ever! A perfect teaser.

Can’t wait!!!

Sorry if my statement was a bit confusing, but the look/feel of the DoF wasn’t improved - the performance boost from Stefan’s work integrating Embree as the ray-tracing core was what allowed us to use it in production. Render performance was very reasonable, and we achieved excellent results within the sample settings we were using for the overall image.

We’ve found that the transition to Blender typically takes 1-3 weeks, depending on the artist. Mastery of the software takes longer, but tasks are so specialized that transitioning really isn’t that bad. We have internal training material that we use when we bring new artists on board (which is on our YouTube channel), coupled with very specific YouTube instruction videos from various Blender aficionados.

As a side note: we tell people NOT to change the hotkeys, even though everyone is tempted to use the “Maya Hotkey” set - it creates too many issues with workflows. As someone who has used almost every 3D software package under the sun (all the way back to Turbo Silver, Caligari, TAV, etc.), I always find it best to use the stock workflows until you get familiar with the package - once you understand the workflows, you will have a better sense of what will speed up your specific workflow.

15 Likes

Oh Lord. You should tell this again and again and again in every medium available! Cheers

What part?

Last paragraph

@twitchmedia I remember from Stefan Werner’s talk at last year’s Blender Conference that you guys were using Amazon AWS as a rendering platform.

Are you still using that?

I am trying to look into it, and while I love the amount of power you get for the price, I am finding Brenda a bit clunky for day-to-day use. I imagine you probably have your own custom in-house render management solution to interface with AWS, but can you tell us something about it? I think it would be a very interesting topic for many people (but maybe I am wrong :smiley: )… maybe for another conference! :wink:

Wow! Amazing, real neat!

Thank you so much for your advice, but… what about keeping the Blender (default) keymap in the User Preferences and only changing Input > Select With to left mouse button (LMB)?
Thx in advance.
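(For reference, I believe that preference is exposed in the 2.7x Python API as well - a hedged one-liner, assuming Blender 2.79:)

```python
import bpy

# Switch selection to the left mouse button while keeping the default keymap.
# Blender 2.7x API; the preferences layout changed in later releases.
bpy.context.user_preferences.inputs.select_mouse = 'LEFT'
```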

Any word on a release date yet?

On the Netflix page it says “Sept. 7th”.

5 Likes

Trailer dropped:

19 Likes