Mitsuba 2.0

Well, I didn’t see this announced here yet, and since I was a pretty big fan of Mitsuba back when it was released, I thought I’d share a link to the publication announcing Mitsuba 2.0.

I don’t think it’s released yet, but hopefully there will be renewed interest in developing a Blender plugin for Mitsuba afterwards.


Thank you for the information! :slight_smile:


Well, it doesn’t seem like many people care, but maybe when a working demo is shown it’ll get some hype.


Mitsuba has always been more of an academic test platform than a renderer. It can do more than any other renderer, but the previous version was extremely slow and lacked almost everything needed for production use.

Which is why I’m hoping maybe 2 will get a bit more love this time around. Still, I have a soft spot for Mitsuba.

Mitsuba’s developers have no interest in it becoming fast or production-focused. It exists as a stable testbed for academia to examine new light transport methods.

I meant love from the Blender community; it was a fun rendering engine to play with.

Anyway, it being an “academic playground”, it can bring new developments in rendering technology that may land in Cycles one day. Who knows, let the professors play… :wink:


Thanks for sharing, @RealityFox. I also hope Mitsuba 2 will be accessible from Blender in the future. The first version was a great renderer with lots of methods to choose from.

I love Mitsuba, great to see more work on it!


And there it is. Unsurprisingly mind blowing.


Thank you Troy for sharing.

Intriguing, but I guess most of the mentioned techniques are mainly interesting for scientific research purposes, not for taking common renderings to a new level in terms of realism and/or speed. That said, I hope some of the algorithms could benefit Cycles as well.

Yes, very happy it’s out. Maybe we can get the old plugin to work with this.

Is a Windows build available?

Yeah, hopefully. Some of those next-gen research/academic techniques and algorithms may find their use in production pipelines. After all, even path tracing itself began as academic research. Or didn’t it?


At least for now, there was nothing in the video explaining how image reconstruction for projector setups could be usable for general CGI and VFX. Does it help with caustics? Does the user have to input the data manually, etc.?

The same is true for what appears to be a demonstration of custom kernel code. Can this code be generated automatically for any scene that might be built in a DCC app, or is it more specific? And what is the visible advantage over traditional engines?

Look closely at Enoki, which is arguably one of the most significant developments here.

If you trace some of the authors, you’ll get a much better idea of the stature of the project. In particular, Wenzel is worth tracing. :wink:

I agree… looks like a very convenient library for vectorization.
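For anyone wondering what “vectorization” means in this context, here is a rough conceptual sketch in plain Python. Enoki itself is a C++ template library and its real API differs, so this is only an analogy: the idea is to lay ray data out structure-of-arrays and process a whole packet of rays per operation instead of tracing one ray at a time.

```python
# Conceptual sketch of the "wide"/packet style that libraries like
# Enoki enable (pure-Python analogy, NOT the Enoki API). Each list
# holds one component for every ray in the packet (structure-of-arrays),
# so a single call processes all lanes at once.
import math

def intersect_plane_packet(origins_z, dirs_z, plane_z=0.0):
    """Intersect a packet of rays with the plane z = plane_z.
    Each 'lane' is one ray; returns a hit distance per lane, or
    math.inf where the ray is parallel or hits behind its origin."""
    t = []
    for oz, dz in zip(origins_z, dirs_z):
        if dz == 0.0:
            t.append(math.inf)          # parallel to the plane
        else:
            dist = (plane_z - oz) / dz
            t.append(dist if dist > 0.0 else math.inf)
    return t

# A packet of 4 rays, stored structure-of-arrays (one list per component)
origins_z = [1.0, 2.0, 0.5, 1.0]
dirs_z    = [-1.0, -1.0, 0.0, 1.0]

print(intersect_plane_packet(origins_z, dirs_z))
# → [1.0, 2.0, inf, inf]
```

In Enoki the lanes would map onto actual SIMD registers or GPU threads rather than a Python loop, but the data layout and "one call, many rays" structure is the same idea.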

@Ace_Dragon I think the image reconstruction is mainly an example of what differentiable rendering can be used for.