Hello there! I have put together a custom build of Blender 2.68a to add raytraced Indirect Lighting. I would really appreciate it if anyone who has been wanting this feature would test it out for me and post results here. That will help when trying to get this fix/addition accepted into future official Blender builds.
I put together a small, simple scene for basic testing. Here are some images:
Yes, this is for Blender Render; I assume that is what you mean by BI (Blender Internal?). It is now called Blender Render; perhaps it was called Blender Internal before? 2.68 is the first version I’ve actually used.
I didn’t realize so many people ran their own builds of Blender; that’s great! I just dropped the patch into the same SkyDrive folder in the first post.
I did try an old build of Blender that had raytraced Indirect Lighting, but the results were about as accurate as Approximate; the colors were maybe slightly better, but not what I expected. It seems the sampling was picking up reversed normals and such. I had to jump through hoops to get my normals facing the right way, and I baked out point clouds to visualize in Houdini what I was actually getting.

The code could be optimized by doing away with all the normal flipping that happens when you use some of the built-in functions that set up the ShadeInput from the intersection. I end up simply undoing the flipping after each flip, because those functions overwrite the flippednor variable and so destroy any record that a flip happened. I would imagine a lot of code could be removed by skipping the normal checking/flipping entirely and just using the geometry normal. But for now I am favoring maintainability over speed.
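Here is a minimal, self-contained sketch of that workaround; the struct and function names below are hypothetical stand-ins for the ShadeInput setup path described above, not Blender’s actual API:

```c
#include <stdio.h>

/* Hypothetical stand-in for the ShadeInput fields discussed above. */
typedef struct ShadePoint {
    float vn[3];      /* shading normal, possibly flipped to face the ray */
    int   flippednor; /* set when the setup code flipped vn */
} ShadePoint;

static void negate3(float v[3]) { v[0] = -v[0]; v[1] = -v[1]; v[2] = -v[2]; }

/* Stand-in for the built-in setup call: it may flip the normal, and it
 * overwrites flippednor rather than accumulating it, so any earlier flip
 * history is lost. */
static void setup_shade_point(ShadePoint *sp, int ray_hits_backface)
{
    sp->flippednor = ray_hits_backface;
    if (ray_hits_backface)
        negate3(sp->vn);
}

/* The workaround: undo the flip immediately after each setup call, before
 * the next call destroys the record of it, so sampling always sees the
 * true geometry normal. */
static void sample_with_geometry_normal(ShadePoint *sp, int ray_hits_backface)
{
    setup_shade_point(sp, ray_hits_backface);
    if (sp->flippednor) {
        negate3(sp->vn);
        sp->flippednor = 0;
    }
    /* ... take the indirect-lighting sample using sp->vn ... */
}

int main(void)
{
    ShadePoint sp = {{0.0f, 0.0f, 1.0f}, 0};
    sample_with_geometry_normal(&sp, 1);
    printf("normal: %.1f %.1f %.1f\n", sp.vn[0], sp.vn[1], sp.vn[2]);
    return 0;
}
```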
As for the speed of the images above: the single-bounce indirect render took about 2 min 30 sec at 32 samples, and the 2-bounce render at 8 samples took 3 minutes. I would like to add a feature where each additional bounce uses fewer samples than the one before, maybe something like a “bounce sample factor”, where 0.5 means each additional bounce gets half the samples of the previous one. Also note that the number of samples is misleading: under the hood it is really doing max_samples = samples*samples. I felt lied to when I saw this. :’(
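To make the idea concrete, here is a small sketch of what that falloff could look like. Only the samples*samples behavior is what BI actually does today; the function name and the exact falloff are hypothetical:

```c
#include <math.h>
#include <stdio.h>

/* Hypothetical sketch of a "bounce sample factor". The UI value is squared
 * internally (max_samples = samples * samples, as noted above); each extra
 * bounce is then scaled by factor^bounce. */
static int samples_for_bounce(int ui_samples, float factor, int bounce)
{
    int max_samples = ui_samples * ui_samples; /* what BI really shoots */
    int n = (int)(max_samples * powf(factor, (float)bounce));
    return n > 1 ? n : 1; /* always take at least one sample */
}

int main(void)
{
    int b;
    for (b = 0; b < 3; b++)
        printf("bounce %d: %d samples\n", b, samples_for_bounce(8, 0.5f, b));
    /* prints: bounce 0: 64, bounce 1: 32, bounce 2: 16 */
    return 0;
}
```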
As you will see in the patch, I went ahead and cleaned up the old raytracing code to be more maintainable. My build runs slightly slower for raytraced occlusion than the sanctioned version, which is probably the trade-off for that cleanup: easier to maintain, slightly slower. In a test with nothing but Ambient Occlusion at 128 samples, my times were ~2:30 for standard Blender and ~2:56 for mine. I am compiling with MSVC 2008 and not building/using OpenMP; I’m not sure whether either of those factors has anything to do with it. Please share your own time tests with only Ambient Occlusion checked. Thanks!
My main interest in Blender Render is the baking functionality. This feature is essential! Adding baking to Cycles, and/or keeping Blender Render maintained for the purpose of baking, is highly important. Basically, I chose Blender for my production because it has great baking functionality and excellent modeling/UV tools, and of course because it is open source and free. Even though Blender is free, I did look into some expensive options and was not very impressed. I kept coming back to Blender, which lacked a couple of features I want but had 99% of what I need… I figured I’d just add that 1% myself. This patch is half of that 1%. I also plan on adding more baking modes, such as “all lighting” (no textures), but I’m not sure about public interest in that one.
When it comes to raytracing at least, that’s one of the reasons why Brecht abandoned the render25 branch and started work on the Cycles engine: BI can get very slow if you need a clean raytraced image, and it would take major work throughout the code to get its speed anywhere near Cycles (the last thing BI needs is its raytracer getting slower).
In the past year or so, I have thought of BI becoming a comprehensive legacy-style rendering solution akin to RenderMan, instead of trying so hard to make it a comprehensive raytracer. That would mean making use of things like point-based GI, radiosity (a bit better than the implementation available in 2.49), a boatload of different specular and diffuse shading models, improved SSS, a better node system, more features that work together, etc. It might actually be less work than trying to make BI a raytracer, given what happened during the production of Sintel and the fact that the shading-system architecture would need a bit of shoring up before all of that could be added.
Anyway, I’m not saying I oppose the idea of improvements to BI, just that one needs to be aware of the work that might need to be done to ensure that new features can be added in a manner that doesn’t break anything and works as expected.
This is great; the results look pretty good. I think it would be worthwhile to do a side-by-side comparison between this and Cycles too, just for reference. Do you know if this affects any of the ‘post’-type effects like volumetrics and hair? I never got into BI (you’re right, it’s “Blender Internal”), so I don’t even know whether the different effects are handled through their own separate post-processes.
A patch review can be a simple comment only saying “+1” (I like it) or “-1” (I don’t like it). No justifications are needed for that, nor a note whether this is about functionality or implementation.
This reminds me of another idea I had a while back: I would like the ability to choose a separate renderer per render layer (note: I’m talking about render layers, not passes). That way, you could render most of your scene with BI and have one layer that renders only the GI or reflections in Cycles, or a layer that renders hair or smoke with BI while the rest of the scene goes through Cycles. I think it would be a very useful and unique feature.
BTW: did you really mean to say, “I’m saying I oppose” or “I’m not saying I oppose”?
I am very interested in this functionality. Blender is pretty much the only option for baking lightmaps on Linux and it is a pain to not have basic GI options when baking.
So yeah, +1 for a patch, +1 for inclusion to trunk and +1 for more baking options.
For context, you have thought of it, but neither you, nor anyone else is going to do anything about it, right?
@sgraham: Nobody is currently working on BI. If you’re interested in it (which I’m sure at least some users would still appreciate), I suggest getting in touch with the actual developers through the bf-committers mailing list, if you haven’t considered that already.
Baking certainly is one of the major reasons to stick to BI and indirect lighting is commonly desired for lightmaps. It would be interesting to try and create a feedback loop by baking lightmaps and then using them in a subsequent pass as a light cache.
In my opinion, the biggest concern isn’t so much backwards compatibility, but that something in BI breaks due to the present amount of messiness and hacks in the existing shading code, and that fixing it makes the code unmanageable or, at worst, creates more problems (based on what I’ve heard from users and developers in the past, of course).
So one of the utmost priorities for BI at the moment may indeed be a major cleanup job to help prevent that sort of scenario.
For context, you have thought of it, but neither you, nor anyone else is going to do anything about it, right?
Correct, it’s just a suggestion I put out on these forums as an idea of how BI could retain a nice position in the context of what Blender is today and where it’s going, because Cycles appears to be becoming Blender’s engine of choice when it comes to raytracing.
I would love it if you or someone else could test that for me. For volumes I am skeptical that it would work, because for volumes you really need to step along a path and take transparency into account. Having said that, I am running the indirect code through the same function used to get the color of a pixel, so it might work, but I doubt it. To be precise: it might work for indirect lighting, but it definitely would not work for the current implementation of occlusion. If, for example, you have a mirror next to a sphere, so that the mirror makes it look like there are two spheres side by side (one real, one mirrored), the occlusion pass treats the mirror as an occluder rather than reflecting the ray. That is the current implementation, and I don’t know how much need there would be to change it.
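Here is a toy sketch of why a mirror reads as a blocker in the current occlusion pass; everything below is a hypothetical stand-in, not Blender’s code. The point is that the sample loop only asks “did this ray hit anything?” and never shades or reflects the hit surface:

```c
#include <stdio.h>

/* Hypothetical stand-ins for illustration only. */
typedef struct Ray { float origin[3], dir[3]; } Ray;

/* Stub intersection test: pretend every other sample ray hits geometry.
 * The key point is that the result is binary -- a mirror registers as a
 * hit, and therefore as an occluder; the ray is never reflected onward. */
static int scene_intersect(const Ray *ray, int i)
{
    (void)ray;
    return (i % 2) == 0;
}

static float ambient_occlusion(const Ray samples[], int n)
{
    int hits = 0, i;
    for (i = 0; i < n; i++)
        if (scene_intersect(&samples[i], i))
            hits++;
    return 1.0f - (float)hits / (float)n; /* fraction of unblocked rays */
}

int main(void)
{
    Ray samples[8] = {{{0.0f}}};
    printf("AO = %.2f\n", ambient_occlusion(samples, 8)); /* AO = 0.50 */
    return 0;
}
```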
You might want to disable environment lighting in Cycles: the environment emits light by default in Cycles while it doesn’t in BI, so to get a fair comparison you need to be sure that the only light source in Cycles is the point light.