If I’m right, to enable importance sampling I have to check Sample as Light for the world, but if I do so, I completely lose the translucency of materials.
However, I get a noticeable increase in speed compared to the previous revisions with HDR lighting (and Sample as Light disabled).
???
Edit: @forferdeilig, maybe it has something to do with orthographic view?
Render speed comparison: official 2.61 vs latest builds - I’ve noticed a speed-up in render time, but it doesn’t happen in every kind of scene. Here: diffuse, glossy, velvet, translucent.
100 samples, GPU mode
2.61-linux-glibc27-x86_64 03:13.43
2.61-r43253-linux-glibc27-x86_64 02:32.90
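Converting the two totals above to seconds makes the relative speedup easy to check (a quick back-of-the-envelope calculation on the timings posted above):

```python
# Render times from the comparison above, converted to seconds.
official = 3 * 60 + 13.43  # 2.61 official:  03:13.43
latest = 2 * 60 + 32.90    # r43253 build:   02:32.90

speedup = official / latest
print(f"speedup: {speedup:.2f}x")  # roughly 1.27x, i.e. ~26% faster
```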
Most likely due to the fact that GA builds are almost always faster than official releases. Has to do with the size of the binaries, I think.
A quick test with 500 samples using the CPU on Mac OS X Lion
The goal was to try to get Suzanne to look like real glass, and I think it came out pretty well.
It’s just Suzanne, a plane, and an HDR image as the environment texture, nothing more. I’ll try something a little more advanced now.
I just love rendered glass; it’s like candy for the eyes =)
Sorry for my bad English, English isn’t my native language =)
Phew, just finished a massive set of Bidir-Pathtracer tests with the classroom scene:
I used the “difference from a noiseless image” method, except I figured out a way to do the comparison in the compositor and the measurement in the Image Editor (a cool feature I discovered a couple of weeks ago: click a pixel in the Blender Image Editor and it displays the RGBA, HSV, etc. data), so I didn’t even have to leave Blender (except for a few pesky crashes).
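For anyone wanting to reproduce that kind of measurement outside the compositor, the comparison boils down to a per-pixel difference against the reference render. A minimal sketch with made-up pixel values (my illustration, not the actual compositor node setup):

```python
# Hypothetical 2x2 RGB images: a noisy render and a "noiseless" reference.
render = [(0.50, 0.40, 0.30), (0.20, 0.25, 0.10),
          (0.90, 0.80, 0.70), (0.05, 0.05, 0.05)]
reference = [(0.48, 0.42, 0.30), (0.22, 0.25, 0.12),
             (0.88, 0.80, 0.72), (0.05, 0.07, 0.05)]

# Mean absolute per-channel difference: lower means closer to the reference.
diffs = [abs(a - b) for px_r, px_ref in zip(render, reference)
                    for a, b in zip(px_r, px_ref)]
mae = sum(diffs) / len(diffs)
print(f"mean absolute error: {mae:.4f}")
```

With real renders you would load the two images instead of hard-coding pixels, but the metric itself is the same.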
AFAIK bi-dir is supposed to give big advantages in interior rendering because of its ability to “find” light paths in a more intelligent manner than brute-force path tracing.
These graphs refer to the current, early experimental implementation.
In addition, storm mentioned it’s only partially complete: it uses only one mutation no matter how many you specify in the MLT panel.
It’s still so buggy that any benchmarking is useless; it can, for example, ignore important image information because of a bug, so what looks like a speed advantage isn’t one.
Edit:
Oops, false start: the image is too bright, the light-chain contribution is wrong. But it looks cool anyway ^^
Nightly Cyclone Builds (yes, I realize it’s 9:50 - in the US anyway)
I managed to get Windows building; you have to comment out the isfinite() calls (not part of the working MLT, according to storm, so no functionality should be affected). Linux 64 Windows 32
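For context, isfinite() checks of that sort are typically used to guard a sample accumulator against NaN/Inf values leaking into the frame buffer. A rough Python analogue of such a guard (my illustration, not the actual Cycles code):

```python
import math

def accumulate(buffer_value, new_sample):
    """Add a sample to an accumulator, skipping NaN/Inf samples.

    Commenting out a check like this (as needed for the Windows build)
    would mean bad samples get summed in instead of dropped.
    """
    if not math.isfinite(new_sample):
        return buffer_value  # drop the invalid sample
    return buffer_value + new_sample

total = 0.0
for s in [0.5, float("nan"), 1.25, float("inf"), 0.25]:
    total = accumulate(total, s)
print(total)  # 2.0 - the NaN and Inf samples were skipped
```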
So storm_st, exactly what are you developing: Metropolis Light Transport or Bidirectional Path Tracing? Or both?
MLT + BiDir, that would be ERPT, right?
I’m trying to enhance the current Cycles integrator with light contributions from long chains started at the light sources (a bidirectional tracer). At the same time I’m playing with Metropolis sampling, but it’s disabled for now, too complex. Actually, I’ve been jumping between those two features for the last ~2 months; now that I see the first successful pictures, I’m concentrating on finishing the bidirectional part. The funny thing is that I made it even more complex by adding a volume part, which helps me a lot with visual feedback on what happens inside the light chain. It’s still wrong; I need to work out how to collect and weight all the light contributions, and the MIS matter is too complex for me. I need to split Brecht’s optimised code back into an unoptimised form (I have trouble with even the most basic point-light contribution), then understand how to properly mix the separate paths’ light contributions back into one photon packet.
Metropolis sampling requires a guessing pre-pass; I have an idea how to avoid it for short sampling chains. Maybe it will be less efficient or completely wrong, we’ll see. But that comes later; I need to fix the absolutely wrong chain colors first.
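The MIS weighting mentioned above is commonly done with the balance heuristic, which weights each sampling strategy by its pdf relative to the sum of all pdfs for the same path. A textbook sketch (not necessarily what this patch will end up using):

```python
def balance_heuristic(pdfs, i):
    """Balance-heuristic MIS weight for strategy i: p_i / sum(p_j).

    `pdfs` holds the probability densities with which each strategy
    (e.g. BSDF sampling vs. light-chain sampling) could have generated
    the same path. The weights over all strategies sum to 1.
    """
    return pdfs[i] / sum(pdfs)

# Two strategies that could both have produced the same light path:
pdfs = [0.8, 0.2]
w_bsdf = balance_heuristic(pdfs, 0)   # 0.8
w_light = balance_heuristic(pdfs, 1)  # 0.2
print(w_bsdf + w_light)               # weights sum to 1
```

Scaling each strategy’s contribution by its weight before summing is what keeps the combined estimate unbiased.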
Do image textures work in the latest builds? For the last few months, none of the builds I’ve got from graphicall or the 2.61 trunk will work with image textures; all I get is a SOLID colour, which is the average of all the colours in the image texture, as if the texture were infinitely small and just repeating all over the mesh.