The defocus node used to have a “samples” option for the preview. The current build doesn’t have this option. At what point did they get rid of it?
I ask simply because it was THE ONLY WAY to get decent quality out of that node. I’ve been sitting here trying to get DoF working, to no avail; it looks like garbage. This is especially true now, since they’ve clearly also altered the way the node does its thing. I have geometry for a tree. Let me show you some pictures…
So far so good, right? This gives the expected results: focus is on the tree, and the cloud and the horizon/hill are nicely blurred.
This is a trainwreck of awfulness. The cloud is in focus. The tree should be blurred all to hell, but it isn’t. The base of the trunk is blurred, and so is its shadow. The interior of the geometry is blurred, but the edges aren’t. It looks terrible. I’ve been playing with this node for DAYS trying to make it not look like complete crap, and this is the best I could get.
So I want to know when they changed the node to this new crap version that sucks. The old version also sucked, but it had that preview sample option. You could crank that puppy up and get really great results, and it would do the blurring PROPERLY, without sections of the image being defocused wrong or not at all. I honestly don’t know how anyone with eyes could fail to see that the preview option gave better results than the ‘normal’ option.
I’m trying to do some animated scenes. I could use Cycles to render this stuff out, but Cycles takes five minutes to chew through one frame, while Blender’s internal renderer takes about 10 seconds: roughly a 30× difference. I would like to know in which version they axed the preview option so I can download that build. I would also like to know the most likely way to let the developers know that the defocus node worked better before. They clearly aren’t aware (nor is anyone else, apparently, as I’ve not seen this issue brought up anywhere…)
I would also like to know whether this is the sort of thing one could write oneself with PyNodes. I’m assuming you can parse the z-depth pass with Python, so it shouldn’t be too difficult to put together a reasonably workable solution, right? I mean, if even the crappiest of indie videogames can slap DoF in now, I find it mildly amusing that a dedicated 3D app A) has to use compositing to do it at all and B) has a compositor that can’t do it very well, either. Surely someone out there can tackle the near-impossible task of getting even passable DoF working in Blender?
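For what it’s worth, the basic idea is simple enough to sketch outside Blender. This is NOT how the Defocus node works internally; it’s just a minimal, assumption-laden example of a z-depth-driven blur in plain NumPy/SciPy: compute a per-pixel “circle of confusion” from the distance to the focal plane, then blend between progressively blurred copies of the image. The function name, parameters, and grayscale-only input are all my own invention for illustration. Note that it naively ignores occlusion, which is exactly why cheap approaches bleed at edges — the same class of artifact complained about above.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def simple_defocus(image, depth, focus_depth, max_sigma=4.0, levels=4):
    """Naive depth-of-field for a 2D grayscale array: blend between
    progressively blurred copies, weighted by each pixel's z-distance
    from the focal plane. No occlusion handling, so edges will bleed."""
    # Circle of confusion: 0 at the focal plane, 1 at the farthest pixel
    coc = np.abs(depth - focus_depth)
    coc = coc / (coc.max() + 1e-8)
    # Stack of increasingly blurred copies (level 0 = the sharp image)
    stack = [image] + [
        gaussian_filter(image, sigma=max_sigma * i / (levels - 1))
        for i in range(1, levels)
    ]
    # Map each pixel's CoC to a fractional index into the blur stack
    idx = coc * (levels - 1)
    lo = np.floor(idx).astype(int)
    frac = idx - lo
    out = np.empty_like(image)
    for i in range(levels):
        mask = lo == i
        j = min(i + 1, levels - 1)
        # Linear blend between the two nearest blur levels
        out[mask] = (1 - frac[mask]) * stack[i][mask] + frac[mask] * stack[j][mask]
    return out
```

Pixels exactly on the focal plane come straight from the sharp image; pixels at the depth extreme get the full `max_sigma` blur. A proper implementation would also scatter samples by CoC and respect depth ordering, which is where it stops being a weekend script.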