ugh... YafRay + "Do Composite" = does not composite?

i seem to have run into a huge problem that i can’t find any reference to, yet it happens across several platforms. this is, of course, a time-crunch situation where i’ve picked up Blender over 2 days for a project dumped in my lap. it would be nice to have this all done by tomorrow (01:00 now), but i’ve just blown three hours researching this problem.

rendering with YafRay completely ignores “Do Composite”. it outputs rendered frames or sequences, but it neither post-processes the frames nor hits the File Output node in my composition. YafRay does get post-processed when i update the composition’s Render Layer. using the internal renderer correctly outputs composited frames, including the frames from the File Output node. it just seems that the rendered YafRay frames aren’t being fed back into the composition.

this affects the ANIM and RENDER commands, and i’ve noticed another oddity: if the File Output is at the end of the chain and it’s the only output node there, i get “No Render Output Node In Scene” on re-render.

is this just some thing that’s simply not possible, and also not mentioned anywhere? or am i just totally missing something?

here’s the composition and render settings:

notes:

  • Blender seems to do this on OS X PPC & Intel (py 2.5 build), Win32, and an OS X PPC trunk build. i tried a sample file to see if it was just my .blend, but it does not appear so.

  • this is a high-quality rendering of a jewel, and all the lighting is set up for YafRay. using the internal renderer is not an option.
  • this is an animation, 150 frames. it would be nice to solve this in the most automated manner possible.
  • the composition is nothing special: a standard glow effect (blur, AlphaOver, etc.) and a z-mapped Defocus.
  • i would just do it as an image sequence with the same composition, if i could figure out how to process the image sequence… it doesn’t seem to advance. i’m going to render a z-data version overnight, in case i can figure it out.

in short: help! thanks,
- emilio

emilio,

Welcome to the forums!

To the best of my knowledge, compositing only works with Blender internal render.

  • Pretty sure the Manual (Compositing Nodes/Animations) covers this thoroughly. Make sure you’ve identified the Image Input as a Sequence, set the start frame and frame range, and enabled the little “auto(mobile)” button at bottom far right. Then your image sequence should advance with frame changes in the UI.
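In case it helps to see the arithmetic those settings control, here is a hypothetical sketch (not Blender’s actual code) of how a sequence input maps the current scene frame to a file in the sequence — the parameter names are my own:

```python
# Hypothetical sketch of the frame-matching arithmetic an Image
# sequence input performs: scene frame -> file number in the sequence.
def sequence_frame(scene_frame, start_frame=1, first_file=1, length=150):
    """Map the current scene frame to a file number in the sequence.

    start_frame: scene frame at which the sequence begins playing
    first_file:  number of the first image file (e.g. 0001.png -> 1)
    length:      how many files the sequence contains
    """
    offset = scene_frame - start_frame
    if offset < 0 or offset >= length:
        return None  # outside the sequence's range: nothing to show
    return first_file + offset

# with a 150-frame sequence starting at scene frame 1,
# scene frame 42 should read file 0042
print(sequence_frame(42))  # 42
```

If the sequence doesn’t advance, the first things to check are that the start frame and length actually cover the frames you’re rendering.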

emilio, I am wondering about something…did you know there is a user manual? I’m NOT trying to insult you, but I’m thinking that the manual needs to be somehow…i dunno…better advertised, if that’s the word…maybe more detailed help topics or something…

lol, no insult taken! like i said, i scoured everything i could find for hours and didn’t find anything that says you can’t plug YafRay into the compositor via ANIM or RENDER. unless i completely, utterly missed it, that’s a big hole in the manual. on a related note, i can’t figure out why it doesn’t work if i can still send previews from YafRay to the compositor. if i could code… well, you know that story.

as for the image sequence loading, that is probably my noobish sub-competence. i fiddled around with it today and got it to work before i caught the replies, and tonight i’ll crunch the frames with the sexytime DOF and glow (yeah, tonight, 1600x1200 and that DOF is expensive).

despite battling constant movie compressor crashes (yeah, i see why you do frames), off-kilter models, and a mystifying workflow, i’ve come away pretty impressed with Blender from just my “simple” project. it’s pretty fast, and i’m amazed that it hovers around 10-15MB. getting Blender and YafRay in <20MB is awesome for quick distributed rendering. i’m kinda glad work nudged me towards it, as it’s been years since i’ve touched the modeling/texturing/rendering world. that PayPal button is definitely going to be clicked, on work’s dime (of course).

since you brought up improving the documentation: yes. the “Quickstart” is laughable - a few tutorial links and far too many words. anything labeled “Quickstart” should treat the user like they’re 8: they can manage their own learning, but you still need to thoroughly cover the basics. the Quickstart should basically be a simplified version of the features page where every bullet point is a link to a tutorial or wiki page. at the top of the page should be a written and graphical overview of a typical Blender workflow with a few links to interface-related documentation. how it works, how you use it, and what it can do.

thanks for your responses! there are a few more questions, but those are for another post…
- emilio

Well, the compositor and sequencer work on rendered images. If Yafray is the renderer, then the compositor and sequencer don’t have anything to work on. The RenderLayer input node invokes the Blender internal (BI) render engine, which renders the scene and feeds it into the compositor.

To use Yafray with Blender’s compositor, render the image using Yafray, and then use the Image input node to get that image into the compositor where it can be post-processed. Wiki updated with this explanation.

awesome, thank you! it makes much more sense to think that other stuff is being passed around that YafRay doesn’t have access to. i need to read up on scene strips and optimize my workflow a good bit, as i’ll probably be doing lots of renders of this in the future.

it would be nice, for the sake of one-click long-haul uber-processing, if there was comp node access to YafRay that was triggered by “Do Composite”. maybe a special input node or type of render layer that would handle the reduced I/O? in the meantime, i might check out writing some Python to handle triggering Image node processing on a render frame completion. whee, just like Quartz Composer in reverse!
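a rough sketch of what that one-click batch might look like: render the YafRay frames first, then run a second pass where the compositor reads the sequence back in with an Image node and “Do Composite” enabled. the .blend file names and the two-file split are just assumptions for illustration; `-b` (background), `-s`/`-e` (start/end frame), and `-a` (render animation) are Blender’s standard command-line switches:

```python
# Sketch only: build the two Blender command lines without running them.
# File names here are made up -- substitute your own .blend files.
def build_commands(blend_render="jewel_yafray.blend",
                   blend_comp="jewel_comp.blend",
                   start=1, end=150):
    # pass 1: the .blend set to render with YafRay, writing frames to disk
    render = ["blender", "-b", blend_render,
              "-s", str(start), "-e", str(end), "-a"]
    # pass 2: a .blend whose compositor reads those frames via an Image
    # node (Sequence) and has "Do Composite" enabled
    composite = ["blender", "-b", blend_comp,
                 "-s", str(start), "-e", str(end), "-a"]
    return render, composite

render_cmd, comp_cmd = build_commands()
print(" ".join(render_cmd))
print(" ".join(comp_cmd))
```

in practice you’d hand each list to `subprocess.run`, waiting for pass 1 to finish before starting pass 2 so every frame exists on disk first.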

thanks again for the quick change there, PapaSmurf,
- emilio

There is, just load the images generated from YafRay into the compositor using the Image Input node. Click Do Composite and there ya go.

But yafray still can’t output anything but RGB, and you can’t do things like Vector blur and stuff, even when exported to an EXR sequence.

Oh snap! there’s a manual! I’ve been using the compositor since it came out and I’ve never known there was a manual. Cool!

Yeah, more advertisement would help I think.


Kevin

Render out the normal, Zdepth, etc. layers in Blender internal, and render out the beauty pass in Yafaray. When you need to do the DOF, use the Zdepth layers (and whatever else you need) from Blender internal. That should get you by :slight_smile:
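To illustrate the idea (a toy sketch, not the Defocus node’s actual algorithm): the z-depth pass just tells you, per pixel, how far from the focal plane you are, and that distance decides how much of a pre-blurred copy to mix in. The function and parameter names here are made up for the example:

```python
# Toy illustration of z-depth-driven defocus: blend between a sharp
# render and a pre-blurred copy, weighted by distance from the focal
# plane. Images are flat lists of grey values for simplicity.
def defocus_mix(sharp, blurred, zdepth, focus_z, max_dist):
    """Per-pixel mix: the further a pixel's z is from focus_z,
    the more of the blurred image it receives."""
    out = []
    for s, b, z in zip(sharp, blurred, zdepth):
        t = min(abs(z - focus_z) / max_dist, 1.0)  # 0 = in focus, 1 = fully defocused
        out.append(s * (1.0 - t) + b * t)
    return out

sharp   = [1.0, 1.0, 1.0]
blurred = [0.5, 0.5, 0.5]
zdepth  = [2.0, 5.0, 9.0]   # focal plane at z = 2, falloff over 7 units
print(defocus_mix(sharp, blurred, zdepth, focus_z=2.0, max_dist=7.0))
```

The real Defocus node does proper per-pixel blur radii rather than a single pre-blurred copy, but the z-to-weight mapping is the part the internal-render Zdepth pass supplies.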