Renderman

frameworld

why do you dislike 3delight?

Unfortunately, it’s not quite that easy; the “perfectly workable” XML file that comes out of the Yafray exporter is sadly lacking information that would be required to get the most out of a RenderMan renderer. The most basic problem is that it outputs only triangles. A really good RenderMan exporter would need to take advantage of RenderMan’s ability to handle high-level surfaces natively, such as NURBS, patches, quadrics, etc. Also, as far as I know, it doesn’t include the information required to perform proper motion blur.
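To make the difference concrete, here is a minimal sketch (not taken from any existing exporter; the Python writer is invented for illustration) of the kind of RIB a RenderMan-aware exporter could emit: a true quadric plus a transform motion block, instead of a triangle dump. Shutter, MotionBegin/MotionEnd and Sphere are standard RenderMan Interface requests.

```python
# Hypothetical sketch: what a RenderMan-aware exporter could write instead of
# triangles. Shutter, MotionBegin/MotionEnd and Sphere are standard RenderMan
# Interface requests; the writer itself is invented for illustration.

def write_rib(path, shutter=(0.0, 1.0), offset=0.5):
    with open(path, "w") as rib:
        rib.write('Display "sphere.tif" "file" "rgba"\n')
        rib.write("Shutter %g %g\n" % shutter)        # open the shutter -> motion blur
        rib.write("WorldBegin\n")
        rib.write("  AttributeBegin\n")
        # Two transform samples across the shutter: the renderer reconstructs
        # the blur itself, something a single triangulated snapshot cannot express.
        rib.write("    MotionBegin [%g %g]\n" % shutter)
        rib.write("      Translate 0 0 5\n")
        rib.write("      Translate %g 0 5\n" % offset)
        rib.write("    MotionEnd\n")
        # A real quadric, diced by the renderer, rather than a fixed mesh.
        rib.write("    Sphere 1 -1 1 360\n")
        rib.write("  AttributeEnd\n")
        rib.write("WorldEnd\n")

write_rib("sphere.rib")
```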

I’m not saying the approach couldn’t work, just that the Yafray output isn’t the right starting point. Better would be to have a rich XML output that includes all the information required by any potential renderer. Better still would be a properly defined/engineered API. IMHO, and I don’t want to be rude here, the current developer access to the internals of Blender doesn’t really deserve the name ‘API’.

Paul Gregory

I realize that Yafray’s XML file may not be completely what is needed for different rendering packages; that is why I added

I guess you could infer from my point that I am basically advocating for an export to a universal XML file.

Which is, in effect, what you are referring to as “a rich XML output”.
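To sketch what “rich” could mean here (the element and attribute names below are invented purely for illustration, not an existing format), the file would carry higher-level surface descriptions, per-frame motion samples, and material parameters, rather than just a triangle dump:

```python
# Hypothetical "rich" scene XML. Tag and attribute names are made up for
# illustration; the point is to carry everything a renderer might want
# (surface type, motion samples, shader parameters), not just triangles.
import xml.etree.ElementTree as ET

scene = ET.Element("scene", frame="1", shutter="0.0 1.0")

obj = ET.SubElement(scene, "object", name="Ball", type="nurbs")
ET.SubElement(obj, "control_points").text = "0 0 0  1 0 0  0 1 0  1 1 0"
ET.SubElement(obj, "knots", u="0 0 1 1", v="0 0 1 1")

# Two transform samples so any renderer can reconstruct motion blur itself.
motion = ET.SubElement(obj, "motion")
ET.SubElement(motion, "matrix", time="0.0").text = "1 0 0 0  0 1 0 0  0 0 1 0  0 0 0 1"
ET.SubElement(motion, "matrix", time="1.0").text = "1 0 0 0  0 1 0 0  0 0 1 0  0.5 0 0 1"

ET.SubElement(obj, "material", name="red_plastic", diffuse="0.8 0.1 0.1")

ET.ElementTree(scene).write("scene.xml")
```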

I do not agree with you on the API front. This is because Blender is undergoing radical rewrites in several areas. By its very nature, it is difficult to tap all the functionality that may be available via an API. If an API is cast too early, then you are either casting in concrete things that may be of a transitory nature, or you are forced to continuously change said API and suffer version incompatibilities, which may turn out to be an administrative nightmare.

Luis

I beg to differ. Every project I’ve ever worked on has started out by designing a core; this core constitutes the main functionality of the system. Functionality is then layered onto the core, and if the core interfaces are properly designed in the first place, newly added functionality automatically becomes available via the generic API.

An API absolutely must not be ‘cast in concrete’; quite the opposite. If the core is properly designed in the first place, it should just work. Basically, areas of functionality should ‘announce’ themselves to the core (as plugins, of course) and the core should then make that functionality available via the same mechanisms to anyone who wants to query for it. Engineering the application in this way would have meant that any newly added functionality, be it part of the main project or a third-party plugin, would automatically be available via a C/C++ API, or even, if properly defined, Python, with no effort on the Python or API developers’ side. The fact that each time a new area of functionality is added to Blender, the Python developers have to work to catch up and include that functionality in the Python ‘API’ highlights just how poorly designed the internal core is.
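A very small sketch of that pattern in plain Python (all names invented; this is not Blender’s actual architecture): modules announce what they can do to a core registry, and a C API or Python binding only ever needs to talk to the registry, so it never has to “catch up” with new features.

```python
# Toy sketch of the "announce to the core" idea described above.
# Everything here is invented for illustration.

class Core:
    def __init__(self):
        self._services = {}

    def announce(self, name, func, signature=""):
        """Called once by any plugin or internal module to publish functionality."""
        self._services[name] = (func, signature)

    def query(self):
        """Generic discovery: a C/C++ API or Python binding enumerates this."""
        return {name: sig for name, (func, sig) in self._services.items()}

    def call(self, name, *args, **kwargs):
        func, _ = self._services[name]
        return func(*args, **kwargs)

core = Core()

# A new area of functionality registers itself once...
def subdivide_mesh(mesh, levels=1):
    return "subdivided %s to level %d" % (mesh, levels)

core.announce("mesh.subdivide", subdivide_mesh, "mesh, levels=1")

# ...and is immediately visible to every consumer, with no extra binding work.
print(core.query())                               # {'mesh.subdivide': 'mesh, levels=1'}
print(core.call("mesh.subdivide", "Cube", levels=2))
```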

Under no circumstances should an API for a product of the nature of Blender be hard defined. It should be fluid, automatically fluid, changing with the product as it changes. Just see Maya or 3DSMax for an example of how this works in practice.

Paul Gregory

An API absolutely must not be ‘cast in concrete’; quite the opposite. If the core is properly designed in the first place, it should just work.
I am not saying that it would be designed this way; I am saying that this is a consequence of trying to cast an API when the design is in flux, or at an embryonic stage.

It is not a matter of being properly designed or not; it is a matter of what it was designed for. As far-seeing as Ton is, I doubt that he had envisioned making Blender open source. He designed a tool that was appropriate for the task he was undertaking. The fact that internal access to the tool is being exposed means that certain things must change. If a project has as its aim the exposure of certain functionality via APIs, it is designed a certain way. If this is not the original aim, then retrofitting functionality can be very detrimental. I do not mind them taking their time to provide us with the appropriate functionality via a well-thought-out API. All I am advocating is that, in lieu of these APIs, we make use of a well-defined XML file. This would lessen the demand for direct access to Blender internals.

Basically, areas of functionality should ‘announce’ themselves to the core (as plugins, of course) and the core should then make that functionality available via the same mechanisms to anyone who wants to query for it. Engineering the application in this way would have meant that any newly added functionality, be it part of the main project or a third-party plugin, would automatically be available via a C/C++ API, or even, if properly defined, Python, with no effort on the Python or API developers’ side. The fact that each time a new area of functionality is added to Blender, the Python developers have to work to catch up and include that functionality in the Python ‘API’ highlights just how poorly designed the internal core is.
I am sure that some of these ideas have been considered and may be in the process of being implemented. I have not kept up to date with the developers’ logs to tell you one way or the other.

Under no circumstances should an API for a product of the nature of Blender be hard defined. It should be fluid, automatically fluid, changing with the product as it changes. Just see Maya or 3DSMax for an example of how this works in practice.

And it is precisely because of this that an API should not be rushed. I have nothing against APIs; I have a lot against rushed APIs.

I would rather wait a long while for Ton and his team to get it right than suffer the consequences of a rushed job. To buy that time, while people are clamoring for a way to export to the renderer du jour, I am suggesting the rich XML approach.

Luis

Actually, since Shaderman (and Shrimp, a Linux-based version of the same) already exists, there really needs to be no duplication of effort to create a shader editor. That being said, Akhil’s ribkit/sler tool is a decent first stab at a tree-based shader editor.

As far as Blender’s materials go, work is in progress to support those, as well as Blender’s lighting model. It may well be impossible to provide a 1:1 mapping between Blender and RenderMan, but nobody said I couldn’t try, did they? One major stumbling block is the fact that I have no access to Nodes from Python…but hopefully that’s being changed too.

For Neqsus/BtoR information (for all those interested), you can visit http://wiki.aqsis.org/guide/btor to get started with using it. The current SVN version is usable (as evidenced by Messiah’s results), but there’s still a lot of stuff I want to add, and documentation to be done.

A few tips:
This is not a lightweight script. If you expect to open a million windows like in the screenshot on the Aqsis wiki, be prepared for sluggish performance if you’re running an ATI card. The “blender OpenGL slowdown” bug is in full effect there.

For fun and exciting shaders, visit http://www.renderman.org and surf the shader library there. See the aqsis wiki for notes on using some of these shaders with Neqsus.

When constructing models for use with Neqsus/Aqsis, you might find that you need to modify your construction methods a tad bit.

NURBS surfaces will need to be converted to meshes (for the time being…the API support for NURBS surface access will be in the next blender release).

Note that the subdivision “levels” setting in the Blender subdivision modifier makes no difference at all to the RenderMan output you’ll receive.
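A plausible explanation (an assumption about how the exporter works, which I haven’t verified): if the mesh goes out as a RenderMan SubdivisionMesh, the renderer dices it adaptively at render time, so the modifier’s preview level never reaches the RIB at all. SubdivisionMesh is a standard RenderMan Interface request; the cube data below is only illustrative.

```python
# Sketch: emitting a cube as a catmull-clark SubdivisionMesh. The renderer
# subdivides it to pixel size on its own, so no "levels" setting is needed.
cube_points = [
    -1, -1, -1,   1, -1, -1,   1, 1, -1,  -1, 1, -1,   # bottom ring
    -1, -1,  1,   1, -1,  1,   1, 1,  1,  -1, 1,  1,   # top ring
]
faces = [
    [0, 1, 2, 3], [7, 6, 5, 4], [0, 4, 5, 1],
    [1, 5, 6, 2], [2, 6, 7, 3], [3, 7, 4, 0],
]

nverts = " ".join(str(len(f)) for f in faces)
verts = " ".join(str(i) for f in faces for i in f)
points = " ".join("%g" % p for p in cube_points)

print('SubdivisionMesh "catmull-clark" [%s] [%s] [] [] [] [] "P" [%s]'
      % (nverts, verts, points))
```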

If you receive errors regarding non-manifold data from Aqsis, then visit this page: http://wiki.aqsis.org/faq/index?s=non%20manifold

Also, pay close attention to your object’s normals, since inconsistent normals will also result in non-manifold data when the object is subsurfed.

I can generally be found in #blender, #aqsis, #blendercoders and #blenderchat on freenode if you have problems.

ShortWave

See above post

Shortwave, I must say, very impressive reply and a good collection of links! Did anyone ever think of just asking the guys at Pixie to help out? They are pretty much focused on the program Liquid (Maya to RenderMan), but they could give us a hand! Pixie has the features and possibilities and is open source, something 3Delight is not (Chekunen :D). Anywho, Aqsis, I personally believe, doesn’t have enough features, and progresses too slowly for Blender, which updates fairly quickly.

I think Aqsis has enough features to make it worthwhile; all it really needs is raytracing and GI, in my opinion. And it never occurred to me to ask the people at Pixie. I’ll go ask.

For the record, Neqsus/BtoR is intended to support the RenderMan spec to the best of its ability, and then renderer-specific functionality. I’ve added renderer-specific settings thus far for Aqsis, BMRT, and Pixie. 3Delight, RDC, Air, and Angel support will be forthcoming as I find time. My primary interest is in the open source arena, less so the commercial, but I’ll eventually cover all bases.

The bottom line is that while the “renderer-specific” details might not be in place yet, any .RIB file you create with Neqsus should be renderable with any renderman renderer.

Aqsis has plenty of feature support, by the way. Why would you think it doesn’t? And incidentally, what does “it progresses too slowly” mean? Why does the development rate of the modeller have to affect the development rate of a renderer that (currently) Blender has no native support for? It’s really an apples-to-oranges comparison, and has no valid basis.

Also, Aqsis is open-source as well. I’m curious what “features” you think Aqsis is lacking.

And in my own personal opinion, while raytracing and GI are nice, raytracing tends to be SLOW and GI…well, GI can be hard to control when it comes to animation. I personally consider GI to be something of a crutch…you get better results if you pay attention to detail when lighting your scene.

ShortWave

I don’t have a problem with other types of renderers, I just like raytracing best. I’d like to try Aqsis some more, but I forgot the names of a bunch of Python scripts, and a lot of them don’t work with newer versions of Blender.

EDIT: Oh, on the raytracing being slow part, have you seen the ray-tracing in Yafray? It’s really fast, IMO.

Shortwave, ease it on down, my friend, I’m not trying to start anything. But to answer your questions: Pixie has more of the same features PRMan has, that is all I am saying. If you are more comfortable with Aqsis, then fine; I have no grudge with that. I am just comfortable with Pixie, seeing as I used it with Maya and got pretty good results. And another thing: Pixie’s AO kicks any renderer’s ass (with the exception of PRMan), and they release new versions faster. So yes, I do believe Pixie is updated faster than Aqsis. Aqsis has had plenty of time to gain features on Pixie (and, when it comes to stability, potentially the upper hand), but it didn’t. Speed-wise, Pixie is much faster; however, Aqsis is more user friendly. I even admit Pixie is a bastard to work with at times. You know what, forget the independent renderers. Let the Blender community create their own Reyes renderer, like Ton hinted at doing. That way everyone is happy!

Man, I would like to write my own renderer. I’d call it Suzanne (because it would be 1:1 Blender compatible), and then I would make it open source, so if anyone wants to integrate it into Maya or something, they could, free of charge. That’d be cool.

But what would I have to learn?

On another note, sorry, Shortwave, I didn’t see your link to your wiki page on the exporter. It looks awesome; I’m going to try it.

-How to interpret different lamps
-How to recognize different materials and textures
-How to do reflections and refractions
-(optional) How to do photon mapping for GI, caustics, and SSS

You’ll need to learn quite a bit if you want it to be compatible with Blender and then have it support its features.
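As a small taste of the third item in the list above (just the standard mirror-reflection formula, nothing specific to Blender or any particular renderer):

```python
# Mirror reflection: r = d - 2 (d . n) n, with n a unit surface normal.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def reflect(d, n):
    """Reflect incoming direction d about unit normal n."""
    k = 2.0 * dot(d, n)
    return tuple(di - k * ni for di, ni in zip(d, n))

# A ray travelling straight down hits a floor facing straight up:
print(reflect((0.0, 0.0, -1.0), (0.0, 0.0, 1.0)))   # -> (0.0, 0.0, 1.0)
```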

Shortwave, I checked the wiki page and the integration looks quite good. If this leads the way to good photoreal renderings while supporting the node system and Blender’s procedurals, I may consider using it.

To back up ShortWave’s words and work, I’ve tried BtoR/Neqsus with both Aqsis and Pixie, and I can assure you that it works well with both renderers.
The real thing is to take advantage of each one. While Pixie is faster at rendering a simple scene with surface shaders and raytracing, Aqsis is way faster when it comes to rendering displacement shaders, like in the pic I posted in this thread. Moreover, you just have to tweak some settings in the exported RIB file to get some more functionality working, but I can say that learning a bit of the renderer spec and the RenderMan spec is the least you have to do.
I don’t think that Neqsus should support Blender nodes, as the node system is not as powerful as, say, Shaderman, which is designed to create RenderMan shaders. I think this support should instead come from Blender, where the node system is good but still needs improvements and, at the same time, could become fully compatible with the RenderMan shading language (like Maya’s). That way, the export could become much easier and the script would just have to get the shaders and compile them automatically for the renderer in use. Though all this is just ideas; I don’t know if this is really doable or not, but still, that would be great.
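For what it’s worth, the “compile them automatically for the renderer in use” step could be as simple as calling the renderer’s shader compiler from the script. A rough sketch (the compiler names, aqsl for Aqsis and sdrc for Pixie, are from memory and worth double-checking; the surrounding code is purely illustrative):

```python
# Rough sketch of per-renderer shader compilation. Compiler names are from
# memory (aqsl for Aqsis, sdrc for Pixie) and may need checking; the rest is
# invented for illustration.
import subprocess

SHADER_COMPILERS = {
    "aqsis": "aqsl",   # compiles .sl to .slx
    "pixie": "sdrc",   # compiles .sl to .sdr
}

def compile_shader(sl_file, renderer="aqsis"):
    compiler = SHADER_COMPILERS[renderer]
    # e.g. "aqsl myshader.sl" writes the compiled shader next to the source
    subprocess.check_call([compiler, sl_file])

compile_shader("myshader.sl", renderer="aqsis")
```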

Thanks for the reply, Messiah.

In truth, I’ve tried to reduce as much actual tweaking of the .RIB data as I can. The renderer-specific attributes panel is intended to allow a user to change those “specific” items in the UI without having to trace through the .RIB for some object or another. There’s some more advanced functionality on the back-burner for manipulating objects and lights and other interesting things…so once those get put in, you’ll start to see some real power emerge.

ShortWave

I’ve seen this comparison and conclusion a lot recently. I assume that you have rendered the same content in both to reach this conclusion? Would you mind providing me with the source for those renders so that I can determine the cause of the speed difference? As far as I know, except in a few ‘edge’ cases, Aqsis isn’t too far off the pace these days.

Cheers

PaulG