Renderman

After seeing “Elephants Dream” I was impressed by how it looked. How does it stack up against RenderMan, the program Pixar uses? Does anyone know?
Blender seems so professional to me. I have seen a few videos made with other free 3D software, and they look like toys compared to Blender.

digiman

Blender IS a professional tool - Period.

I think you’re confused. The word RenderMan has multiple meanings. Take a look at http://en.wikipedia.org/wiki/RenderMan. Also, http://www.renderman.org/

RenderMan (actually PhotoRealistic RenderMan) is just a rendering engine, not a 3D modeling/animation program. Pixar uses software like Maya for modeling and animation, then renders the scene with RenderMan. So in theory, if you have any 3D program, a working link to RenderMan, and a licensed copy of RenderMan, you can render with it. Of course the software isn’t cheap, so there are other rendering engines that implement the RenderMan standard (mostly REYES renderers), but as of now none are tightly integrated with Blender. Ton and the other developers are planning to add that possibility in a future version of Blender.
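For reference, the RIB file that any RenderMan-compliant renderer consumes is just a plain-text scene description. A minimal one looks roughly like this (a hand-written sketch, not output from any exporter):

```
# Minimal RIB sketch: one plastic sphere lit by a distant light
Display "sphere.tiff" "file" "rgb"
Format 320 240 1
Projection "perspective" "fov" [30]
Translate 0 0 5
WorldBegin
  LightSource "distantlight" 1 "intensity" [1.0]
  Surface "plastic"
  Sphere 1 -1 1 360
WorldEnd
```

Whatever exporter Blender eventually gets, its job is essentially to generate files like this from the scene data.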

cannot wait until we can get that to work !

Hear hear! :stuck_out_tongue:

Thanks Roofoo, I did not know that. I understood that RenderMan was a program like Maya, made by Pixar. Good to know; anyway, I was just curious. When I started with this program a few months ago I thought it might be a waste of time (I have tried about 4 or 5 free modeling programs, and they were toys in comparison). It turns out that, now that I am getting used to it, this program is the best, and the forum is very helpful.

digiman

Pixar’s RenderMan is only one among several.

Others are:
3Delight - industrial strength

Pixie and Aqsis are strong as well.

Cekuhnen, I think that for good RenderMan support we’ll wait forever!
In the past 5 years I have seen too many discussions about RenderMan support. We all want it, but no developer has the desire/time/knowledge to do it, and without a developer this isn’t going to happen automagically.
I hope I’m wrong, I really hope… I want RenderMan support too.
Bye

It’ll come. These things take time, y’know. Getting full support for RenderMan’s capabilities takes a fair amount of work. I know that I’ve thought of numerous ways to get it to work and written pages and pages of ideas on how best to do it with Blender in its current state. You have to know Blender’s API as well as the RenderMan spec, and find ways where it’s not only possible to convert one set of data into another, but in a way that makes sense to the user, doesn’t require a developer to set up, and is also reliable and flexible enough to use in production.

There are a few flaws in Blender’s own APIs that cause issues. One is that the Python API doesn’t read the modifier stack, so you can’t tell what geometry to output. If you do it manually through the script per object, how do you apply that setting to multiple objects? How do you apply the same shader to loads of objects, and how do you remember those settings? How do you animate shader input values, given that Blender has fixed IPOs for individual objects? There also seems to be a bug whereby reading the crease values of edges on an object that has a Subsurf modifier fails, so in order to get Python to read the crease values from Blender you have to remove the modifiers from your objects. Then you’ve got lights, particles, shadow maps and deep shadow maps, UVs, arbitrary output variables, passes, curves, imager and volume shaders, motion and deformation blur (which, by the way, are reason alone for wanting RenderMan: RenderMan hardly increases render time for true motion blur, unlike Blender’s factor of 8), etc.
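To give a feel for the plumbing involved, here is a rough sketch of a per-object export step against the 2.4x Python API. The object access and attribute names are from memory, and the crease reading runs into exactly the limitation described above, so treat this as illustration rather than working exporter code:

```python
# Sketch only: dump the first selected mesh object as a RenderMan
# PointsPolygons call.  This reads the raw mesh data, NOT the modifier
# results -- which is exactly the limitation mentioned above.
import Blender

ob = Blender.Object.GetSelected()[0]   # assumes a mesh object is selected
me = ob.getData(mesh=1)                # Mesh data, pre-modifier

nverts  = [len(f.verts) for f in me.faces]                # verts per face
vertids = [v.index for f in me.faces for v in f.verts]    # flat index list
points  = [c for v in me.verts for c in v.co]             # flat x y z list

rib = open('mesh.rib', 'w')
rib.write('PointsPolygons %s %s "P" %s\n' % (
    str(nverts).replace(',', ''),
    str(vertids).replace(',', ''),
    str(points).replace(',', '')))
rib.close()

# Edge creases (0-255 in Blender) would map to subdivision "crease" tags,
# but as noted they may not be readable while a Subsurf modifier is active.
for e in me.edges:
    print e.v1.index, e.v2.index, e.crease
```

And that is only geometry for a single object; shaders, lights, passes and motion blur all need similar (and messier) treatment.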

I’ve managed to get a usable script and I’m trying to improve it as often as I can. I’m sure you’ve also seen how much work Shortwave has done as well as the guy who made Ribber. Trust me, work is being done to get a workable renderman interface from Blender.

Ok, here I go sticking my nose where it does not belong, but why do we have to constantly re-invent the wheel? We currently have a relatively reasonable rendering system with Yafray: it takes a Blender scene and is capable of generating relatively good output. It is cognizant of UV textures, bump maps, and other procedural textures. It understands meshes and curve-derived objects. It generates a fairly comprehensive XML file that describes the scene. You see where I am going with this? Why not use the XML file generated by Yafray as a starting point and use a filter to create a reasonable RIB file?

Yes, I know it is not as simple as that, but as I see it this is a multipart problem: extracting a scene description from Blender, and feeding it to a different rendering engine. If the current Yafray XML file is deficient, maybe creating an enhanced standard XML output with all the bells and whistles that the different renderers (Yafray included) need would be an answer. I think that having each render engine do its own interface to Blender is wasteful, prevents diversity, and promotes complexity.
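To make the “filter” idea concrete, the transcoder could be a fairly small script that walks the XML and spits out RIB. The element and attribute names below (mesh, point, face, verts) are invented for illustration; they are not the actual Yafray schema:

```python
# Hypothetical XML-to-RIB transcoder sketch.  The tag names are made up;
# a real version would follow whatever the exported XML actually contains.
import xml.dom.minidom as minidom

doc = minidom.parse('scene.xml')
rib = open('scene.rib', 'w')
rib.write('WorldBegin\n')

for mesh in doc.getElementsByTagName('mesh'):             # hypothetical tag
    points, nverts, ids = [], [], []
    for p in mesh.getElementsByTagName('point'):          # hypothetical tag
        points += [p.getAttribute('x'), p.getAttribute('y'), p.getAttribute('z')]
    for f in mesh.getElementsByTagName('face'):           # hypothetical tag
        idx = f.getAttribute('verts').split()
        nverts.append(str(len(idx)))
        ids += idx
    rib.write('PointsPolygons [%s] [%s] "P" [%s]\n' %
              (' '.join(nverts), ' '.join(ids), ' '.join(points)))

rib.write('WorldEnd\n')
rib.close()
```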

Just an opinion.

Luis

hey

I did not say nobody is working on it. There is no need to ask Shortwave all the time about his progress. When it’s done, it’s done.

I also understand that Renderman integration is not a weekend project.

I only mentioned that I am looking forward to using it.

Hi to all :slight_smile:
Actually, the work of ShortWave is pretty impressive.
I’ve tried his exporter named “Neqsus” (formerly BtoR) and it is very promising.
I also gave him feedback on things I thought needed improvement (just a few things, nothing that would require revamping the whole thing).
As for what I’ve come up with, it’s pretty satisfying, but I’ll let you judge from this render I made with the Aqsis renderer. The model is made with Blender and exported via the Neqsus interface. The SL shaders were made with Shaderman and compiled for the renderer afterwards. The model has a skin with an SSS shader (based on the SSS shader from the MakeHuman site), a displacement shader, and a shader for the eyes. All textures were prepared with Photoshop.
The render took about 7 minutes at a resolution of 1280x1024 px.
http://www.putfile.com/pic.php?img=3579464
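For anyone who hasn’t seen RenderMan SL, the shaders mentioned above are small C-like programs compiled per renderer. The classic matte shader from the RI spec (not the SSS shader used in this render, which is far more involved) looks like this:

```
/* Standard "matte" surface shader from the RenderMan Interface spec */
surface matte(float Ka = 1, Kd = 1)
{
    normal Nf = faceforward(normalize(N), I);
    Oi = Os;
    Ci = Os * Cs * (Ka * ambient() + Kd * diffuse(Nf));
}
```

Tools like Shaderman generate this kind of source from a node graph, and the renderer’s shader compiler (aqsl for Aqsis, for example) turns it into a compiled shader.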

Cheers, Messiah

O_O i want THAT!!!111

Ton is working on improving the API for integration of external renderers, and specifically mentioned renderman integration in his talk at Siggraph. When that will come about is a worthwhile question.

Ok, here I go sticking my nose where it does not belong, but why do we have to constantly re-invent the wheel? We currently have a relatively reasonable rendering system with Yafray: it takes a Blender scene and is capable of generating relatively good output.

RenderMan and friends have a large preexisting body of shaders, and a large number of renderers understand the RIB format. RenderMan is a renderer in wide professional use, and the RenderMan specification is an industry standard. Those are the pro-RenderMan reasons.

Why not use the XML file generated by Yafray as a starting point and use a filter to create a reasonable RIB file?

More direct integration is desirable so that the materials, lighting, interactive rendering, and other features can be used. Also, generating an XML file for a scene of any complexity will be rather time-consuming.

LetterRip

LetterRip,

Sorry, from your last post it is clear to me that I failed to get my point across. I did not mean that we should stick our heads in the sand and embrace Yafray as the end-all renderer.

My point was that there is already a great integration with Yafray, and part of this integration is the export of the Blender scene to a perfectly workable XML file.

Part of my point was: if this is already there, why not strive to use the XML file as the source for the translator (transcoder, actually)? This reduces the work that has to be done in Blender.

I guess you could infer from my point that I am basically advocating an export to a universal XML file.

All renderer ports (Kerkythea, generic RIB, Yafray, REYES, POV-Ray, etc.) could use it as a starting point.

This means that Blender’s development would be confined to the generation of this file, and would avoid being jerked around by requirements for each new rendering package.

This is not to say that this file format would be static; as new features are required, or flaws and deficiencies are resolved, newer versions can be released, but it would be much less like trying to hit a moving target.
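To picture it, such a renderer-neutral scene file might look something like this. Every element and attribute name here is hypothetical; it is not an existing format, just a sketch of the idea:

```xml
<!-- Hypothetical renderer-neutral scene description -->
<scene version="1.0">
  <camera type="perspective" fov="40" from="0 0 5" to="0 0 0"/>
  <light type="distant" intensity="1.5" direction="0 -1 -1"/>
  <mesh name="Plane" subdiv="catmull-clark" levels="2">
    <points>-1 -1 0  1 -1 0  1 1 0  -1 1 0</points>
    <faces>0 1 2 3</faces>
    <material ref="red_plastic"/>
  </mesh>
</scene>
```

Each backend (RIB, Yafray, POV-Ray, and so on) would then only need a reader for this one format instead of its own Blender interface.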

Luis

I know what you’re talking about, BOr3d. I guess the best reason would be that if they started from scratch, the code would be cleaner (a problem that Blender has), and everything would be faster.

Wow, another Blender to RenderMan page! Well, I will tell you first hand (I have experience with RenderMan and Maya) that both pieces of software are elite in their categories. Now, granted, Blender is absolutely amazing (hence why I switched to it), and if we were to get RenderMan support then it wouldn’t matter what features Maya had, like fur or hair, seeing as we would be able to simulate that in RenderMan, better looking and faster.

Oh yeah, that’s another point. I doubt many of you realize how fast PRMan is! It’s the number one rendering engine in the world for a reason! Ton is supposedly working on a REYES renderer for Blender. This could only mean closer competition between us and Autodesk (owners of both 3ds Max and Maya, as well as a slew of CAD programs). I am rooting for us and Softimage (XSI) to kick Autodesk’s ass! I love the underdog; that’s why I love Blender: it has so much ability under the hood, and will hopefully be able to fully go one on one with the big boys! I know it will.

We NEED RENDERMAN! This has been thought about long enough, and the programs that did come out didn’t really do it well. I say we take the best one so far, Blenderpixie, and update that, integrating it into the software. Pixie has the same power and possibilities as PRMan, just as Blender has the same power and possibilities as Maya. They were made for each other!

Speak, Brother, speak!

RenderMan integration would be nice. I am not really interested in material export, just reliable mesh export with subsurface.

Yeah, BE, it’s better to assign them in the RIB itself. By the way, what’s the word on Shaderman? Now think: making the node material editor like that of RAT or Shaderman (both use nodes). Then you have another step toward becoming integrated with RenderMan!
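For illustration, assigning a shader directly in the RIB and exporting a subdivided mesh with a crease could look roughly like this. The parameter values and the crease data are made up, and I am working from the RI spec from memory, so take it as a sketch:

```
AttributeBegin
  Surface "plastic" "Kd" [0.7] "Ks" [0.4]
  # One quad as a Catmull-Clark subdivision surface, with one creased edge
  # (between vertices 0 and 1, sharpness 5.0) -- this is roughly where
  # Blender's subsurf setting and edge creases would end up.
  SubdivisionMesh "catmull-clark" [4] [0 1 2 3]
      ["crease"] [2 1] [0 1] [5.0]
      "P" [-1 -1 0  1 -1 0  1 1 0  -1 1 0]
AttributeEnd
```

Because the shader binding lives in the RIB, you can swap or tweak materials without touching the mesh export at all, which is exactly why doing it RIB-side is attractive.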