New Renderer - Alpha testers required...

Why did you guys choose Java?
Wouldn’t C or C++ be way faster?

Where is this magical, utopian renderer you speak of? :stuck_out_tongue: Seriously though, you speak of these things like they’re not possible, but if you look at all the renderers separately, they have most of those features between them, and it may be possible to combine them. I just think it’s a shame to have all that technical ability when ultimately artists can’t use it to its full capability.

Anyway, individuals are quite welcome to develop standalone renderers. I think any development is great and I love things like this:

Hand coding a raytracer in assembler is pretty impressive. I had to write a basic raytracer for a uni project in Java once and it is quite nice to know you’ve achieved something.

Why did you guys choose Java?
Wouldn’t C or C++ be way faster?

Not necessarily. BTW, you forgot to put on your asbestos suit.

Ian, that’s some good work. I’m very impressed by the render of the IT department.

Care to explain? I simply don’t know; I’m not a coder, but anything I’ve seen written in Java over the years has been slower than basically everything else. And is the joke about the asbestos suit supposed to imply I’m starting a flamewar? If so, you should know better, IanC; if not, oh well.

I’d love to play with your renderer, but I think I’ll wait for the public release as I’m new to 3D.

I really like the realtime scene preview

I know what you mean. A lot of Java programs are slower than a sleepy sloth. Still, the bread-and-butter calculations can be just as fast. A well-coded Java program can be very fast; it may be slower than C, but if written well the difference can be unnoticeable. A big problem with Java is the startup time. I hate having to wait while the VM loads.

I was talking about the fact that commenting on (or asking about) the speed of java vs c(++) often brings out language fanboys, and therefore flamewars are likely, not that you were trying to start one. I’ve been on these forums for long enough to know the civilized users, and you definitely fall into that category. No offence meant, sorry for any caused.

Ian

@IanC

Thanks for elaborating. Good work again. :wink:

It’s a valid question, especially in regard to the area of rendering, so I’ve put the flamethrower on “standby” for the time being :wink:

Reasons why I chose Java…

  • Cross-platform
  • Ease-of-development
  • Speed-of-development
  • Built-in garbage collector

Reasons why C++ might have been better

  • Raw speed for most things (on average).

However, for pure number crunching, all Java is doing is scooting requests off to the FPU. Ray-tracing is highly bottlenecked on the speed of the FPU so I’d be very surprised if a C++ version would be significantly quicker (maybe 10-30%?).
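To make that concrete, here is a minimal sketch of the kind of FPU-bound inner loop a ray tracer spends its time in: a ray-sphere intersection test. The class and method names are my own illustration, not Radium’s code.

```java
public class RaySphere {
    /**
     * Returns the distance along the ray to the nearest hit, or -1 if the
     * ray (origin o, unit direction d) misses the sphere (centre c, radius r).
     * This is pure double arithmetic: the JIT compiles it to straight
     * floating-point instructions, which is why the Java-vs-C++ gap on
     * workloads like this tends to be small.
     */
    public static double hitDistance(double ox, double oy, double oz,
                                     double dx, double dy, double dz,
                                     double cx, double cy, double cz, double r) {
        double lx = cx - ox, ly = cy - oy, lz = cz - oz;     // origin -> centre
        double tca = lx * dx + ly * dy + lz * dz;            // projection onto ray
        double d2 = lx * lx + ly * ly + lz * lz - tca * tca; // squared miss distance
        if (d2 > r * r) return -1.0;                         // ray passes outside
        double thc = Math.sqrt(r * r - d2);
        double t = tca - thc;                                // nearer root
        if (t < 0) t = tca + thc;                            // origin inside sphere
        return t < 0 ? -1.0 : t;
    }
}
```

A renderer calls a routine like this millions of times per frame, so it is the cost of this arithmetic, not of language overhead, that dominates.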

As a pure experiment, I tested a native Java compiler a while ago and my program only ran 2% faster (and that was through the critical-path code sections that make very few Java API calls anyway).

Ian.

@Droid42 (Ian)

Thanks for your answer, I’m learning more now, and I’d love to test it out
when it comes out as a relatively easy-to-use solution (Blender -> GUI -> no
need to edit stuff in a text editor).

Currently I’ve been “cheating” by using both Yafray and the Blender Internal
Renderer to get better results than just “one” of the two alone, will be much
fun to see what yours can do in the future.

If you care to know, the things I weigh most in a renderer are the following:

  • Speed
  • Cleanliness of render vs speed
  • Renderer’s capability to handle natural lights and shades vs speed
  • A way to get “dirt/shadows” easily into the minute details (that one is a life-
    saver + a timesaver)… I often use GI to do this (hence Yafray) because it’s
    faster than AO and cleaner too (with Cached pathlights).

Again - thanks for answering and the best wishes to your progress
with your application.

Jeez, how many renderers do we have to use to move on? For us little people who are still struggling to understand Blender’s other features, this is all confusing! What’s wrong with Blender’s renderer? Why can’t they just improve upon it if they’re not happy with it? Why do they have to start from scratch?


@3Dabbler
You’re new to this forum (6 posts) - welcome

You’re also right to be concerned; it’s VERY wise to learn the existing
renderer, to learn it well, and to know its strong sides as well as its shortcomings.

It’s my impression also that no one “new” to rendering or modeling in
general can tell the difference between these engines - but those who
have been in the business for quite a while - can!

Some of us use several rendering engines because different engines
have features that can be used more easily than in the “other packages”,
and that’s fine: it’s like a toolbox with a lot of useful stuff. Of course,
you need to know how to USE the tools first, and if you don’t have much
experience you don’t know what you need; with experience, the need grows!

It all depends on the program I think. I’ve always considered Java to be very slow and generally 5-10 times slower than an equivalent C-based program. Similar to results here:

http://bruscy.multicon.pl/pages/przemek/java_not_really_faster_than_cpp_0.html

After all, if Java really had little overhead, I’m sure it would be used far more. However, I would believe the development time could be lower using Java and that’s a very important factor. Plus you have more well documented pre-defined classes to work with.

I personally think that Blender’s code is one of the best choices with a mix of Python and C but you still have to deal with making the interface cross-platform.

Hi all,

Quick update. The first release is ready and I’ve PM’d all those who expressed an interest the details on how to get it…

Ian.

droid42,

I’d also like to test your renderer. Currently, I’m playing with indigo. It’s great, especially how it creates a physically based sky from a blender sun lamp. However, your renderer has me very interested for a couple reasons:

  1. Java - since I use a Linux machine, the thought of a native port of your renderer is very appealing
  2. Possibility of being open source. For many people, this is an important criterion for a software project.

BTW - I was the one that created the blender scene of the modern interior that is in some of your screenshots (although the scene itself is not my creation, I just re-modelled it in blender). I’m glad it’s being put to use. I’d be really happy to test it on my linux machine. Thanks.

I’m interested in helping you out, and if this turns out to be a capable renderer, then I would gladly use it as my primary renderer in my company. Just drop me a PM if you like :slight_smile:

Hi.

Well, here’s a first test I was able to do. This is after 20 hours of rendering and 218 passes by Radium. The whole scene is illuminated by an HDR probe, without any additional lights. My system is a 1.4 GHz Centrino with 757 megs of RAM running Linux.

http://jsnake.googlepages.com/218passes20hrs.png

So far, here are some of my observations after using radium:

  1. The blender exporter is great. It worked flawlessly on all the scenes I did initial testing with, and my uv-mapped textures were all exported perfectly.

  2. There are some artifacts in the image, specifically on both sofas. If needed, I would gladly share this scene to see why they are appearing.

  3. I’m not sure how to create blurry reflections. I’m not a material guru, so I need to be spoon fed here.

  4. It would be great if a uv-mapped object could be an emitter. For example, if I could set the tv to actually emit light based on the image that was mapped to it.

  5. Indigo has a great feature where a blender sun lamp is exported as a sun, and a physically accurate sky is also created. This leads to very nice looking renders with very little setup. I’ll start an indigo render using this technique right after I finish this post just to illustrate what I’m talking about.

  6. Can radium render an alpha background? Just another feature request :slight_smile:

  7. Periodically, radium saves an rimg file. What type of image file is this? Currently I’m just taking a screenshot since the UI becomes very sluggish after I start a render.

Well, that’s all I can think of right now. I’m very happy with radium at this point - I think it has a lot of potential. If there’s something in particular I could do to be of more use during the testing phase, don’t hesitate to ask.

Just for comparison’s sake, here’s the indigo render after 17 hrs. The only light I had in the blender scene was a sun lamp, and there is no hdri used here. The sky gradient and the sun is created automatically by the exporter. It looks like the same artifacts are appearing in the indigo render as well. I’ve rendered this scene with yafray, povray, sunflow, and blender’s internal renderer, and none of them have created these artifacts. Anyway, here’s the pic:

http://jsnake.googlepages.com/indigo17hrs.png

lakcaj,

Thank you very much for this testing so far :slight_smile:

I’ll insert comments to all your points below:-

1. The blender exporter is great. It worked flawlessly on all the scenes I did initial testing with, and my uv-mapped textures were all exported perfectly.

That’s very good news indeed. You can thank caronte for finally getting me off my bum and implementing the texture export functionality for the blender script :slight_smile:

I assume this means you’re using v.0.00.04?

2. There are some artifacts in the image, specifically on both sofas. If needed, I would gladly share this scene to see why they are appearing.

Yes, feel free to point me to where I can download the .blend and I’ll have a play.

3. I’m not sure how to create blurry reflections. I’m not a material guru, so I need to be spoon fed here.

I’ve added a “neat” feature to v.0.00.05 where all glossy reflections are implemented as micro-bumpmaps (far more physically accurate than using a Phong lobe). The Blender export script automatically translates the “HARDNESS” attribute into a suitable bump-map for the material. v0.00.05 will be out very soon … I’m just finishing off the new path-tracing engine (early results can be seen on www.radiumrenderer.com).

So, to set up a blurry reflection, all you’d need to do would be to specify a specular component to your material in Blender, set the HARDNESS value to something suitable (10 = very blurry, 511 = not very blurry) and then Radium will do the rest. So far, the micro-bumpmap results appear to be significantly more realistic than just using the phong shader (after all, this is why materials in real life are “glossy” … because their surface is not totally smooth).
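For the curious, the micro-bump idea can be sketched like this: perturb the surface normal by an amount derived from HARDNESS before reflecting. The hardness-to-roughness mapping and all names here are my own guesses for illustration, not Radium’s actual implementation.

```java
import java.util.Random;

public class GlossyJitter {
    /** Assumed mapping: high hardness (511) -> tiny jitter, low (10) -> blurry. */
    public static double roughness(double hardness) {
        return 1.0 / Math.sqrt(hardness);   // illustrative only
    }

    /** Perturbs the unit normal (nx,ny,nz) by up to r per axis, renormalises. */
    public static double[] jitterNormal(double nx, double ny, double nz,
                                        double r, Random rng) {
        double jx = nx + (rng.nextDouble() * 2 - 1) * r;
        double jy = ny + (rng.nextDouble() * 2 - 1) * r;
        double jz = nz + (rng.nextDouble() * 2 - 1) * r;
        double len = Math.sqrt(jx * jx + jy * jy + jz * jz);
        return new double[] { jx / len, jy / len, jz / len };
    }

    /** Convenience overload with an explicit seed. */
    public static double[] jitterNormal(double nx, double ny, double nz,
                                        double r, long seed) {
        return jitterNormal(nx, ny, nz, r, new Random(seed));
    }
}
```

Reflecting each ray about the jittered normal rather than the true one gives exactly the “rough surface” blur described above.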

For the time being, for the floor in your test scene (nice scene by the way) you could set the diffuse reflectivity (REF) to 0.5, set the specular reflectivity (SPEC) to 0.5, set HARDNESS to 511 and see what you get. If it’s still too blurry, edit the .scn and find the floor material … then, where it says “specular <r,g,b> <k>” set <k> to be around 1500 then you’ll get the effect you want. As above, all this .scn jiggery-pokery won’t be required in the next version…
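Putting those numbers together, the edited floor material in the .scn might look something like the fragment below. The keywords are guessed from the snippets quoted in this thread (only the `specular <r,g,b> <k>` line is confirmed above), so treat this as a sketch rather than real Radium syntax:

```
material floor_material = {
    diffuse <0.5,0.5,0.5>
    specular <0.5,0.5,0.5> 1500
}
```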

4. It would be great if a uv-mapped object could be an emitter. For example, if I could set the tv to actually emit light based on the image that was mapped to it.

Yes, I’ve realised this too.:o A very old version of Radium already implemented UV-mapped textures for light emitters, I just need to plug this functionality back into the latest version. It’s very easy to do so I’ll switch it back on in v0.00.05. You seem to have the perfect test scene for that…

5. Indigo has a great feature where a blender sun lamp is exported as a sun, and a physically accurate sky is also created. This leads to very nice looking renders with very little setup. I’ll start an indigo render using this technique right after I finish this post just to illustrate what I’m talking about.

Radium has this too (in the latest released version) but it’s very basic right now. Internally, Radium can paint a sun on the sky sphere and can specify the solid angle of the main sun, the colour of the main sun, the start and stop angles of the corona, the start and stop colours of the corona, and the rate at which the corona start colour merges into the stop colour. However, Radium doesn’t use MLT like Indigo so a tiny sun in the sky will result in extremely noisy renders right now. Again, the new path-tracer will render noise-free with a sun extremely quickly (e.g. a scene illuminated with just a sun will probably converge around 100 times quicker compared to the naive path-tracer).

6. Can radium render an alpha background? Just another feature request :slight_smile:

I’ll put my thinking cap on and see what I can do about that. Not trivial at the moment but as I’ve said to everyone … every feature request in the next few weeks will be implemented as long as I can work out how to do it :wink:

7. Periodically, radium saves an rimg file. What type of image file is this? Currently I’m just taking a screenshot since the UI becomes very sluggish after I start a render.

The .rimg format is a format I developed myself. Basically it contains all the HDR image data, along with session information, namely:-

  • number of passes
  • elapsed render time
  • the filename of the .scn file used to render the image
  • scene file hash code
  • image exposure (f-stops)
  • the assumed gamma of the image (for HDR images this is always 1.0)

The above information allows sessions to be resumed at any point in the future, as long as you’ve not changed the location or contents of the .scn file. Open up Radium, select “Open image” on the console toolbar and select a .rimg file. In the console you should see some information regarding the session that was used to generate the image. You should also see a message along the lines of “session is resumable”.
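As a sketch of what such a session header amounts to, here is one way the fields listed above could be serialised in Java. The real .rimg layout is Ian’s own and undocumented, so the field names, ordering, and encoding here are purely illustrative:

```java
import java.io.*;

public class RimgHeader {
    public int passes;            // number of render passes so far
    public long elapsedMillis;    // elapsed render time
    public String scnFile;        // filename of the .scn used
    public long sceneHash;        // scene file hash, for resume validation
    public double exposureFStops; // image exposure
    public double gamma;          // assumed gamma (1.0 for HDR data)

    public void write(DataOutputStream out) throws IOException {
        out.writeInt(passes);
        out.writeLong(elapsedMillis);
        out.writeUTF(scnFile);
        out.writeLong(sceneHash);
        out.writeDouble(exposureFStops);
        out.writeDouble(gamma);
    }

    public static RimgHeader read(DataInputStream in) throws IOException {
        RimgHeader h = new RimgHeader();
        h.passes = in.readInt();
        h.elapsedMillis = in.readLong();
        h.scnFile = in.readUTF();
        h.sceneHash = in.readLong();
        h.exposureFStops = in.readDouble();
        h.gamma = in.readDouble();
        return h;
    }

    /** Write-then-read through a byte buffer, to show the format round-trips. */
    public static RimgHeader roundTrip(RimgHeader h) {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            h.write(new DataOutputStream(bos));
            return read(new DataInputStream(new ByteArrayInputStream(bos.toByteArray())));
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```

The scene hash is what makes the “session is resumable” check possible: if the .scn has changed since the header was written, the hashes no longer match.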

At this point you can save the image in any format you like (gamma will be corrected and image exposure will be applied if necessary). Have a read of radium/doc/radium_gui.htm and you’ll see everything about image exposure and saving. For convenience, once the renderer has saved an image, you’ll see a new icon pop up on the console toolbar, which is a shortcut to opening the .rimg that the renderer saved.

In the Image window, if the session is resumable, you’ll see a green arrow on the toolbar indicating that you can restart the session exactly where it left off.

Also in the image window is a “Merge sessions” button (second icon from the left). Click on this and you can choose other .rimg files to merge with the one that’s currently open. Radium will then merge the image data, update the elapsed render time, number of passes etc. and (optionally) delete the .rimg files that you merged, saving the new one. Essentially this allows you to render the same scene on multiple PCs, collect all the resulting .rimg’s together and merge them together (the image will be significantly smoother and the algorithm is aware of how many passes each .rimg represents, so a high-noise image will get less emphasis than a low-noise image). The new .rimg file should also be resumable, so you can merge the output from many systems and continue the render on a single machine if you want.
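The pass-aware merge described here boils down to a weighted average over the HDR pixel data. A minimal sketch, with method names that are mine rather than Radium’s:

```java
public class SessionMerge {
    /**
     * Merges two per-pixel HDR buffers of equal length, weighting each by
     * the number of passes it represents, so a noisy low-pass image counts
     * for less. The merged buffer represents passesA + passesB passes.
     */
    public static double[] merge(double[] a, int passesA, double[] b, int passesB) {
        double total = passesA + passesB;
        double[] out = new double[a.length];
        for (int i = 0; i < a.length; i++) {
            out[i] = (a[i] * passesA + b[i] * passesB) / total;
        }
        return out;
    }
}
```

With 100 passes against 50, the first image contributes two-thirds of the result, which is exactly the “less emphasis for high-noise images” behaviour described above.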

7. Periodically, radium saves an rimg file. What type of image file is this? Currently I’m just taking a screenshot since the UI becomes very sluggish after I start a render.

Try setting the number of threads to 4 or below if you want snappier responses from the main GUI. Despite me having set the render priority low, and the GUI priority high, the JVM still seems to struggle to heave its way through all the render threads before the GUI sees any action. My dual Xeon renders happily with 512 threads but I get the same sluggish response from the GUI too. At 16 threads the GUI response is instant.
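For reference, the priority arrangement Ian describes looks roughly like this in plain Java; as he notes, the JVM treats priorities only as scheduling hints, which is why the GUI can still lag under hundreds of workers:

```java
public class RenderWorkers {
    /** Starts a render job on a minimum-priority daemon thread so the
     *  (higher-priority) GUI event thread wins the scheduler's attention. */
    public static Thread startWorker(Runnable job) {
        Thread t = new Thread(job, "render-worker");
        t.setPriority(Thread.MIN_PRIORITY); // hint only; some JVMs ignore it
        t.setDaemon(true);                  // don't keep the JVM alive on exit
        t.start();
        return t;
    }
}
```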

Thanks again (and to everyone else testing Radium at the moment). After the initial “Java3D” issue it seems to be fairly stable at the moment. I’m very excited about the new biased path-tracing mode so expect that in a couple of days…

Ian.

Hi again,

Just a quick point on sky spheres.

In Radium you can specify a procedural texture for any material. For example, to specify a sky sphere that graduates from blue at the top, through to cyan at the horizon, then down to black at the bottom, add the following to the .scn file:-

 
material sky_material = {
    emit 1.0
    map gradient {
        maptype 0
        colour_band 0.0 0.5 <0,0,1> <0,1,1>
        colour_band 0.5 1.0 <0,1,1> <0,0,0>
    }
}
set_sky {sky_material};

This feature, along with the sun, should eventually give you the same physical skies currently supported by Indigo.
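Assuming each colour_band linearly blends its two colours across its [start, stop] range (which is what the blue-to-cyan-to-black description suggests), the evaluation could look like this sketch; the maptype 0 coordinate convention is my guess:

```java
public class GradientBand {
    /**
     * Linear blend within one band: at coordinate t in [t0, t1], mixes
     * colour c0 towards c1 per channel. Following the sky_material above,
     * t = 0 is assumed to be the top of the sky sphere and t = 1 the bottom.
     */
    public static double[] evalBand(double t, double t0, double t1,
                                    double[] c0, double[] c1) {
        double f = (t - t0) / (t1 - t0);   // 0 at band start, 1 at band end
        double[] c = new double[3];
        for (int i = 0; i < 3; i++) {
            c[i] = c0[i] + (c1[i] - c0[i]) * f;
        }
        return c;
    }
}
```

For example, halfway through the first band the sky would be the midpoint between blue <0,0,1> and cyan <0,1,1>.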

There are plenty of other procedural textures supported (including Perlin noise, checker, pimples, bumps, pyramid, stripes etc.) along with a basic turbulence function, and these will be exposed fully in the material editor once I’ve finished it. The Blender export script appears fairly stable right now, so the material editor is second in priority to the new biased rendering mode…

Ian.

I can’t wait to test it :wink:

Otherwise, the material editor is much needed :smiley: