New Renderer - Alpha testers required...

Hi all,

Subject says it all…

I should have called it “YARWABES” (guess what that stands for … :wink: ) but I’m going with “Radium” for the time being.

It will ALWAYS remain free and maybe, at some point in the future, open-source (when I’m happy with the state of the code which, knowing me, will be “never” :wink: … joking).

I’ve been working on this on and off for the last year or so and, due to various circumstances, I’ve had a fair bit of time to polish up the main render engine and GUI in the past few weeks.

The main focuses (foci?) of this renderer are:

  1. Easy navigation and preview of the scene in real-time (constantly updating tiled view, Blender-like viewpoint control, solid and “real” preview modes etc. etc.)
  2. Complete control of every parameter, material, object, light-source through an Explorer-like interface.
  3. Progressive rendering using path tracing, plus configurable bias settings to merge path-tracing results and more conventional ray-tracing methods in the same image. Multi-processor support and basic multi-machine rendering support.
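
To give a rough idea of what “progressive” means here: each pass adds one path-traced sample per pixel and the preview just shows the running average. Below is an illustrative sketch in Java (not Radium’s actual code; the traceSample body is a stand-in for a real path tracer):

import java.util.Random;

public class ProgressiveAccumulator {
    private final float[] sum;          // running sum of radiance per pixel (greyscale for brevity)
    private final int width, height;
    private int passes = 0;

    public ProgressiveAccumulator(int width, int height) {
        this.width = width;
        this.height = height;
        this.sum = new float[width * height];
    }

    // One progressive pass: trace a single jittered sample per pixel and accumulate it.
    public void addPass(Random rng) {
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                sum[y * width + x] += traceSample(x, y, rng);
            }
        }
        passes++;
    }

    // Current estimate for a pixel: the mean of all samples gathered so far.
    public float estimate(int x, int y) {
        return passes == 0 ? 0f : sum[y * width + x] / passes;
    }

    // Stand-in for a real path tracer: returns a noisy value that averages out over many passes.
    private float traceSample(int x, int y, Random rng) {
        double u = (x + rng.nextDouble()) / width;
        double v = (y + rng.nextDouble()) / height;
        return (float) (0.5 + 0.5 * Math.sin(10 * u) * Math.cos(10 * v) * rng.nextDouble());
    }
}

The real accumulation buffer tracks full colour per pixel, of course, but the averaging principle is the same.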

Needless to say, it ticks all the usual boxes such as HDRI, texture mapping, bump mapping, procedural textures, arbitrary combination of shaders, basic primitives (sphere, box, square, disc, torus, cone, cylinder), UV-mapped triangle meshes with smooth normals etc. etc. There’s no point in me repeating a long list of features supported by every renderer you’re all already familiar with.

Internally it can do a great job of rendering GI with photons, but I’ve disabled photons for the moment. I’ve come to the conclusion that they’re a pain to use, and that the many hours spent fiddling about with irradiance cache settings, shadow refinement etc. are much better spent on final rendering. I want to stick with a “fire and forget” philosophy, which means … tweaking materials -> seeing what they look like in real time -> positioning the camera -> setting up the lights -> start final render -> go to bed -> wake up -> inspect results :wink:

You might already have seen a couple of images I’ve produced recently on the Contests forum (see “IT Department” and the original “Renderer Challenge” threads).

Here’s a link to a movie that shows off the real-time preview and object/material/shader/image browser as they stand at the moment.

www.trackvids.co.uk/radium_movie01.avi (21Mb … please “Save Target As…” before viewing)

Here are some links to various screenshots…

www.trackvids.co.uk/screenshot01.jpg
www.trackvids.co.uk/screenshot02.jpg
www.trackvids.co.uk/screenshot03.jpg
www.trackvids.co.uk/screenshot04.jpg

I would really appreciate a few volunteers to help with 3 phases of initial testing, likely to start within a fortnight or so…

1 - Manipulation of the preview scene (with a view to seeing how the GUI looks and operates on different OSes)
2 - Using the material editor and associated previewer
3 - Using it in conjunction with Blender (I have an export script at the moment but it needs more work)

I’ve also previously implemented and tested a scene previewer using Java3D (getting close to 40fps even with complex scenes) but I’ve disabled it for the moment. I’ve recently moved to a WinXP64 development platform and, unfortunately, Java3D doesn’t work on 64-bit yet (though I’ve been told the next version, due this month, will). This is a VERY cool feature and was worth the many, many days spent figuring out how to convert my internal scene data into a J3D Universe.
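
For anyone curious, the bridging boils down to building Java3D geometry from the internal mesh data and hanging it off a SimpleUniverse. The snippet below is a stripped-down illustration using the standard Java3D utility classes, not my actual converter:

import javax.media.j3d.Appearance;
import javax.media.j3d.BranchGroup;
import javax.media.j3d.Canvas3D;
import javax.media.j3d.Shape3D;
import javax.media.j3d.TriangleArray;
import javax.swing.JFrame;
import javax.vecmath.Point3f;
import javax.vecmath.Vector3f;
import com.sun.j3d.utils.universe.SimpleUniverse;

public class J3DPreviewSketch {
    public static void main(String[] args) {
        // One triangle standing in for a converted internal mesh.
        TriangleArray tri = new TriangleArray(3,
                TriangleArray.COORDINATES | TriangleArray.NORMALS);
        tri.setCoordinate(0, new Point3f(-0.5f, -0.5f, 0f));
        tri.setCoordinate(1, new Point3f( 0.5f, -0.5f, 0f));
        tri.setCoordinate(2, new Point3f( 0.0f,  0.5f, 0f));
        Vector3f n = new Vector3f(0f, 0f, 1f);
        for (int i = 0; i < 3; i++) tri.setNormal(i, n);

        BranchGroup scene = new BranchGroup();
        scene.addChild(new Shape3D(tri, new Appearance()));
        scene.compile();                                  // let Java3D optimise the graph

        // A Canvas3D hosted in an ordinary Swing frame becomes the preview viewport.
        Canvas3D canvas = new Canvas3D(SimpleUniverse.getPreferredConfiguration());
        SimpleUniverse universe = new SimpleUniverse(canvas);
        universe.getViewingPlatform().setNominalViewingTransform();
        universe.addBranchGraph(scene);

        JFrame frame = new JFrame("Preview");
        frame.add(canvas);
        frame.setSize(640, 480);
        frame.setVisible(true);
    }
}

From there, Blender-style viewpoint control would just be a matter of driving the ViewingPlatform’s transform from mouse input.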

I’ve also developed a textual scene description language that is not dissimilar to POV-Ray’s (i.e. procedural and Turing-complete) but easier to learn and use (IMHO). Creating a test scene with bump-mapped and texture-mapped primitives takes a couple of minutes armed only with Notepad.exe.

Please feel free to PM me or reply on here if you’re interested. I hope to get at least a couple of guys willing to thrash the hell out of this thing to find the (doubtless numerous) bugs … :wink:

Thanks very much in advance for any volunteers and finally … special credit to Christopher Kulla (fpsunflower, author of the excellent Sunflow renderer) who has done some sterling work on KD-Trees, so rather than re-inventing the wheel, I based my kd-tree on his :smiley:

If anyone wants to suggest other forums this might be usefully posted on, that would be cool.

Ian.

Micropolygon displacement mapping, good anti-aliasing (temporal and spatial), programmable shaders (i.e. not just combining premade ones)?

It looks pretty good so far. Can I ask why you decided to make another renderer instead of developing an existing one? We get so many renderers made by one or two people, and if you guys got together, I reckon you could make a seriously amazing one: the guys that made Sunflow, the Aqsis guys, the Pixie guy, the YafRay guys. As you’ve discovered by using the code from Sunflow, you are mostly trying to achieve the same goal, so instead of having multiple scene descriptions and shading formats, can’t you come up with a standard between all of you?

You could use the RenderMan standard and put Sunflow/Radium forward as extensions to the renderer.

No, yes (you get it for free with path tracing), and “nearly” (I am about to improve the performance and storage requirements of the SDL; once this is done, programmable shaders will come for free with a simple tweak. On the “to do” list…)
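
To show what I mean by shaders coming along “for free”: the sketch below is purely an illustration of a pluggable shader interface (simplified to greyscale, and not Radium’s actual internals), where a material is just an arbitrary combination of shader layers:

import java.util.ArrayList;
import java.util.List;

interface Shader {
    // Shaded contribution for a surface point (a single N.L term, for brevity).
    float shade(float nDotL);
}

class DiffuseShader implements Shader {
    private final float albedo;
    DiffuseShader(float albedo) { this.albedo = albedo; }
    public float shade(float nDotL) { return albedo * Math.max(0f, nDotL); }
}

class LayeredMaterial implements Shader {
    private final List<Shader> layers = new ArrayList<Shader>();
    void add(Shader s) { layers.add(s); }
    // Arbitrary combinations fall out naturally: just sum (or otherwise blend) the layers.
    public float shade(float nDotL) {
        float total = 0f;
        for (Shader s : layers) total += s.shade(nDotL);
        return total;
    }
}

With an interface like that, “programmable” shaders would simply be user-supplied implementations loaded alongside the built-in ones.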

Does anyone remember the BBC Microcomputer Model B … one of the breed of popular home computers back in the early 80’s? I was about 14 when I first owned one of these and about 16 when I wrote my first raytracer for it. It started off in BBC Basic but I soon realised this was far too slow to do my renders any justice (one diffuse sphere and one mirror sphere sitting above a chequered plane was taking about 8 hours to render at 160x256) so I hand-crafted the whole algorithm in Assembler. This improved things no end and I then had the performance capacity to render refraction and bubbles within reasonable amounts of time. Eventually my ray tracer could be left in a dark room, in front of an open-shutter camera and, hey presto, through the wonder of compositing, I could get 35mm 16M colour ray traces from the humble 8-bit-colour BBC Micro. During all of this I had to write my own floating point assembler library.

When I taught myself Java last year due to extreme boredom from being off sick from work for so long, the second thing I did (after a Mandelbrot explorer) was to ask myself, “I wonder how easy it would be to relive my youth and write a ray tracer now?”. Similar scenes that took 8 hours on a 1983 home computer took around 50ms on my P4 … and then I just tinkered, added, tweaked, prodded etc. and here I am a year later. It’s basically a hobby, and it’s fun to meet the challenge of achieving something oneself. I suspect a lot of the other guys share this sentiment with me.

On a serious note (although all the above is actually true) … my core renderer isn’t the fastest, or the slickest, or the most feature-rich, but it can still produce some good stuff (my first ever completed image came 3rd in the IRTC at the beginning of the year; see the Contests forum), and I think the GUI is starting to look very promising indeed. I’m very proud of the Java3D previewer and it’s a damn shame I can’t share it with anyone right now.

Regarding your other comments about everyone pooling their skills, I agree that’s the way forward. I’ll definitely release my GUI code soon and we’ll see if anyone wants to pick it up. In particular, I think Sunflow is very, very promising indeed (I’ve delved into the source code quite a bit and I love its structure and internal architecture) and I reckon it would take Christopher less than a day to hook my GUI into it (seriously).

Thanks for the comments and apologies for the long-winded reply … :smiley:

Ian.

Not to be rude, but osxrules has made my point as well…

But even beyond that, there’s the fact that GPU rendering is going to take over even faster once someone “gets it”. Hobbies are great to have, but real beta testers who want to test something because they really need to use the tool for real work are few and far between now.

For pros, the likes of Maxwell came along and lots of users hopped on, even though the price is sick… You’re creating it in Java; there’s no reason we can’t take on a greater challenge instead of making YASP.

Couldn’t be further from my mind. Lots of people render as a hobby too… :wink:

Ian.

so, how can i test it?

I would like to test it.

Hi Ian … I would like to test your renderer. I am very impressed by the quality you have already shown with your renders, and having a material editor makes it very interesting to me. But I have to admit that I don’t have too much time, because I am already dancing at too many weddings at the same time :wink: . I am one of the administrators/moderators at a different forum, a beta tester for another ray tracer, and I’m also writing the second part of the material editor tutorial for that ray tracer (hope to finish it this week :wink: )… so there is not so much time left.
But I would certainly test it, even if I don’t have the time to test it in depth for the moment…

Keep up the good work, it is really appreciated :cool:

Greetings Patrick

I’ll be keeping watch on this thread!

Koba

Bringing up the point of joining forces is not rude. It is actually something I have often asked myself. For example, why don’t they just put YafRay into Blender directly so it works 1:1 with it, apart from some differences like the light models?

But often life is different from what you want!

I completely agree with cekuhnen.

To all those who want people/coders to join forces and create the perfect, never-seen-before renderer: one supporting RenderMan shaders, unbiased photorealism but also NPR, micro-displacement, particles, complex IOR plus SSS and dispersion accurate at the nuclear quantum level, as fast as Blender’s ray tracer, not to mention having all the features available in Maxwell, Mental Ray and V-Ray, with a 3ds Max or Maya UI, all integrated into the next Blender release … :rolleyes: could you please open a new thread to talk about these very interesting things, because here somebody is willing to share his work for free and is asking for testers to improve his ray tracer.

Thank you

Greetings Patrick

Yeah, but it’s like the old long Sunflow post… In fact you could mirror all of those comments over to here and it would have the same outcome.

We who oppose are just trying to drill into some developers’ minds that it might be better to team up instead of making yet another rendering program… It’s going to be good after a while, but not yet; it’s going to have bugs, lots of them… mirroring the Sunflow thread.

What I guess I am saying is: do whatever, but it sure would be nice to see someone instead say “We developers of new rendering programs are teaming up to combine the best of both worlds and are integrating the whole of our work into the Blender source, so there will never be a need to export via scripts to a brand new RenderMan-like format”.

“We who oppose”? You mean you oppose the concept of free software, especially if it’s not exactly how you want it to be? I just want to be clear on that point…

Of course it will have bugs. This is the fruit of my efforts over a long period of time. I’ve had enough experience in the IT business not to be so naive as to think that, however much I personally test it, it won’t have bugs. Someone will come along, use it in a new and different way and, bang, there’s a bug. That’s software development in a nutshell and that’s why we test software.

Should hobbyist programmers be having stuff drilled into their minds? Seriously? For which mighty power are we working except that of our own enjoyment, curiosity and sense of challenge and satisfaction? What’s wrong with sharing? If we do share and you don’t like it, don’t criticise. If you were my employer then you’d have every right to but, basically, you’re not.

Perhaps if you’d not used the phrase “drilled into” (implying that somehow I and others work for you) and instead used something like “gently coerced through mass encouragement”, then maybe your post wouldn’t have come across as quite so, ummm … I can’t actually think of the right word without sounding offensive (and no offence intended at all, honest :slight_smile: )

Ian.

Ah, don’t mind me, I pick incorrect words all the time. Like I said, it’s all good and I’m not insulting your work, just being nit-picky.

Lol, no worries :slight_smile:

Ian.

I often feel embarrassed :o by this kind of post because it often looks as if Ton, or in general the coders behind free or open-source software, are working for the users, and it is not that way. We are lucky that we can use this software for free!
Now what makes me angry is that you point out projects that are made by one person, and when development is not so fast then it equals a failed project, but you don’t realise that beta testing is an important part of software development. If people don’t show interest in a project, how do you think the developer will feel about improving his software?

Normally things go like this: somebody wants to share his software with people. At the beginning everybody tries it out, but if they don’t get a good result the first time they stop posting their opinions and stop trying the software. Now this is not beta testing; this is trying a piece of software and deciding that if it doesn’t work as you expected then it is crap.
Wrong, wrong, wrong … What you really want is for other people to do the “hard” work of finding bugs and reporting them, giving ideas on how the software could be improved in workflow or missing features, and of course the coder doing all the fixing … and after all this is done, you use the software to show your renders, or most probably ask how to do this and how to do that and demand tutorials … etc.
This is not the spirit of free or open-source software … if you get something for free, you could maybe collaborate and not just wait for others to do the work.
Without feedback there is no free or open-source software.
Myself, I have had a positive experience collaborating on a one-man-developed piece of software (I am not going to make publicity here) and probably most of you don’t realise how powerful the program has become in quality and features …
And to end, I want to say that it is not about how many coders work on a program, it is about how good they are, and one good coder can be enough :wink:

Greetings Patrick

Hi,

You can count on me for alpha testing :slight_smile:
When and where can I find a distribution of your renderer?

Tom

You’ve sparked my interest. Is your renderer available for Linux?

It’s 100% Java so fingers crossed it works.

The ONLY thing I can think of that might be platform-specific is the signal handling I’ve implemented to allow graceful shutdown of non-GUI renders (a platform-friendly shutdownHook didn’t meet my requirements, for various good reasons). I believe that the Signal and SignalHandler classes are supported across all platforms in the Sun JRE, and maybe others too, but I’m not 100% sure.
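
For the curious, the pattern is roughly the following (a stripped-down sketch rather than the actual Radium code):

import sun.misc.Signal;
import sun.misc.SignalHandler;

public class RenderShutdownSketch {
    private static volatile boolean stopRequested = false;

    public static void main(String[] args) throws InterruptedException {
        // Trap Ctrl-C (SIGINT) so the render loop can finish the current tile cleanly.
        Signal.handle(new Signal("INT"), new SignalHandler() {
            public void handle(Signal sig) {
                stopRequested = true;
            }
        });

        while (!stopRequested) {
            // ... render the next tile here ...
            Thread.sleep(100);
        }
        // ... flush the partially rendered image to disk before exiting ...
    }
}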

Thanks to everyone for the interest so far. At the moment I’m lining up interested guys to do some testing/playing in advance of it being ready for a first release early next week (I’m targeting Monday 11th Sept) with full material/object viewing capabilities (including my current Blender export script), one week later for the full material editor, and another week after that for the Java3D scene navigator (as I said before, the Java3D integration code is fully tested but my current development platform doesn’t support it at the moment).

The first release next week will come with a bunch of scene data. Editing a material in a scene file takes around 30 seconds with a text editor (it’s a POV-Ray-like SDL but much easier to use), e.g. a diffuse and slightly reflective, Perlin-noise bump-mapped material with an image texture map is simply specified as:

 
colour c_white = 1.0;
image i_image = {file("filename.jpg")};
 
material m_material = {
    shader diffuse {
        c_white
        map image {i_image}
    }
    shader mirror {0.1, 0}
    bump_map noise {}
}

OR, using some shader shortcuts…

 
material m_material = {
    diffuse 1.0
    map image {i_image}
    reflect 0.1
    bump_map noise {}
}

This begs the question of why a material editor is needed at all, but that’s easily answered with “editing large scenes with lots of materials using a text editor is a pain in the ****” :wink:

Ian.