Brecht's easter egg surprise: Modernizing shading and rendering

Luxology modo has the best and fastest preview renderer. If this is what it’s all about, then these are exciting times for Blender, indeed. Any rough estimate when this is going to be ready? Just curious.

Put Brecht 100% on this, that seems to be the idea. Get it done before Mango starts.
Get all the GSoC '10 work into Blender SVN, re-evaluate what's possible with Blender, and use it for Mango.

Imagine "Cycles" plus OpenCL nodes (I hope Brecht can pay a visit to J Bakker and get help with OpenCL in Blender)… and if/when libmv tracking can be used. Droooool.

Ever tried modo? It’s got an unbiased renderer and yet it can separate passes just fine.
http://video.luxology.com/modo/301/video/RenderOutputMasks.mov

Lol, win!

I hope Brecht shows light groups, like in LuxRender: http://www.youtube.com/watch?v=8Wef8pZiW_g

*since he had some features not yet revealed*

I don’t know about you guys, but I would love to have UV textures be droppable from files or from within Blender straight into the shader node editor, and just be able to hook a texture up to the “diffuse” or “specular” socket.

And also make it a bit smart: if it’s a UV image texture or bitmap file, its mapping is UV; if it’s a generated tileable texture from a node, it could instead use Generated mapping…
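Something like this little heuristic, as a minimal sketch (the function name and the `texture` dict are hypothetical, not Blender's actual node API):

```python
def pick_mapping(texture):
    """Guess a sensible default mapping for a newly dropped texture.
    `texture` is a hypothetical dict like {"kind": "image"}; real node
    code would inspect the actual node or datablock type instead."""
    if texture["kind"] in ("image", "bitmap"):
        return "UV"          # painted/photographed textures follow the UV layout
    return "Generated"       # procedural tileables map fine in generated coordinates

print(pick_mapping({"kind": "image"}))
```

The point is just that the editor could pick the common case automatically and still let you override it.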

since Brecht states on code.blender.org:

There are also issues like UV mapping and texturing workflow that need to be addressed.

I for one would prefer to have it node-based as well. It’s going to be totally insane, the things you could do! Animating textures, their mapping, and how they influence the final shader output.

Will there be a discussion about this?

When the guy presses the render button in that modo video you linked, the render completes in seconds. In my opinion that is because it is reading a stored photon map from disk.
Great to know that it is possible to have render passes.

Nope, it’s time-lapsed, and saves 24 seconds of our lives :wink:

Wow. I finally watched the video, and man I am impressed. This is going to be an AWESOME year for Blender.

I was looking at this video of Lumion3D:

Where it says that it renders in real time at great quality, and in 2 seconds per image at final render quality.

How is that possible?
I think they use something that LuxRender and Blender Internal don’t, and I would love to describe it here, so perhaps Brecht will implement it if possible (in a future revision of the renderer, if not right now in this first one he is working on).

I think what makes that possible is a smart feeding of objects to the renderer. Imagine you want to render an image with a house in the foreground on the left, and another house on the right, around 200 meters from the camera. The left house is a total of 200 pixels tall, and the far right one measures around 20 pixels tall in the rendered image. Both houses share the same geometry and textures.

What LuxRender would do is feed the renderer all the polygons the house has, plus all the textures, using a lot of memory for those textures. The second house would use an instance of the first one (sharing the same data in memory).

What I think Lumion3D does to achieve that impressive speed of 2 seconds per image is to build a voxel object in memory, like what you get when you scan a 3D object: a cloud of points in space. It would store around 200 thousand voxels, each holding a location, the normal at that exact spot on the original mesh, a color, and material properties (glossiness, transparency, coating for multilayer materials). The original house might use a texture of around 2 megapixels for the door alone, so you can see this is ridiculously less memory: the entire left house for something like ten times less. The right house would take even less. And of course it would probably be better to build the voxel representation at twice the pixel height the object will have in the render (this is, for example, the usual rule for textures, for better interpolation).
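For fun, a back-of-the-envelope estimate in Python (every field size here is my own guess, not anything Lumion actually documents):

```python
import struct

# One surface sample ("voxel"): position (3 floats), normal (3 floats),
# RGBA color (4 bytes), and three material scalars (glossiness,
# transparency, coating weight). All field choices are guesses.
VOXEL_BYTES = struct.calcsize("3f 3f 4B 3f")   # 40 bytes per sample

voxel_mb = 200_000 * VOXEL_BYTES / 2**20       # the ~200k samples from the post
door_tex_mb = 2_000_000 * 4 / 2**20            # 2-megapixel RGBA door texture

print(f"point cloud: {voxel_mb:.1f} MiB vs. door texture alone: {door_tex_mb:.1f} MiB")
```

Interestingly, with these guessed sizes the whole 200k-point cloud costs about as much as the single 2-megapixel door texture, so the savings would come from replacing *all* the textures and polygons at once, not any one of them.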

Things must be planned for objects with refraction or volumes; in those cases it would perhaps be better to feed the renderer the polygons and textures as usual, because, for example, if you want sun rays appearing in all their glory as volumetric fingers of God touching the Earth through the clouds, it would be simpler to feed the renderer a cube that is the volume’s domain, plus the material/textures the volume uses.

This idea of representing meshes as little floating voxels is very, very old. I remember it first being shown at an Imagina festival (the SIGGRAPH of the era when all of this began). I remember seeing a human model and the narrator saying it was a lot of colored pixels in space, not a mesh.

So for each frame you want to render, you read the object and materials and create pixels in space as if it were a scanned object, at double the pixel height it will have in the rendered image. Then you feed the renderer the scene with all the scanned objects, plus perhaps some objects still in mesh form (the ones that use refraction or volumes).
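A toy sketch of that “scanning” step, scattering random points on each triangle (real code would weight the sample count by each triangle’s on-screen size, per the double-height rule):

```python
import random

def sample_surface(triangles, samples_per_tri):
    """Scatter points over a mesh surface, as if it had been 3D-scanned.
    triangles is a list of ((x,y,z), (x,y,z), (x,y,z)) vertex tuples."""
    points = []
    for a, b, c in triangles:
        for _ in range(samples_per_tri):
            # Uniform barycentric coordinates inside the triangle.
            u, v = random.random(), random.random()
            if u + v > 1.0:                  # fold the unit square onto the triangle
                u, v = 1.0 - u, 1.0 - v
            w = 1.0 - u - v
            points.append(tuple(u * a[i] + v * b[i] + w * c[i] for i in range(3)))
    return points

# One unit triangle, 100 "scan" points on its surface.
cloud = sample_surface([((0, 0, 0), (1, 0, 0), (0, 1, 0))], 100)
```

Each point would then also carry the normal, color, and material values sampled at that spot, as described above.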

I think that is what Lumion3D does, and that would explain the negligible RAM use (probably; I don’t know that for sure) and of course the speed. I would love for Brecht to read this and tell us what he thinks. Perhaps he is doing this already with that Suzanne planet he showed in the video? If not, you can call this “new” method Bicycle!!! :rolleyes:

A renderer doesn’t have to be a path tracer to be unbiased. :wink:

edit:

I’m pretty sure Lumion uses “game tech”; it’s more like a game engine…

Yeah, the demo is similar to those of CryEngine, for example…

Still, Cycles is far from being a production-ready render engine, so people might want to think twice before proclaiming that Blender will have a render engine as good as XX. But it is great having Brecht back; if there is anyone who could pull off a thing like that, it’s him. Personally, though, I’m mostly looking forward to the shading system and the render API improvements he mentions.

CUDA is da bomb. It is the future. modo’s renderer is great if you have the dollars to build monster machines, but it can’t hold a candle to CUDA performance at the same price. CPU vs GPU = GPU wins.

Guys, please don’t b*tch. Brecht is back and he is pimping our renderer; let’s just celebrate.

From working on open movies, I know pressing F12 and waiting minutes for the render to even start is a nightmare, so it’s one of the things I’m trying to address.
Ahh, the memories. :slight_smile:

Quantum processors are the future, not CUDA or OpenCL. CUDA and OpenCL are just things to play with until they decide to unveil the new technology.

About game engines: game engines mainly try *not* to calculate things. How can you calculate anything when you need to present at least 25 frames per second? What they do is tricks. For example, they paint the diffuse color on all the objects, and so they have the diffuse pass. Then they paint a pass using textures with the ambient occlusion baked in, a pass with baked reflections, and a pass with glow objects baked. Then they composite the passes and get a very nice image. So really it is like a stack of diffuse passes; nothing is calculated (only some of the shadows). What I explained above is absolutely different. And it is old technology. We live in a world where we are fed old technology with only the envelope changed (yes, pushing it :-).
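A one-pixel sketch of that kind of pass compositing (the weights are made up, just to show there is no real light calculation, only multiplies and adds over pre-baked data):

```python
def composite(diffuse, baked_ao, baked_reflection, baked_glow):
    """Combine baked passes for one pixel; all inputs are (r, g, b)
    tuples in 0..1. The 0.3 reflection weight is arbitrary."""
    return tuple(
        min(1.0, d * ao + 0.3 * refl + glow)
        for d, ao, refl, glow in zip(diffuse, baked_ao, baked_reflection, baked_glow)
    )

pixel = composite((0.8, 0.4, 0.2),   # painted diffuse
                  (0.9, 0.9, 0.9),   # baked ambient occlusion
                  (0.1, 0.1, 0.2),   # baked reflection
                  (0.0, 0.0, 0.0))   # baked glow
```

With everything pre-painted, the per-frame cost is just this arithmetic per pixel, which is why it runs at game frame rates.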

So Cycles could be used for commercial productions?

Erm… sure, why not? :smiley:

Lumion3D uses the GPU for rendering. If you look at Brecht’s video, when he switches to GPU rendering, the render times are indeed under 2 seconds.

Interesting, have you looked at the upcoming version of SmallLuxGPU in action? It’s really fast! And it runs on all OpenCL hardware: all GPUs, all CPUs, all cores used, regardless of the manufacturer.

YouTube (available in HD): Voronoi bricks - SmallLuxGPU v1.8beta1 preview with Blender 2.5 and Bullet Physics
(Look at the real-time demo after the brief test animation - on a home PC with older hardware :))

Could SmallLuxGPU also be integrated into a Blender window like this? Will Blender have an easy-to-use generic interface for other renderers to integrate like this? Thanks.

Wow… best easter ever.

Just wondering: what about the development of Freestyle for Blender? Will it be ported to Cycles easily, or will it have to be redone from scratch again? :confused:

The Lumion3D renderer features just simple game-engine graphics with a bit of volumetric god rays, (cubemap?) reflections, and SSAO. The renderer can’t calculate any sort of GI approximation either (the new CryEngine 3 can do that in real time!). So this Lumion3D thingy isn’t really what you wanna have as the default renderer in Blender, as it is kinda inferior in terms of realism even to the current Blender Internal renderer with its raytraced soft shadows + AO.