Blender GPU render in material editor.

I really believe that when we’re editing materials in the material editor, the preview image should be rendered on the fly with the GPU. Wouldn’t this speed up shading by crazy amounts? Could you imagine changing colors and instantly seeing the rendered result?

The whole scene wouldn’t have to render, just the preview sphere. Yes, the preview shows the material now, but it’s not completely accurate.

It would be awesome to incorporate GPU rendering into Blender itself, but having it in the material editor/node editor would be sick :slight_smile:

And where would the difference be, or the deeper point?

If you want to GPGPU-render the material preview, you need a GPU-accelerated render engine in the first place.
And if you just make the material preview a separate GPGPU raytracer, who would want their materials to look different in the preview than in the final render?

Or in other words: if you start replacing the material preview’s functions with GPGPU code, you might as well recode the whole raytracer.
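
To make that concrete, here’s a minimal CUDA sketch (purely hypothetical, nothing from Blender’s actual code) of what “sharing the functions” would mean: one shading function compiled for both the CPU path and a GPU preview kernel, so the preview can’t drift from the final render.

```cuda
#include <cuda_runtime.h>

struct Color { float r, g, b; };

// __host__ __device__ compiles the SAME function for CPU and GPU,
// so preview and final render share one shading implementation.
__host__ __device__ Color shade_lambert(Color albedo, float n_dot_l) {
    float k = n_dot_l > 0.f ? n_dot_l : 0.f;  // clamp backfacing light
    return Color{ albedo.r * k, albedo.g * k, albedo.b * k };
}

// GPU path: a material-preview kernel shading one pixel per thread.
__global__ void preview_kernel(Color albedo, const float* n_dot_l,
                               Color* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = shade_lambert(albedo, n_dot_l[i]);
}

// CPU path: the final renderer calling the identical function.
void shade_pixel_cpu(Color albedo, float n_dot_l, Color* out) {
    *out = shade_lambert(albedo, n_dot_l);
}
```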

However, using SmallLuxGPU you can just make a scene with a sphere and use the live feedback to edit your materials. I guess that’s what comes closest to your request :wink:

I wasn’t aware that a GPU render would look any different from a CPU render. How would that change the way it looks? I thought the only difference between CPU and GPU rendering is which processor it’s using.

The reason I think it would be nice is that we could see exactly what our materials will look like in the final scene without re-rendering, since the way a material looks in the material editor usually isn’t how it ends up looking in the final scene.

From what I gather, the advantage of using the GPU to calculate material previews is that it would be much closer to realtime than calculating them on the CPU.

Maybe I’m completely wrong with this suggestion and it’s pointless to add; I don’t know a ton about the whole render process.

Oi. My bad. You’re right, the look would not change; you would simply move the same calculations onto the GPU. My mind was a leap ahead, thinking of redoing BI GPU-based :smiley:
Enhancing/revamping BI in one go while implementing GPGPU, starting with the material preview… that would change the look.
Monday pre-caffeine fail =) A stupid snowplow got me up early.

I don’t know the code for the material preview, but I guess it’s the same as BI uses, so from my POV it would only make sense to revamp BI and, along with it, the material preview, not vice versa…
On the other hand, the material preview might be a nice sandbox to start with. It depends on whether the material preview uses the same functions that BI calls or has separate ones.
You’ve got me nosy now… I think I’ll have to look into the code =)

But there are plenty more things I want added to Blender or fixed first.
Render engines are like shoes: you need them to walk outside, and depending on whether it’s wet, hot, cold, or snowy, you have to choose the right pair. =)

Hooray for the freedom of choice.


Was that inspired by some program? :slight_smile:

I do believe that the BI render engine needs a revamp. Here’s what I’d propose for it: let us select whether to use the CPU or GPU to render, and also be able to produce an unbiased render (I believe that’s the terminology for it; basically brilliant photorealistic renders).

It’s probably a longshot at best, but it’s something to keep in the back of the devs’ minds.

We can roll out the good old BI discussion here =)

Basically, for animations I use BI; it’s kickass for that, and you can tune your frames, especially for stylized animated stuff, to render in ~5-10 seconds each.
If I want kickass photorealistic renders I use an unbiased engine (though most are biased somehow anyway); that’s the grand idea of render engines. I’d opt to put more focus on Blender exporters for external render engines and let BI be what it is… maybe enhance it, optimize it, but don’t make it an unbiased physical raytracer.
What if BI gets physically correct and unbiased?
What biased scanline raytracer could we use then?

Sure, I miss caustics or other stuff sometimes, but there are ways around that in animations, and for photorealistic work I use Octane or SmallLuxGPU.

Off-Topic: Discuss things not related to Blender or Computer Graphics

Wrong topic Killer :wink:

I suspect that we will always see the CPU- and GPU-based rendering systems remaining at some arm’s length from one another. One issue, of course, is that not all GPUs are the same … not by a long shot … and that they are highly optimized for sending their output downstream, to the video display, not back upstream, as an image-raster for the CPU’s consumption.
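
To illustrate that “back upstream” cost, here’s a minimal CUDA sketch (hypothetical image size, with a stand-in for an actual render kernel): pulling a finished raster out of GPU memory is an explicit, synchronous copy over the bus, which is exactly the direction the hardware is not tuned for.

```cuda
#include <cuda_runtime.h>
#include <cstdio>
#include <cstdlib>

int main() {
    const int w = 1920, h = 1080;                            // hypothetical preview size
    const size_t bytes = (size_t)w * h * 4 * sizeof(float);  // RGBA float raster

    float* d_img = nullptr;
    cudaMalloc(&d_img, bytes);   // the rendered image lives in GPU memory
    cudaMemset(d_img, 0, bytes); // stand-in for a real render kernel

    float* h_img = (float*)malloc(bytes); // CPU-side raster for further processing
    // This call blocks until the GPU finishes and the bus transfer completes,
    // sending the image "upstream" against the display-oriented data flow:
    cudaMemcpy(h_img, d_img, bytes, cudaMemcpyDeviceToHost);

    printf("read back %zu bytes\n", bytes);
    free(h_img);
    cudaFree(d_img);
    return 0;
}
```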

It would be quite difficult indeed, and very disruptive, to try to re-work such a “feature” into Blender’s massive, existing code base. And I have serious doubts that it would pay off even if it somehow were successful.

I’m fairly confident, though, that “render nodes” will provide an optional solution. A node, of course, is a well-defined part of the greater whole. It never attempts to do everything for you. Also, it may be presumed that the people who wander into this territory of Blender probably know something about what they are doing. :smiley: Which means they can work within the limits of each such node, and can make appropriate choices as to how, when, and whether to deploy it.

And there will also be dedicated GPU-based renderers, e.g. Octane, that people can export to.