Hello people!
I have a question regarding some Blender compositor functionality and the way Blender handles textures.
I was wondering whether it would be possible to use the compositor output directly as a material texture input. That is, a node that constantly streams its output into the shader editor through, say, an Image Texture node. That would make all the compositor's tools available and let it act as an independent texture editor for different render engines.
This brings me to my second question. When modifying textures, we can use a Viewer node to see the output update in the UV/Image editor. So do we first have to save out the modified image, or could we just read it from memory, where it should already be? My reasoning is that if the modified texture were saved to disk, we would have to import it again and therefore use more memory, whereas we could access it directly inside Blender without the save-and-reimport round trip.
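For what it's worth, here is a rough sketch of what that in-memory access could look like in Blender's Python API. It assumes you run it inside Blender, that the compositor's Viewer node has written to the default `'Viewer Node'` image datablock, and the destination name `'CompositorTexture'` is just an example:

```python
import bpy

# The compositor's Viewer node writes into this image datablock (in memory).
viewer = bpy.data.images['Viewer Node']
w, h = viewer.size

# Create (or reuse) a destination image that an Image Texture node can point at.
name = 'CompositorTexture'  # assumed name, pick anything
dst = bpy.data.images.get(name)
if dst is None:
    dst = bpy.data.images.new(name, width=w, height=h, float_buffer=True)
elif tuple(dst.size) != (w, h):
    dst.scale(w, h)

# Copy the RGBA float pixels straight from memory -- no save/reload round trip.
dst.pixels = viewer.pixels[:]
dst.update()
```

This is a one-shot copy, not a stream, but it shows that the pixels are already sitting in RAM and never need to touch the disk.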
I have done hardly any coding before, but I feel this workflow could simplify and benefit rendering in Blender. Most of the work is done with textures anyway, so why not do it there? Render-engine-specific operations, like 3D noises and AO, would still need to be done in the shader editor.
It already kind of works: the Image Texture node sees the Viewer node, but it does not send usable output. I also thought of solving this with a method similar to the livelink between Substance Painter and Blender.
As far as I know, sadly, there is no livelink in Blender between the compositor and the Image Texture node.
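You could approximate one with a polling script, though. A minimal sketch, assuming the default `'Viewer Node'` image and a destination image named `'CompositorTexture'` that your Image Texture node references (both names are assumptions, and this must run inside Blender):

```python
import bpy

def sync_viewer_to_texture():
    # Pull the current Viewer node result and push it into the texture image.
    viewer = bpy.data.images.get('Viewer Node')
    dst = bpy.data.images.get('CompositorTexture')  # assumed destination name
    if viewer and dst and viewer.has_data:
        dst.pixels = viewer.pixels[:]
        dst.update()
    return 0.5  # re-run every half second

# Blender's timer API keeps calling the function at the returned interval.
bpy.app.timers.register(sync_viewer_to_texture)
```

Crude compared to a real livelink (it copies the whole buffer on every tick), but it stays entirely in memory.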
I am puzzled by the question.
There is no Grease Pencil image data unless you render it. After rendering, it is just a raster image, not a Grease Pencil object, which is built from layered strokes.
But after you render GP, I fear you are limited to what View Layers can generate.