In Blender terms, it would be realtime (node-based) compositing available in the 3D viewport, but in the gray real world it sounds too good to be true.
What I’d love is for a ‘camera nodetree’ to bring some final-render camera effects into the viewport too, such as distortions, bloom, color balance/grading, etc.
I’m having a hard time understanding what it would be used for. Just a faster replacement for the current compositing area?
Yes. That should end up being a faster workflow.
Currently, when you are looking at your scene in Rendered preview mode, EEVEE may take a moment to compile shaders and calculate effects.
But once that is done, except for complicated things that need to be recalculated, it takes no time to inspect your scene from a different angle or at a different frame, or to enable/disable a collection of objects.
The same benefits are expected for viewport compositing.
Some simple things would take no time at all, and complicated ones would recompute only what is necessary.
You should be able to make 3D adjustments to your scene that improve the shot faster, instead of losing render time exploring them and, maybe, limiting yourself to 2D adjustments because of that.
In other words, it should not help you with 2D effects that retouch the final render; but it should help you with effects based on 3D objects.
Any chance of blend-mode-style interactions between separate 3D objects, instead of only having access to blending modes inside a material's MixRGB node and in the compositor?
One of the main things blocking the use of Blender for work that is as easy and as fast as After Effects is that the After Effects environment is simultaneously capable of basic 3D manipulation of the objects on its canvas and of applying blending modes between them.
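For reference, the blending modes in question are just per-pixel formulas. Here is a minimal NumPy sketch of the standard "multiply" and "screen" modes (the same formulas the MixRGB node uses); the `blend` helper and its signature are illustrative, not an actual Blender API.

```python
import numpy as np

def blend(base, top, mode):
    """Blend two float RGB images (values in 0..1) per pixel.

    Illustrative helper, not a Blender API. The formulas are the
    standard 'multiply' and 'screen' blend modes.
    """
    if mode == "multiply":
        return base * top
    if mode == "screen":
        return 1.0 - (1.0 - base) * (1.0 - top)
    raise ValueError(f"unknown mode: {mode}")

# Two flat mid-gray images:
a = np.full((2, 2, 3), 0.5)
b = np.full((2, 2, 3), 0.5)
print(blend(a, b, "multiply")[0, 0, 0])  # 0.25 (darkens)
print(blend(a, b, "screen")[0, 0, 0])    # 0.75 (lightens)
```

Applying such a mode *between* separate 3D objects, rather than only inside one material, is exactly the kind of interaction being asked for here.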
You remind me that some interactions are supposed to happen.
Currently, you have effects in viewport for Grease Pencil objects.
You should get similar functionality for collections of regular 3D objects.
Basically, 2D filters in OpenGL are almost instant.
Compositor nodes would need to generate and compile these 2D filters the same way an EEVEE node graph assembles and compiles a fragment shader.
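To make the "almost instant" claim concrete: a 2D filter is just a small per-pixel kernel. The sketch below is a CPU reference, in NumPy, of a 3x3 box blur; on the GPU the inner loop would run once per pixel inside a fragment shader, which is why such filters feel free in the viewport. The function name is mine, purely for illustration.

```python
import numpy as np

def box_blur(img):
    """CPU reference for a 3x3 box-blur fragment shader.

    Each output pixel is the mean of its 3x3 neighbourhood, with
    edge coordinates clamped (like GL_CLAMP_TO_EDGE sampling).
    """
    h, w = img.shape[:2]
    out = np.zeros_like(img)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(y - 1, 0), min(y + 1, h - 1)
            x0, x1 = max(x - 1, 0), min(x + 1, w - 1)
            patch = img[y0:y1 + 1, x0:x1 + 1]
            out[y, x] = patch.mean(axis=(0, 1))
    return out

# A uniform image blurs to itself:
img = np.full((4, 4, 3), 0.5)
print(np.allclose(box_blur(img), 0.5))  # True
```

Chaining several such kernels into one generated shader is, roughly, what compiling a compositor node graph the EEVEE way would amount to.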