In the material editor (node editor), I'm trying to use multiple image textures with multiple UVs. The problem comes when I try to combine them: with a Mix node I can only input another image to be used as a mask or alpha channel. I want to create the mask inside Blender using rotoscoping, so I can paint to create the mask. But how do I use a roto as a mask? The mask/roto can only be loaded in compositing, not in the material node editor. This should be made available in the material editor, so I can easily isolate or make an organic mask for the image texture.
So far, to achieve this, I have to first send the image texture to the compositor, do the masking and merging there, and send the output back to the material. This is a slow workflow. I should be able to access roto inside the material editor, or even better, have a 'compositing' node inside the material editor so I can composite right there without having to jump back and forth between the compositor and the material editor.
Masks are black and white images. You should be able to mask materials just as well with an image texture as you can with blender’s compositing masks. In fact, it makes more sense to use a texture that is UV mapped to your model because you can then directly paint the black and white mask on the model in 3d space instead of having to guess how the mask will look before it is used on your model.
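The reason any black-and-white image works as a mask is that the Mix node just uses the mask value as a per-pixel blend factor. A minimal sketch of that math in plain Python (the function names are mine, not Blender API; images are nested lists of color tuples, the mask holds values 0.0-1.0):

```python
def mix_pixel(a, b, fac):
    """Blend two color tuples the way a Mix node does:
    fac = 0.0 keeps 'a', fac = 1.0 keeps 'b'."""
    return tuple((1.0 - fac) * ca + fac * cb for ca, cb in zip(a, b))

def mix_images(img_a, img_b, mask):
    """Blend two same-sized images pixel by pixel, using a
    grayscale mask as the per-pixel factor."""
    return [
        [mix_pixel(pa, pb, m) for pa, pb, m in zip(row_a, row_b, row_m)]
        for row_a, row_b, row_m in zip(img_a, img_b, mask)
    ]
```

This is why a painted texture, a compositor mask, or an embedded alpha channel are all interchangeable here: each just supplies that grayscale factor.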
Are you looking for something like an animated mask? You can animate and render the mask sequence in the compositor first, then use that sequence as a mask for your object.
Thanks, yes, the only way is to render the mask in the compositor and then send the output to the material editor. But I think this should be easier: take the mask directly into the material editor, removing the extra step of bringing it to the compositor and rendering. The mask should be converted to a black-and-white image and brought directly into the material editor while staying editable, rather than having to create the mask first in the compositor or in an external image editor. That would improve the workflow: if you change the mask, you don't have to re-render it.
The other problem with bringing a mask in from the compositor is when we're mixing two different resolutions. It will use the wrong reference: the mask gets resized relative to the mixing resolution, instead of keeping its own independent absolute resolution.
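The resolution complaint above comes down to relative vs absolute sampling: a mask addressed by normalized 0-1 coordinates stretches to whatever resolution it is mixed at, while a mask baked to fixed pixels does not. A toy illustration in plain Python (function names are mine; the mask is a nested list of floats):

```python
def sample_relative(mask, u, v):
    """Sample a mask by normalized (0-1) coordinates: the mask is
    implicitly stretched to whatever resolution it is mixed at."""
    h, w = len(mask), len(mask[0])
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return mask[y][x]

def resample_to(mask, out_w, out_h):
    """Nearest-neighbour resample to a fixed absolute resolution,
    so the result no longer depends on the mixing resolution."""
    return [
        [sample_relative(mask, (x + 0.5) / out_w, (y + 0.5) / out_h)
         for x in range(out_w)]
        for y in range(out_h)
    ]
```

A 2x2 mask resampled this way to 4x4 simply repeats each cell in a 2x2 block, which is the "stretched relative to the mixing resolution" behavior described above.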
Yes, but this should only be necessary when you want a video sequence for a mask. Like basically using a series of images to move the mask as the animation plays. Blender’s compositor’s masks are built in 2d space, so using that interface to build a mask is harder than just painting the mask directly to the model in the 3d view.
Also, I just realized you said that you were using multiple UVs for one object. Could your issue be something like the one this guy was having on this thread?
If that is the case, I’m pretty sure you could use the alpha channel of the second image. You don’t even have to make a mask. You have to make sure you set each image to clip and drag the UVs you don’t want used in each UVmap outside the bounds of the texture.
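The "set each image to clip" trick works because the Clip extension returns full transparency for any UV coordinate outside the 0-1 square, so faces whose UVs you drag outside the bounds contribute zero alpha. A rough sketch of that behavior in plain Python (the function name is mine; the image is a nested list of RGBA tuples):

```python
def sample_clipped(image, u, v):
    """Sample an RGBA image with a 'Clip'-style extension: any UV
    outside the 0-1 square returns fully transparent instead of
    repeating the texture."""
    if not (0.0 <= u <= 1.0 and 0.0 <= v <= 1.0):
        return (0.0, 0.0, 0.0, 0.0)  # transparent outside the texture
    h, w = len(image), len(image[0])
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return image[y][x]
```

That transparent return value is what then feeds the mix as a zero factor, which is why no separate mask image is needed.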
If the image already has an alpha channel embedded, then no problem. In my case, my images don't have that alpha channel; they're raw, so you have to use an image editor such as Photoshop etc.
Now imagine if I could use a mask in the material editor: I could simply create it inside the material editor without Photoshop etc. or rendering in the compositor... and the mask/rotoscope would be 'alive'. If I want to modify the masking/alpha, I can simply re-edit the mask. No need to re-render or go back to Photoshop.
Or maybe simply the ability to pipe the output of compositor directly into material editor, so no intermediate render needed.
I think that maybe I don’t understand your use case. If you are working with static images, then why can’t you make a new UVmap and a blank texture for your mask and paint it directly on the model in texture paint mode? Are you just working with planes or cylinders? If the model you are making the material for is complex, then using blender’s compositor masks would be really difficult.
If this is for a static mask then you can just paint black where you want to erase, and shades of grey to soften the edges. If you really do want to make an animated mask, then I do think that would be a cool feature. However, the best way to implement it would be to stick the curve points of the mask on the surface of the model, animate it, and have blender automatically treat it as a flat texture using a UVmap you tell it to use. Maybe they could add this feature to the grease pencil or something.
I think it’s related to the translation from 3d space to 2d space. Like I said, they’d have to give us a new type of vector mask mode in order to make that intuitive. It would have to be a thing where we place the vector exactly where we want it to appear on the model. I think most people would have a hard time figuring out where the white areas of the mask will end up on an unwrapped 3d model.
I suppose you were exporting an image of the UV layout, then re-importing it into blender. You could kind of use the compositor’s masks on that image, and render out a mask that way. If they were to improve this workflow, they would need to implement it as a WYSIWYG style vector workflow. That would be even easier to use.
Also, one of the benefits of using a roto/mask rather than painting is that I can easily create non-organic shapes such as circles, squares, etc. For example, if you're masking a logo with a square shape, it's far simpler with a mask/roto. If you want an organic or free shape, then hand painting can be more fun.
I'm not trying to do roto in 3D space, and I don't want my vector mask/roto to appear on the 3D object. I don't even care about 3D placement at this point. I just want to do 2D masking: simply mask the input texture/image, like doing a mask/roto in Photoshop, with no 3D involved at this stage. Just like in an image editor such as Photoshop, it would rasterize the shape in real time to a BW bitmap.
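That "rasterize it in real time to a BW bitmap" step is just a point-in-shape test run over a pixel grid. A toy even-odd rasterizer in plain Python (the function name is mine; the polygon is a list of (x, y) points in 0-1 space, like a simple closed roto spline with no curved segments):

```python
def rasterize_polygon(points, width, height):
    """Rasterize a closed 2D polygon into a black-and-white bitmap
    using the even-odd rule: a pixel is white (1.0) if a ray from
    its center crosses the polygon's edges an odd number of times."""
    def inside(px, py):
        hit = False
        n = len(points)
        for i in range(n):
            x1, y1 = points[i]
            x2, y2 = points[(i + 1) % n]
            if (y1 > py) != (y2 > py):
                # x coordinate where this edge crosses the scanline
                xc = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
                if px < xc:
                    hit = not hit
        return hit

    return [
        [1.0 if inside((x + 0.5) / width, (y + 0.5) / height) else 0.0
         for x in range(width)]
        for y in range(height)
    ]
```

The resulting grid of 0.0/1.0 values is exactly the kind of grayscale factor a Mix node consumes, which is why a 2D roto shape and a painted BW texture are interchangeable as masks.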