B3.2: writing annotations to texture pixels

Hi all :slight_smile:

I plan to add a feature to my project-specific addon.
This feature is quite simple on paper: send UV/Image editor annotations to the image's pixels…

I know how to write pixels into an image, but I have no idea how to grab the annotation points list, nor how to convert annotation coordinates to UV pixel coordinates.

Does anyone have an idea? Maybe this has already been done by someone?

Thanks for your answers and happy blending ! :smiley:

The annotation tool is used by the Bsurfaces addon (shipped with Blender) for retopology purposes in the 3D view, and it can convert annotations into curves… but the Blender API docs don't tell you much, AFAIK… there is some mention of bpy.context.annotation_data, but the details are buried deeper ???
So you want to use it in the UV editor… no idea… maybe the addon source could give some hints…

2 Likes

My belief (mostly based on guessing what the devs did) is that the annotation point coordinates are odd (and maybe hidden) coordinates. They are not texture dependent nor UV dependent; they probably depend on UV window coordinates.
bpy.context.annotation_data holds a GreasePencil instance which contains gpencil_data, and there is also SpaceImageEditor.grease_pencil…

Since i didn’t take time to dig in this yet, i don’t know what is the right data entry :wink:

My first step is to grab as much info as possible on this so that I don't walk down wrong paths… Thanks for your infos @Okidoki :slight_smile:

Maybe someone has more infos on this ?

Happy blending ! :smiley:

The problem with annotations is that, since they are data values meant only to be viewed on screen, they don't have many uses.

One option is to go with the backbuffer render technique; this way you retain all of the features of annotations (opacity / pressure / weight). You render the viewport showing the grease pencil into a screen buffer, setting up the render with a transparent background and so on. Then you copy the pixels from the buffer and drop them into the image editor.

https://docs.blender.org/api/current/gpu.html#offscreen-rendering
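A minimal sketch of the offscreen approach, based on the `gpu` module from the link above. This only runs inside Blender (3.x); the image name and size are placeholders, and the actual stroke drawing (with `gpu.shader` / `batch_for_shader`) is left as a stub:

```python
# Sketch only: must run inside Blender, where the `gpu` and `bpy` modules exist.
# Renders into an offscreen buffer with a transparent background, then reads
# the pixels back and copies them into an image datablock.
import bpy
import gpu

WIDTH, HEIGHT = 1024, 1024  # match your target image size

offscreen = gpu.types.GPUOffScreen(WIDTH, HEIGHT)

with offscreen.bind():
    fb = gpu.state.active_framebuffer_get()
    # transparent background so only the drawn strokes remain
    fb.clear(color=(0.0, 0.0, 0.0, 0.0))
    # ... draw the annotation strokes here ...
    buffer = fb.read_color(0, 0, WIDTH, HEIGHT, 4, 0, 'FLOAT')

offscreen.free()

img = bpy.data.images.get("Target")  # hypothetical image name
if img is not None:
    buffer.dimensions = WIDTH * HEIGHT * 4
    img.pixels.foreach_set(buffer)
    img.update()
```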

Another, much simpler option is to use the PIL module and redraw the lines yourself, with the caveat that you might lose some features such as weight/opacity/color, in case they are not exposed through the API.
This is a very simple and fast technique if you only use straight lines, but it is not good if you need the strokes exactly as Blender renders them. If you want 100% of the standard features, you will have to render the view into a backbuffer.

https://pillow.readthedocs.io/en/stable/reference/ImageDraw.html
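A small sketch of the PIL approach, assuming the annotation points have already been fetched as normalized (0..1) UV coordinates; the stroke list format and the function itself are made up for illustration:

```python
# Minimal Pillow sketch: draw annotation strokes as straight line segments
# onto an image file. Each stroke is a list of (u, v) normalized coordinates.
from PIL import Image, ImageDraw


def draw_strokes(path_in, path_out, strokes, color=(255, 0, 0, 255)):
    img = Image.open(path_in).convert("RGBA")
    w, h = img.size
    draw = ImageDraw.Draw(img)
    for stroke in strokes:
        # convert normalized UV coords to pixel coords
        # (flip V: PIL's image origin is top-left, UV origin is bottom-left)
        pts = [(u * w, (1.0 - v) * h) for u, v in stroke]
        if len(pts) >= 2:
            draw.line(pts, fill=color, width=1)
    img.save(path_out)
```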

1 Like

Starting point

Getting an annotation by name

import bpy


def get_anno(anno_name, layer_name):
    """Return the annotation layer, or None if it does not exist."""
    # .get() returns None rather than raising KeyError on a missing name
    my_anno = bpy.data.grease_pencils.get(anno_name)
    if my_anno is None:
        return None
    return my_anno.layers.get(layer_name)

anno_name = 'Annotations'
layer_name = 'Note'
my_layer = get_anno(anno_name, layer_name)

if my_layer:
    strokes = my_layer.active_frame.strokes
    for i, stroke in enumerate(strokes):
        for pt in stroke.points:
            print(f"stroke index: {i} point loc: {pt.co}")

annotation positions

2 Likes

Hi all :slight_smile:

Back here with some time for this and some interesting answers :smiley:
Thanks to @const, @Okidoki and @nezumi.blend for your answers !!!

First of all here’s a light intro on the topic:
For a long time I have dreamed of per-image annotations (I don't care about annotations evolving along frames). For my specific use, I need the annotations to change when the image changes in the UV/Image view. This is what I did here: B3.2: per-image annotations

As you can see, when I change the image, the annotation changes accordingly. My annotations are helpers for texture wrapping in my non-atlased global texture: they tell me at which texel I should wrap back to the other side of the texture.
Unfortunately, annotations do not appear on textures in the 3D view. For that I have to edit my texture in GIMP and activate my wrapping-points layer, which looks like this:


You can see the red lines (wrapping lines) under the blue Blender annotations.

The reason I want to send annotations to image pixels is that it would allow me to throw away my GIMP wrap-points layer (and its manipulations) and do everything in Blender :wink:
Note that this process doesn't need to be realtime, nor fast at all. It is allowed to take up to, say… 5 seconds to run :slight_smile:

Now it's time for me to open the Python text editor and give this a try :stuck_out_tongue:
I'll be back soon !

Happy blending !

OMG !!!

is it this simple ??? :star_struck:

hmmmm maybe I should use a more naive, rounded floating-point function instead of Bresenham's…
But it is some kind of detail^^
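The "more naive" alternative mentioned above could look like this: step along the segment in small increments and round each sample to the nearest pixel (a DDA-style rasterizer). The flat `x + y * width` buffer layout is assumed to match the code shared later in the thread:

```python
# Naive line rasterizer: sample the segment at regular intervals and round
# each sample to the nearest pixel. Simpler than Bresenham, slightly slower.
def draw_line_naive(buf, width, x0, y0, x1, y1, color):
    steps = max(abs(x1 - x0), abs(y1 - y0), 1)
    for i in range(steps + 1):
        t = i / steps
        x = round(x0 + (x1 - x0) * t)
        y = round(y0 + (y1 - y0) * t)
        buf[x + y * width] = color  # flat buffer indexed as x + y * width
```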

Thanks a lot @nezumi.blend for your precious and simple example :smiley:

Now I have to find a way to reload the original image in Python…
And, if possible, prevent the modified image from being saved…
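For the reload part, a minimal sketch using the standard `Image.reload()` API (Blender only; the image name is a placeholder):

```python
# Runs inside Blender only. Image.reload() re-reads the file from disk,
# discarding the pixels written by the script (same effect as ALT+R).
import bpy

img = bpy.data.images.get("MyTexture")  # hypothetical image name
if img is not None and img.filepath:
    img.reload()
```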

Happy blending ! :smiley:

Replying to myself:

Strangely, I have points in the point lists that are (NaN, NaN, NaN)…

How should i handle this ?

Okay, got it: math.isnan(val)
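A small sketch of that filtering step: skip any point whose coordinates are NaN before rasterizing. The plain tuples here stand in for the `pt.co` values:

```python
# Filter out invalid annotation points: a point is dropped when any of its
# coordinates is NaN.
import math


def valid_points(points):
    return [co for co in points if not any(math.isnan(c) for c in co)]
```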

:slight_smile:

I'm done ! The feature works like a charm !
And the modified pixels are simply reset with ALT+R, which reloads the original image :slight_smile:

Happy blending !

I can see this happening (almost) in realtime if you use numpy and foreach_set. Would you mind sharing the part where you write to the image pixels ?
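A sketch of what that numpy speedup could look like: collect all (x, y) pixel coordinates first, then write them with one fancy-indexing assignment instead of one Python-level write per pixel. The `(width*height, 4)` float32 buffer layout matches what `pixels.foreach_get`/`foreach_set` expect; the function name is made up:

```python
# Vectorized pixel writes: one NumPy fancy-indexing assignment replaces a
# per-pixel Python loop over the flat (width*height, 4) RGBA buffer.
import numpy as np


def write_points(dest_data, width, xs, ys, color):
    idx = np.asarray(xs) + np.asarray(ys) * width
    dest_data[idx] = color  # one vectorized write for all points
```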

Sure :slight_smile:
I'll be back soon with the code :wink:




import math

import bpy
import numpy as np


def Bresenham(destBuffer, iw, ih, x0, y0, x1, y1, color):
    """Rasterize a line segment into the flat pixel buffer (x + y * iw)."""
    dx = abs(x1 - x0)
    sx = 1 if x0 < x1 else -1
    dy = -abs(y1 - y0)
    sy = 1 if y0 < y1 else -1
    error = dx + dy

    while True:
        # skip pixels outside the image (annotation points may lie outside 0..1)
        if 0 <= x0 < iw and 0 <= y0 < ih:
            destBuffer[x0 + y0 * iw] = color

        if x0 == x1 and y0 == y1:
            break

        e2 = 2 * error

        if e2 >= dy:
            if x0 == x1:
                break
            error = error + dy
            x0 = x0 + sx

        if e2 <= dx:
            if y0 == y1:
                break
            error = error + dx
            y0 = y0 + sy
#==========================================================================================


def ImageWriteAnnotation(dest: bpy.types.Image, anno):

    iw, ih = dest.size

    color = [1.0, 0.0, 0.0, 1.0]  # pure RED

    dest_data = np.empty((iw * ih, 4), dtype="f4")  # create a buffer of the right size
    dest.pixels.foreach_get(dest_data.ravel())      # and copy the image pixels into it

    if anno:
        strokes = anno.active_frame.strokes
        for stroke in strokes:
            first = True
            for pt in stroke.points:
                # pt.co is in normalized UV space; skip NaN points
                if not math.isnan(pt.co[0]):
                    curX = int(pt.co[0] * iw)
                    curY = int(pt.co[1] * ih)
                    if not first:
                        Bresenham(dest_data, iw, ih, prevX, prevY, curX, curY, color)

                    prevX = curX
                    prevY = curY
                    first = False

    dest.pixels.foreach_set(dest_data.ravel())
    dest.update()
#==========================================================================================


#**************************************************
#
# The operator called at 'send GP to Image'
# button press
#
#**************************************************
class MyWriteGreasePencilToTexture(bpy.types.Operator):
    bl_idname = "carcass.writegptotexture"
    bl_label = "Write GP to pixels"

    def execute(self, context):
        # GetImageInUVEditor() and GetAnnotationNameForImage() are helpers
        # from the rest of the addon (not shown here)
        imageName = GetImageInUVEditor()
        annotationName = GetAnnotationNameForImage(imageName)

        print("Writing annotation " + annotationName + " on image " + imageName)

        img = bpy.data.images[imageName]

        # assuming the two-argument get_anno() helper from earlier in the
        # thread; the layer name is specific to your setup
        annotationLayer = get_anno(annotationName, 'Note')

        ImageWriteAnnotation(img, annotationLayer)

        return {'FINISHED'}

Here’s part of the whole code :slight_smile:

No doubt this could be greatly accelerated. I simply don't need it to be faster; I'm okay with the speed of this piece of code :wink:

Hope you like it !

Happy blending !

1 Like

As a complement of info, here's how I use this GP-to-image feature to help with cutting the image:

Sorry for the bad quality, but 5 MB is really small for a good-quality video :confused:

Happy blending !

1 Like

Very impressive workflow. I am not 100% familiar with it, but it seems it can bring certain benefits for multi-tiled texturing.

I have been fighting with this idea of multi-tiling for a while now and I can't figure out a proper workflow yet. Yours looks like a good solution, where you have a single texture acting as a texture atlas.

1 Like

Glad you like it @const :smiley:

This workflow is very specific to my needs: I need my Unity3D app (here) to run fluently (20 fps minimum) on mid-end smartphones. In Unity3D, the more different materials you have, the slower your app runs.
Therefore, when you have lots of objects using the same shader but with different textures, it is mandatory to pack those textures into a bigger one, in a kind of atlas. This way, all objects use one unique material and the UV coords do the job.
I'm not sure, but I guess things work the same way for Unreal Engine. What consumes GPU time in a 3D scene is the texture fetching from the 3D board memory into the GPU caches, and the less you invalidate those caches, the faster the app runs (just as for the CPU^^).
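The "UV coords do the job" part above boils down to a simple remapping: each object's 0..1 UVs are squeezed into that object's tile of the atlas. An illustrative sketch (the grid layout and function name are hypothetical; a real packer would also add padding to avoid texture bleeding):

```python
# Remap a per-object UV coordinate into its tile of a texture atlas laid
# out as a cols x rows grid of equally sized tiles.
def atlas_uv(u, v, tile_x, tile_y, cols, rows):
    return ((tile_x + u) / cols, (tile_y + v) / rows)
```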

This workflow has no interest in Blender itself, as Blender can handle multiple materials properly and is mostly used for still renders, not realtime ones.

However, I heard Blender is able to handle texture arrays (which, as far as I understood, are some kind of texture atlas made of same-size textures) along with the matching UV coordinates. I never used them, as I'm not sure about Unity3D <-> Blender compatibility on this point. Also, I found no example of this being used in Blender…

Happy blending !

1 Like

Thanks for the code snippet ! It looks like it’s already optimized well enough :slight_smile:

I think what you're talking about is UDIM textures, which is (AFAIK) a workflow widely accepted in the industry to manage several textures in a tiled map, making one big texture. It looks an awful lot like what you're doing right now. Here's a short YouTube video explaining how it works in Blender:

A quick search shows this is available in Unity, albeit in the HDRP, which I don't think you can realistically use on a mobile phone…

1 Like

@pitibonom Cool, I will keep that in mind just in case I ever need to go with a project so large in Unity.

@Gorgious This is a good idea, looks like they have done some work on this. Good to know that this works as well.
https://blog.unity.com/technology/multi-material-using-custom-inspectors-and-scriptable-objects-for-udim-materials

1 Like