Start multiprocessing from panel

Hey there :slight_smile:

I am trying to integrate multiprocessing into an add-on. This minimal example works on Linux with Blender 3.4.1:

import bpy

from bpy.types import (Panel, Operator)
from multiprocessing import Process
from time import sleep


# list to store jobs
jobs = []

def todo(i):
    sleep(1)
    print("test", i, "done")

def create_jobs():
    for i in range(10):
        job = Process(target=todo, args=(i,))
        jobs.append(job)

def start_jobs():
    global jobs
    for job in jobs:
        job.start()

def wait():
    global jobs
    for job in jobs:
        job.join()
        
    jobs = []


# button to start job
class WM_OT_job(Operator):
    bl_label = "job"
    bl_idname = "wm.job"

    def execute(self, context):
        
        create_jobs()
        start_jobs()
        wait()
    
        return {"FINISHED"}


# simple panel
class mp_test(bpy.types.Panel):
    """Creates a Panel in the Object properties window"""
    bl_label = "test"
    bl_idname = "OBJECT_PT_custom_panel"
    bl_space_type = "VIEW_3D"
    bl_region_type = "UI"
    bl_category = "test"

    def draw(self, context):
        layout = self.layout
        box = layout.box()
        
        box.operator("wm.job", text="todo")


# register and unregister
def register():
    bpy.utils.register_class(mp_test)
    bpy.utils.register_class(WM_OT_job)


def unregister():
    bpy.utils.unregister_class(WM_OT_job)
    bpy.utils.unregister_class(mp_test)


# mainloop
if __name__ == "__main__":
    register()

When running the same code on Windows, this error message is raised:

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "C:\Program Files\Blender Foundation\Blender 3.4\3.4\python\lib\multiprocessing\spawn.py", line 116, in spawn_main
    exitcode = _main(fd, parent_sentinel)
  File "C:\Program Files\Blender Foundation\Blender 3.4\3.4\python\lib\multiprocessing\spawn.py", line 126, in _main
    self = reduction.pickle.load(from_parent)
EOFError: Ran out of input

Multiprocessing usually needs to be started from the main module (inside an if __name__ == "__main__": guard), right? Maybe this is the reason for the error.
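For what it's worth, here is a hedged sketch of the pattern outside Blender. On Windows, multiprocessing uses the "spawn" start method, which launches a fresh interpreter and re-imports the main module to unpickle the target function; inside Blender there is no importable main module for the child, which matches the EOFError above. As a plain script, the guarded pattern looks like this:

```python
# Plain-Python sketch (not Blender-specific): on Windows, the spawn
# start method re-imports __main__ in each child process, so process
# creation must be guarded and the target must be importable.
from multiprocessing import Process
from time import sleep

def todo(i):
    sleep(1)
    print("test", i, "done")

if __name__ == "__main__":
    jobs = [Process(target=todo, args=(i,)) for i in range(4)]
    for job in jobs:
        job.start()
    for job in jobs:
        job.join()
```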

What is a possible solution, or what would be a common way to do this in an add-on? Is there maybe an operator that can handle this?

Thanks in advance :slight_smile:

Greetings,
Chris

Maybe there is some problem with reading and writing files for interprocess communication.

Wouldn’t it be better to use ThreadPoolExecutor?

1 Like

Thank you very much for your answer :slight_smile:

Yes, threading is a possible solution, and it is also the method suggested in the docs when jobs are joined. But ThreadPoolExecutor does not run jobs in parallel. I have already tried ProcessPoolExecutor; I guess it must also be started from the main module.
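For reference, the documented ProcessPoolExecutor pattern outside Blender also relies on the main-module guard; a minimal sketch (assuming it runs as an ordinary script, not inside Blender):

```python
# Plain-Python sketch: ProcessPoolExecutor needs a picklable,
# module-level target, and on Windows the pool must be created
# under the __main__ guard.
from concurrent.futures import ProcessPoolExecutor

def todo(i):
    return i * i

if __name__ == "__main__":
    with ProcessPoolExecutor(max_workers=4) as ex:
        print(list(ex.map(todo, range(5))))  # [0, 1, 4, 9, 16]
```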

https://docs.blender.org/api/current/bpy.app.timers.html

There is a function called “run_in_main_thread”. Could this be the right direction? Has anyone managed to run multiprocessing in an add-on with another approach, or is there some kind of minimal example?

bl_info = {
    "name": "mp_test",
    "description": "multiprocessing minimal test",
    "version": (0, 0, 0),
    "blender": (3, 4, 1),
    "location": "3D View > Tools",
}


import bpy

from bpy.types import (Panel, Operator)
from concurrent import futures
from time import sleep

# https://docs.blender.org/api/current/bpy.app.timers.html
import queue

execution_queue = queue.Queue()

# This function can safely be called in another thread.
# The function will be executed when the timer runs the next time.
def run_in_main_thread(function):
    execution_queue.put(function)


def execute_queued_functions():
    while not execution_queue.empty():
        function = execution_queue.get()
        function()
    return 1.0


# list to store jobs
jobs = []

def todo(i):
    sleep(1)
    print("test", i, "done")

def start_jobs():
    print("__name__ within start_jobs:" , __name__)
    with futures.ProcessPoolExecutor(max_workers=24) as e:
        #for i in range(10):
        e.submit(todo, 1)
        e.submit(todo, 2)
        e.submit(todo, 3)

# button to start job
class WM_OT_job(Operator):
    bl_label = "job"
    bl_idname = "wm.job"

    def execute(self, context):
        print("__name__ within WM_OT_job:" , __name__)

        run_in_main_thread(start_jobs)

        return {"FINISHED"}


# simple panel
class mp_test(bpy.types.Panel):
    """Creates a Panel in the Object properties window"""
    bl_label = "test"
    bl_idname = "OBJECT_PT_custom_panel"
    bl_space_type = "VIEW_3D"
    bl_region_type = "UI"
    bl_category = "test"

    def draw(self, context):
        layout = self.layout
        box = layout.box()

        box.operator("wm.job", text="todo")


# register and unregister
def register():
    bpy.utils.register_class(mp_test)
    bpy.utils.register_class(WM_OT_job)
    bpy.app.timers.register(execute_queued_functions)


def unregister():
    bpy.app.timers.unregister(execute_queued_functions)
    bpy.utils.unregister_class(WM_OT_job)
    bpy.utils.unregister_class(mp_test)


# mainloop
if __name__ == "__main__":
    register()


Unfortunately, proper multiprocessing is not officially supported by Blender for add-ons…

The run_in_main_thread function you mentioned is for use with the threading module, which isn’t true multiprocessing, though it will allow you to run code without interrupting Blender’s UI (not a good idea if you are modifying Blender data, though).

The only workaround I can think of is to start a new instance of Blender and use a startup script to do the work you want. This allows for concurrent processes, but has the disadvantage of not allowing the transfer of Python types, Blender objects, etc. (though it could probably be done by writing to external files).

It’s a bit sad, but unfortunately, I don’t think it’s going to change any time soon :frowning:

2 Likes

Dear Andrew,

thanks for your reply. I was afraid of an answer like this. In my specific case, it would not be necessary to access Blender during the execution of the script. The script gathers the information for all jobs in advance and stores it in an object instance. During execution, the individual jobs store their results in a multiprocessing dict. After all jobs have joined, the dict is integrated into Blender again. This works fine on Linux (even when not run from the main module).

It would be awesome if there were general support for multiprocessing in scripts. Maybe an approach like this: Blender starts a pool for multiprocessing of scripts at startup. If a script wants to get a job done, it appends the job to this „global“ pool of multiprocessing jobs. (Maybe with a „multiprocessing“ handler.)

How do other scripters avoid this issue? I could think of writing an export function to disk: the script could pickle the dictionary for calculation, and Blender would start a subprocess. But starting a subprocess also looks like a hack to me.

Greetings,
Chris

1 Like

Yeah, it would be nice if there was proper support for it, but I don’t think it’s high on the developers’ priority list at the moment…

I don’t know of any add-ons that use multiprocessing successfully with Blender. In your case, what I think I would do is write the information you need for multiprocessing to a file, then start a new Python process (separate from Blender, so that it can use multiprocessing), do whatever you need to do in there, and finally write the result to another file which the Blender add-on can then pick up (I’d look into timers; they’re good for periodically running code in Blender).
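That file-based workaround could be sketched roughly like this. This is a hypothetical outline, not a tested add-on: the worker script, the file paths, and the callback names are my assumptions, and I'm relying on the fact that since Blender 2.92 sys.executable points at the bundled Python interpreter rather than the Blender binary.

```python
# Hypothetical sketch: pickle the job data, run a worker script in a
# separate Python process, and poll for the result file from a
# bpy.app.timers callback.
import os
import pickle
import subprocess
import sys
import tempfile

def launch_worker(job_data, worker_script):
    """Write job_data to disk and run worker_script in a new process."""
    in_path = os.path.join(tempfile.gettempdir(), "mp_job_in.pkl")
    out_path = os.path.join(tempfile.gettempdir(), "mp_job_out.pkl")
    with open(in_path, "wb") as f:
        pickle.dump(job_data, f)
    # sys.executable is Blender's bundled Python (2.92+), so the child
    # is a plain interpreter that can use multiprocessing freely.
    subprocess.Popen([sys.executable, worker_script, in_path, out_path])
    return out_path

def make_poll_timer(out_path, on_done):
    """Build a timer callback for bpy.app.timers.register()."""
    def check():
        if os.path.exists(out_path):
            with open(out_path, "rb") as f:
                on_done(pickle.load(f))
            return None  # returning None stops the timer
        return 0.5       # otherwise poll again in 0.5 s
    return check

# In the add-on, after launching (integrate_result is a placeholder):
# bpy.app.timers.register(make_poll_timer(out_path, integrate_result))
```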

It’s not at all ideal, but I think it’s probably the best option that I can see…

1 Like

Thanks for your advice! I will try it :slight_smile:

1 Like

Another quick approach:

script_for_exec = '''
from multiprocessing import Process
from time import sleep

def job(i):
    sleep(1)
    print(i, "is working in", __name__)
    return "ok"

def main():
    global Process, job, sleep
    p0 = Process(target=job, args=(0,))
    p1 = Process(target=job, args=(1,))
    p2 = Process(target=job, args=(2,))
    p3 = Process(target=job, args=(3,))
    
    p0.start()
    p1.start()
    p2.start()
    p3.start()
    
    p0.join()
    p1.join()
    p2.join()
    p3.join()
    
    print("done")

if __name__ == "__main__":
    main()
'''

import bpy
from bpy.types import (Panel, Operator)

class WM_OT_job(Operator):
    bl_label = "job"
    bl_idname = "wm.job"

    def execute(self, context):
        
        exec(script_for_exec)
    
        return {"FINISHED"}


# simple panel
class mp_test(bpy.types.Panel):
    """Creates a Panel in the Object properties window"""
    bl_label = "test"
    bl_idname = "OBJECT_PT_custom_panel"
    bl_space_type = "VIEW_3D"
    bl_region_type = "UI"
    bl_category = "test"

    def draw(self, context):
        layout = self.layout
        box = layout.box()
        
        box.operator("wm.job", text="todo")


# register and unregister
def register():
    bpy.utils.register_class(mp_test)
    bpy.utils.register_class(WM_OT_job)


def unregister():
    bpy.utils.unregister_class(WM_OT_job)
    bpy.utils.unregister_class(mp_test)


# mainloop
if __name__ == "__main__":
    register()

In this way, the process is started in __main__, but it fails with the same error for both multiprocessing and concurrent.futures. The same code works in Python IDLE on Windows, so maybe this is an issue with the bundled version of Python.

Greetings

Why do you need multi-processing instead of multi-threading?
Even multi-threading can use multiple CPU cores.
I think the advantage of multi-processing is multiple Blender instances or memory isolation.

Even multi-threading can use multiple CPU cores.

Sorry, I’m wrong.
Python only runs one thread at a time due to the Global Interpreter Lock (GIL) :cry:
https://wiki.python.org/moin/GlobalInterpreterLock
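The GIL's effect can be seen with a small CPU-bound job: threads do not reduce wall time, because only one thread executes Python bytecode at a time. The timings are machine-dependent; this is just an illustration, not a benchmark.

```python
# Illustration of the GIL: four CPU-bound jobs take roughly the same
# wall time whether run serially or on four threads.
import time
from concurrent.futures import ThreadPoolExecutor

def burn(n):
    # Pure-Python, CPU-bound work: holds the GIL while it runs
    total = 0
    for i in range(n):
        total += i * i
    return total

def timed(fn):
    start = time.perf_counter()
    fn()
    return time.perf_counter() - start

if __name__ == "__main__":
    N = 1_000_000
    serial = timed(lambda: [burn(N) for _ in range(4)])
    with ThreadPoolExecutor(max_workers=4) as ex:
        threaded = timed(lambda: list(ex.map(burn, [N] * 4)))
    # Expect roughly the same wall time for both
    print(f"serial: {serial:.2f}s  threaded: {threaded:.2f}s")
```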

2 Likes

Please don’t worry. Andrew suggested a workaround :slight_smile:
I am trying to solve it this way for now.

It would be cool if it were integrated in the future :slight_smile:

The problem with that is that the exec function just runs the code as if it were written there, so it’s still being executed in the Blender environment.

Instead, you want to get the path to the actual python.exe file and then call it with something like subprocess.Popen, passing the string to be executed as the -c command-line argument.

That will create an entirely new instance of Python that has nothing to do with Blender and will be able to use multiprocessing properly.
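A minimal sketch of that suggestion (the worker code string is illustrative; since Blender 2.92, sys.executable already points at the bundled Python, so no separate path lookup should be needed):

```python
import subprocess
import sys

# Code for the fresh interpreter: an ordinary script with a __main__
# guard, so multiprocessing can spawn/fork its children normally.
worker_code = """
from multiprocessing import Process

def job(i):
    print(i, "done")

if __name__ == "__main__":
    procs = [Process(target=job, args=(i,)) for i in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
"""

# Launch the worker outside Blender's environment via -c
proc = subprocess.Popen([sys.executable, "-c", worker_code])
```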

1 Like

Thanks! I will check this out :slight_smile:
The exec was just a last-ditch attempt :joy:

1 Like