Sharing the same codebase with Maya and Blender

Hi there!

As the title suggests, I would love to write some libraries that share logic between Maya and Blender. Of course, all platform-related code would live in separate packages so that “import pymel.core” and “import bpy” do not collide. But it would be great to share logic between tools that is independent of Maya or Blender.

Some things I would like to have

  • Sharing a common logic codebase between the Maya and Blender codebases
  • Keeping the logic codebase, Maya codebase, and Blender codebase in different Git repos, NOT one

What would be a good approach to do so regarding

  • library/project/package/module file structure
  • Python path? In Blender I can specify only one path to Python projects. In Maya I can specify multiple locations via sys.path.append().

At the moment I have this structure:


I added “…Programming/Blender/lib” to the Blender addons path and “…Programming/Maya/lib” to the Maya path. Each “myTool” project has its own Git repo. That works great so far. What makes things more complicated is when the “myTools” from both Maya and Blender would like to import a project/package that is common to both.
I want to avoid copying packages/modules around.

Any thoughts or ideas on this? :slight_smile:
Thanks in advance!

I used to write code in Maya, and now I write code in Blender. They are very different; Maya does not use separate modes like Blender does, and the coding is vastly different. That said, when I speak of coding in Maya, I am talking about MEL, not Python. I am not familiar with the use of Python in Maya; I started programming in Blender before that change in Maya happened. But if it is anything like MEL scripting was, I would say you face an extremely formidable barrier. I would never go as far as to say that it is impossible, but the Blender Python API is still relatively fluid (you find changes in procedures with each new major version), not even taking into account what might change in Maya. You would be hard pressed to keep up with all that, even if you were able to achieve your initial goal. That is just IMHO, based on what I know about both apps.

I guess the first step would be taking several nice tools from one app and porting them over. This way you’ll see the feasibility of such a project.
My guess is that you’ll soon find out that even if it’s possible, you’ll pay too high a price in performance loss. When I coded something in Blender, it was always time-consuming to find a way to improve performance, and it wouldn’t be doable with a high-level API.
(I coded addons like BlenderKit or Blender-CAM and about 20 more, but I actually studied art…)
Nevertheless, I don’t want to be an idea-killer here. A simple way to solve your problem would be to go with a third module that sits on the same level as your Blender and Maya modules.


Thanks guys for your thoughts!

Maybe I was not able to express my intention very well. :wink:

Actually, what I want to achieve is the „idea killer“ you were talking about!

I want to share code that has nothing to do with the Maya API or Blender API but is independent logic, like data classes that hold or manipulate data (position data etc.), a custom SQLite/MySQL ORM, a JSON parser, and so on — nothing Maya- or Blender-related.
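To illustrate the kind of API-independent code meant here, a minimal sketch (the Position class and its fields are made-up examples, not part of any actual codebase):

```python
from dataclasses import dataclass


@dataclass
class Position:
    """Plain position data with no Maya/Blender dependency."""
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0

    def translated(self, dx: float, dy: float, dz: float) -> "Position":
        # return a new Position instead of mutating in place
        return Position(self.x + dx, self.y + dy, self.z + dz)
```

Because this imports neither pymel.core nor bpy, it can live in the shared package and be imported from tools in both applications.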

My question is more about how to set up an environment to do so.
The best folder structure, how to point the Python module search path of Maya and Blender to the same folder (without interfering with Blender addons: Blender reports a „fake_module“ error if it parses code that is not actually an addon, because it is Maya code or a simple data class without a „bl_info“ dictionary), how to separate all the tools into multiple Git repos, etc.

I think I understand you well, just didn’t answer your question so that we would understand each other :wink:

—Common - your shared code base goes here?

Yes, that could be a possibility.

But I would have to point the Blender scripts path to „Programming“ so it has access to the „Common“ code, right?
That way I would get the „fake_module“ error, as Blender seems to expect only addons with a „bl_info“ dictionary in the scripts folder, and there won’t be only addons there (Maya code, common code).

I would need to point the scripts folder to „Programming/Blender/lib“ and append „Programming/Blender/Common“ to sys.path globally for Blender (I don’t want to append that path in every addon so it has access to it). But how to do that?
In Maya I can just write the sys.path.append into userSetup.py and it will be available in Maya. But how to do that in Blender?

You can do that in Blender, too. I do it in my add-on to enable user site packages:
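A minimal sketch of what such a hook can look like, assuming the goal is to put Python’s standard user site-packages directory on sys.path (the actual add-on code may differ):

```python
import site
import sys


def register():
    # ensure the user site-packages directory is importable from Blender
    user_site = site.getusersitepackages()
    if user_site not in sys.path:
        sys.path.append(user_site)


def unregister():
    # leave sys.path untouched; other add-ons may rely on the entry
    pass
```

The membership check keeps register() idempotent, so reloading the add-on does not add duplicate entries.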

Ah well, having this nicely structured seems like a hard job, given that you probably want to keep things in the same directory. I did some tests recently where I needed to split part of the code into another lib (because of multiprocessing, which is also hard to do in Blender).
I found out that you can have your addon folder, and a library sitting in the addons folder next to it like any other library. If it doesn’t have bl_info, I think Blender just doesn’t register it as an addon, but it works as a module if you import it.

So I would need a sys.path.append in every tool I write to ensure Python has access to it, right?
That would be a solution, but it would be more elegant to do that globally for Blender and not in every addon. :slight_smile: But anyway, that seems to be a usable workaround!

Yes, I would love to have everything in one place and not cluttered across my harddrive. :slight_smile: Unfortunately, in my testing environment I get a „fake_module“ error if I just leave out the bl_info when I put a package at addon level. :frowning:

Additionally, I can think of two possibilities to get your independent code into your respective add-ons.

The first one would use a build script to embed the common module into each add-on. This way you can have all the code in one repository, but e.g. code navigation might not find the corresponding code, since it is organized differently from how it is used.
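Such a build script can be quite small. A sketch, assuming a hypothetical layout with a common/ folder and add-ons under addons/ (the folder names are placeholders, not the actual project structure):

```python
import shutil
from pathlib import Path

# hypothetical layout: adjust these paths to your own structure
COMMON_SRC = Path("common")
ADDON_DIRS = [Path("addons/myTool_a"), Path("addons/myTool_b")]


def embed_common() -> None:
    """Copy the shared 'common' package into each add-on before packaging."""
    for addon in ADDON_DIRS:
        target = addon / "common"
        if target.exists():
            shutil.rmtree(target)  # replace any stale copy
        shutil.copytree(COMMON_SRC, target)


if __name__ == "__main__":
    embed_common()
```

Run before zipping each add-on, so the shipped archive is self-contained and needs no sys.path tricks at install time.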

The other would use git submodules. Create a new repository for the independent code and add it twice, once inside the Maya add-on and once inside the Blender add-on. This has the advantage that you can freely choose which version of the independent library each add-on uses. That might be handy in case the common library has an API change but you want to update only one add-on first. This advantage, however, comes with the drawback that you need to update the submodules on each change of the common library, even if the API didn’t change.

Thanks guys for all your thoughts, ideas and suggestions!

I’ve come up with the following solution, in case anybody is interested. I’m separating the package locations from the start, but keeping them together on my harddrive, and not exposing the whole lib directory to Maya or Blender as a scripts/Python packages path.

Each “myTool” package gets its own Git repo, as does each package in “common”.

Folder structure


Settings for Blender

  • Blender scripts path set in the Blender UI: lib/blender/addons
  • Add a startup file to the folder C:\Program Files\Blender Foundation\Blender 2.91\2.91\scripts\startup:
import sys

COMMON_PACKAGES_LOCATION = "M:\\Progamming\\Python\\lib\\common"


def register():
    # make the shared packages importable in every Blender session
    if COMMON_PACKAGES_LOCATION not in sys.path:
        sys.path.append(COMMON_PACKAGES_LOCATION)


def unregister():
    pass

  • Every addon gets this method in its __init__.py, and it gets executed on registration:
import sys

COMMON_PACKAGES_LOCATION = "M:\\Progamming\\Python\\lib\\common"

def _add_packages_path(packages_path):
    if packages_path not in sys.path:
        sys.path.append(packages_path)


Settings for Maya

  • add lib/common to sys.path
  • add lib/maya to sys.path

For example in Maya’s userSetup.py:

import sys
sys.path.append("M:\\Progamming\\Python\\lib\\common")
sys.path.append("M:\\Progamming\\Python\\lib\\maya")

Hope it works well during production! :slight_smile:

That would also be great. I just imagined that if I have 10 addons depending on a common module and I make significant changes to that module, I have to update all 10 addons to work with the changes. A build script would let me choose which addon uses the new version, e.g. with a JSON file for each addon from which the version of the common module is read and deployed to the addon on each build. I have to think about that.
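The per-addon JSON idea could be sketched like this, assuming a hypothetical deps.json next to each add-on and versioned snapshots of the common module in folders like common/1.0.0/ (file names and layout are illustrative, not an existing convention):

```python
import json
import shutil
from pathlib import Path

# hypothetical layout: versioned snapshots of the common module live in
# common/<version>/ and each addon pins one version in its deps.json
COMMON_ROOT = Path("common")


def deploy_common(addon_dir: Path) -> str:
    """Read the pinned version from the addon's deps.json and copy it in."""
    config = json.loads((addon_dir / "deps.json").read_text())
    version = config["common_version"]
    target = addon_dir / "common"
    if target.exists():
        shutil.rmtree(target)  # drop the previously deployed version
    shutil.copytree(COMMON_ROOT / version, target)
    return version
```

Each addon then upgrades only when its own deps.json is bumped, so a breaking change in the common module does not force all 10 addons to update at once.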