Blender 2.57 and python 3.2

Hi all:

A few days ago I decided to learn some Python, and Blender was a great starting point, so I started working on Python scripts for Blender. As I am a physician, most of the work involves numerical simulations.

Then I discovered that two very important tools for Python do not work with Python 3.2: NumPy and Psyco. This surprised me a lot, because I understand that many operations in Blender need lots of fast numerical calculations.

Just for the sake of discussion, can someone comment on this?



You raise two huge questions with your post:

  1. Why is Python slow?

  2. Why does my Python library not work with my Python?

Let's start with 2). For a Python library to work with Blender's Python, it has to be compiled against the same version of Python that Blender uses; in the case of Blender 2.57, that is Python 3.2. Also, the library, which in most cases is a bunch of .py scripts and/or DLLs containing the C/C++ code, should be installed in Blender's own Python directory and not in another Python install on your system, or else Blender's Python won't see the library and will complain that it does not exist.
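A quick way to check both points, which interpreter version you are on and where it searches for modules, is to print `sys.version_info` and `sys.path` from inside Blender's Python console. This is a generic sketch, nothing in it is Blender-specific:

```python
# Paste into Blender's Python console (or any interpreter) to see which
# Python you are actually running and where it looks for modules.
import sys

# A library must be built for exactly this version (3.2 for Blender 2.57).
print(sys.version_info[:3])

# A library is only found if it sits in one of these directories.
for path in sys.path:
    print(path)
```

If the library's directory does not appear in that list, Blender's Python will never find it, no matter which other Python installs on the machine can import it.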

Now back to 1). As you say, you need loads of fast numerical calculations. But "fast" is relative: it is impossible to measure the speed of Python and give one overall estimate, because Python's speed varies from case to case. Also, the biggest reason why code is slow is that it was not written in a way that takes full advantage of the language. You have to keep in mind that in many cases Python sacrifices speed for ease of use; this is one of the policies of the language. That does not mean, however, that Python is incapable of high speed, it is just a choice. So it is preferable to have some deep knowledge of the language and its standard library before pulling in libraries that can make your code unnecessarily complex. Actually, even without NumPy and Psyco, Python will surprise you with its speed in some cases.
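As a minimal illustration of "taking advantage of the language": the same computation written two ways, once as an explicit Python-level loop and once with the builtin `sum()`, whose loop runs in C. The function names here are just for the example:

```python
import timeit

def slow_sum(n):
    # explicit loop: every iteration goes through the interpreter
    total = 0
    for i in range(n):
        total += i
    return total

def fast_sum(n):
    # builtin sum() iterates in C, no third-party library needed
    return sum(range(n))

assert slow_sum(100000) == fast_sum(100000)
print(timeit.timeit(lambda: slow_sum(100000), number=20))
print(timeit.timeit(lambda: fast_sum(100000), number=20))  # usually clearly faster
```

Exact timings depend on the machine, but the `sum()` version is typically several times faster with no extra dependency at all.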

The usual reason why Python gets slow is that Python does not force you (as C/C++ does) to declare the type of a variable; unfortunately that task falls on the interpreter, which may take considerable time figuring out the type of each variable. Usually this becomes visible in very tight loops whose iterations take around a millisecond or less.
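Part of that per-iteration interpreter work is name and attribute resolution, which you can actually measure. A common trick in tight loops is to hoist a looked-up function into a local variable; the effect is only noticeable in exactly the kind of fast loop described above (a sketch, with hypothetical function names):

```python
import math
import timeit

def with_lookup(n):
    out = 0.0
    for i in range(n):
        out += math.sqrt(i)   # attribute lookup repeated every iteration
    return out

def with_local(n):
    sqrt = math.sqrt          # resolved once, outside the loop
    out = 0.0
    for i in range(n):
        out += sqrt(i)
    return out

print(timeit.timeit(lambda: with_lookup(50000), number=10))
print(timeit.timeit(lambda: with_local(50000), number=10))  # slightly faster
```

The saving per iteration is tiny, which is exactly why it only matters in loops that spin millions of times.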

However, and this is the big trap that people new to Python (and even experienced programmers) fall into: that does not mean Python is inherently slow. If Python is calling C/C++ code (via compiled DLLs, which Python does a lot), as it does in the case of the Blender API, then the C/C++ code executes as-is, and you enjoy the speed of the C/C++ language.
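You can see this effect without Blender at all, because much of the standard library is already C code behind a Python name. Sorting is a handy example: `sorted()` runs its loop in C, while an equivalent sort written in pure Python crawls (a sketch; `pure_python_sort` is just an illustrative insertion sort, not how you would sort for real):

```python
import random
import timeit

def pure_python_sort(items):
    # naive insertion sort, every comparison handled by the interpreter
    out = []
    for x in items:
        i = 0
        while i < len(out) and out[i] < x:
            i += 1
        out.insert(i, x)
    return out

data = [random.random() for _ in range(2000)]
assert pure_python_sort(data[:200]) == sorted(data[:200])

print(timeit.timeit(lambda: sorted(data), number=10))         # C speed
print(timeit.timeit(lambda: pure_python_sort(data), number=1))  # much slower
```

The same principle applies to Blender API calls: the Python line is just a thin entry point into compiled code.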

So to summarise: when you make direct calls to the Blender API, Python executes at C/C++ speed, because that is what it is calling. If, however, you do complicated calculations on Python variables nested inside loops that cycle every millisecond or faster, then you may experience some slowdowns. In that case NumPy will help you replace Python variables with constructs that are generally faster. Psyco may or may not help. A C wrapper like Cython will give you full access to C speed, because it lets you give Python variables types without writing a line of C code; the Cython code is then automatically converted to C and compiled to a DLL that is called like any regular Python module.

…I wish there was a gallery for helpful / thoughtful threads.