 # Converting Particle Coordinates to Matrix?

Hi All,

I am trying to convert the LOC/ROT/SCALE information stored on each particle into a matrix_world-compatible matrix. Such a conversion would let me place an object at the same position and orientation as the original particle.

I have some code to fetch the particle properties, but I am stuck on how to convert them to a matrix.

``````
import bpy

def valid_particle(pa, cfra):
    # birth_time and die_time are absolute frame numbers.
    return not (pa.birth_time > cfra or pa.die_time < cfra)

def returnSampledParticles(scene, psys):
    loc = []
    rot = []
    size = []
    birth = []
    death = []
    for p in psys.particles:
        if valid_particle(p, scene.frame_current):
            loc.append(p.location.copy())    # Vector
            rot.append(p.rotation.copy())    # Quaternion
            size.append(p.size)
            birth.append(p.birth_time)
            death.append(p.die_time)
    return (loc, rot, size, birth, death)

ob_emitter = bpy.data.objects.get("emitter")
scene = bpy.context.scene                   # bpy.data.scenes is a collection, not a scene
psys = ob_emitter.particle_systems[0]       # likewise, index into the collection
samples = returnSampledParticles(scene, psys)   # This def takes longer the larger the particle count.

# Loop through the particles.
for loc, rot, size in zip(samples[0], samples[1], samples[2]):
    # Here is where I am stuck.
    # Are particle rotations quaternions?
    matrix_particle = ob_emitter.matrix_world * returnMatrix(loc, rot, size)

# What would returnMatrix code look like?

``````
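For prototyping the math outside Blender (where `mathutils` is unavailable), here is a sketch of what `returnMatrix` could look like using numpy. It assumes `rot` is a unit quaternion in Blender's `(w, x, y, z)` order and `size` is the particle's uniform scale:

```python
import numpy as np

def return_matrix(loc, rot, size):
    """Build a 4x4 location/rotation/scale matrix from particle data."""
    w, x, y, z = rot
    # Standard 3x3 rotation matrix from a unit quaternion.
    R = np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)],
    ])
    M = np.eye(4)
    M[:3, :3] = R * size   # rotation combined with uniform scale
    M[:3, 3] = loc         # translation in the last column
    return M
```

Inside Blender the same thing is spelled with `mathutils` (`Matrix.Translation`, `Quaternion.to_matrix`, `Matrix.Scale`), as in the attempt below.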

I think this is close…

``````
from mathutils import Matrix

def returnParticleMatrix(particle):
    locmat = Matrix.Translation(particle.location)

    # particle.rotation is a Quaternion.
    rotmat = particle.rotation.to_matrix().to_4x4()

    # Particles are uniformly scaled, so a single uniform scale matrix is enough.
    scalemat = Matrix.Scale(particle.size, 4)

    mat = locmat * rotmat * scalemat

    return mat

``````

But the end result looks like an order-of-operations failure, as if the particles are rotated then translated instead of translated then rotated…?

The viewport before render:

The result of the render:

If you want them translated first and then rotated, you just use:

``````
mat = rotmat * locmat * scalemat

``````

When people build these matrices they usually think of them like:

• I want to translate the matrix:
mat = locmat

• Now I want to rotate it:
mat = locmat * rotmat

• Then I want to scale it:
mat = locmat * rotmat * scalemat

But the operation seems ‘reversed’ because the final result is evaluated right to left:
finalResult = (locmat * (rotmat * (scalemat * point)))

First the point is scaled, then rotated, then translated.
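You can check that evaluation order numerically. This sketch uses numpy (since `mathutils` only exists inside Blender) with a point at (1, 0, 0), a scale of 2, a 90° rotation about Z, and a translation of (0, 0, 1):

```python
import math
import numpy as np

# Build the three 4x4 transforms separately.
T = np.eye(4); T[:3, 3] = (0, 0, 1)            # translate by (0, 0, 1)
c, s = math.cos(math.pi / 2), math.sin(math.pi / 2)
R = np.eye(4); R[:2, :2] = [[c, -s], [s, c]]   # rotate 90 degrees about Z
S = np.diag([2.0, 2.0, 2.0, 1.0])              # uniform scale by 2

p = np.array([1.0, 0.0, 0.0, 1.0])             # homogeneous point

# T * R * S: the point is scaled first, then rotated, then translated.
print((T @ R @ S) @ p)   # -> approximately [0, 2, 1, 1]
```

The point is scaled to (2, 0, 0), rotated to (0, 2, 0), then translated to (0, 2, 1), even though `T` appears first in the product.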

Ok,

I am still having trouble with this. If I use your suggested code I get no particles rendered within my camera view. They do appear in my export file but the numbers must place them behind the camera or out of view.

I basically have location and scale working using:

``````
mat = particle_point_mat * group_object.matrix_world * scalemat
``````

This places my group members at the correct location of a particle. This image shows two particles, each with a whole group deployed.

When I add in the rotation matrix, all particles disappear. I am calculating the rotation matrix from the particle system quaternions like so…

``````
rotmat = particle.rotation.to_matrix().to_4x4()
``````

Is that correct?
Or do I need some more math because the rotation is a quaternion value?
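One thing worth checking while debugging: `to_matrix()` assumes a unit quaternion, and an unnormalized quaternion bakes unintended scaling into the matrix, which can push objects out of view. Inside Blender you can simply call `particle.rotation.normalized().to_matrix().to_4x4()`; the plain-Python equivalent of the normalization step (a sketch, with `quat` in `(w, x, y, z)` order) is:

```python
import math

def normalized(quat, eps=1e-8):
    """Return quat (w, x, y, z) scaled to unit length."""
    n = math.sqrt(sum(c * c for c in quat))
    if n < eps:
        return (1.0, 0.0, 0.0, 0.0)   # degenerate input: fall back to identity
    return tuple(c / n for c in quat)

print(normalized((2.0, 0.0, 0.0, 0.0)))   # -> (1.0, 0.0, 0.0, 0.0)
```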

Basically I just need what I have rotated at the particle_point_mat level.
.
.
.
And as I type the above it finally hits me.

``````
mat = (particle_point_mat * rotmat * scalemat) * group_object.matrix_world
``````

Edit:

Removed the comment, I should have noticed the ‘solved’ tag!