Can someone help me understand this a bit

I create a circle:

        segments = 10
        radius = self.radius
        
        #circle= []
        mul = (1.0 / (segments - 1)) * (pi * 2)
        self.circle = [(sin(i * mul) * radius, cos(i * mul) * radius, 0)
                       for i in range(segments)]
        

and use this to rotate the circle and track towards any vector(location):

        points = [Vector(p) for p in self.circle]

        center = sum((Vector(p) for p in self.circle), Vector()) / len(self.circle)

        # get cursor location in local space, although it could be any point!
        # this is what we will orient the normal toward
        
        xform = obj.matrix_world.inverted()
        #target location
        cursor = self.volume_snap

        # get direction from pivot point to cursor
        # ultimately, we want diff and norm to be aligned
        diff = (cursor - center).normalized() 
        
                
        # compute normal
        # we do this by sampling 3 verts 10x, keeping the estimated
        # norm that was computed from the verts with greatest cross
        norm = None
        
        for attempt in range(10):
            # shuffle verts to get random sampling
            random.shuffle(points)
            v0,v1,v2 = points[:3]
            norm_test = (v1 - v0).cross(v2 - v0)
            if norm is None or norm_test.length > norm.length:
                norm = norm_test
            
        norm.normalize()

        # determine which side cursor is on; might need to flip normal
        if diff.dot(norm) < 0:
            # norm was computed pointing away from diff
            # negate norm!
            norm = -norm
        
        
        # compute the quaternion representing the rotational difference between norm and diff
        # we'll use this quat to rotate the points
        rot_quat = norm.rotation_difference(diff)
            
        for i, p in enumerate(self.circle):
            move_result = self.volume_snap + (Vector(p) - center)

            rotate_result = rot_quat @ (Vector(p) - center)

            both_results = self.volume_snap + rot_quat @ (Vector(p) - center)

            self.circle[i] = both_results
         

Moving and rotating individually work perfectly, but how would I go about combining them?
Am I doing something wrong with the math?

I have not gone through all of your code, but just skimming through it I would recommend creating a 4x4 matrix from the cursor position and desired rotation, then simply multiplying each point of your circle by that single transformation. You've got a bit of wonky math happening inside a loop, and that's always a recipe for unexpected results, so this would be a good opportunity to simplify your approach. Also remember that order matters when composing a transform: translation first, then rotation, then scale.

edit: sorry, meant 4x4- you need a location, so a 3x3 matrix would not work for you.
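To make the ordering point concrete, here's a tiny pure-Python sketch (no bpy or mathutils needed; all the names here are invented for illustration) showing that translate-then-rotate and rotate-then-translate land a point in different places:

```python
# Pure-Python sketch (no bpy/mathutils) of why composition order matters:
# translating then rotating is not the same as rotating then translating.
import math

def rotate_z(p, angle):
    """Rotate point p = (x, y, z) around the Z axis by angle (radians)."""
    x, y, z = p
    c, s = math.cos(angle), math.sin(angle)
    return (c * x - s * y, s * x + c * y, z)

def translate(p, t):
    """Translate point p by offset t."""
    return (p[0] + t[0], p[1] + t[1], p[2] + t[2])

p = (1.0, 0.0, 0.0)
offset = (5.0, 0.0, 0.0)
angle = math.pi / 2  # 90 degrees

# rotate first, then translate: this is what (loc @ rot) does to a point
a = translate(rotate_z(p, angle), offset)   # -> (5.0, 1.0, 0.0)

# translate first, then rotate: a different result entirely
b = rotate_z(translate(p, offset), angle)   # -> approximately (0.0, 6.0, 0.0)
```

That's why composing one matrix as `loc @ rot @ scale` and applying it to each point beats doing the pieces separately in a loop: the order is baked in once and can't drift.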


Thank you, I will try that when I get to my laptop.

So is it just adding .to_3x3() at the end of the cursor position?

Well, no, not necessarily. I don't even know what "cursor" is in this context; from the code you posted it just refers to self.volume_snap, and I don't know where that comes from. I'm assuming you're raycasting with either scene.ray_cast or bvh.ray_cast, in which case both return a matrix result (which is unfortunately misleading: it doesn't return the matrix of the hit result, it returns the matrix of the hit object). It's pretty simple to calculate a hit matrix of your own.

What I’m suggesting is to create a brand new 4x4 matrix using your desired rotation and location, then transforming each point of your circle using that single matrix rather than trying to do math in a loop. This is actually the benefit of using matrices, it makes calculating things like this very simple and less error prone.

some quick untested code to create a hit matrix from a hit location and normal:

    # assuming hit_loc and hit_normal are coming from valid raycast results
    loc = Matrix.Translation(hit_loc)
    rot = Vector(hit_normal).to_track_quat('Z')
    hit_matrix = loc @ rot @ Matrix.Scale(1.0, 4)

now you have a hit matrix you can use to transform your circle:


# this assumes you're rebuilding your circle list from scratch each time you need to draw the circle- if you're pre-building the circle during initialization 
# you'll want to make a copy first, otherwise you'll be adding in the local transform every frame and the circle will drift off into space.

for i, p in enumerate(self.circle):
    self.circle[i] = hit_matrix @ Vector(p)

It should be as simple as that! Again, this is all typed out into a forum and not tested or debugged at all, but I've done this enough times that I could probably do it in my sleep by now, so it should work :slight_smile:
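For anyone who wants to see the underlying math without Blender, here is the same idea as a pure-Python sketch (no bpy or mathutils; all helper names are mine): build one rotation that takes the circle's +Z axis onto the hit normal, then apply rotation plus translation to every point.

```python
# Pure-Python illustration of the "single transform" idea: one rotation that
# tracks the circle's +Z axis onto a hit normal, plus one translation to the
# hit location, applied to every point of the circle.
import math

def quat_between(a, b):
    """Quaternion (w, x, y, z) rotating unit vector a onto unit vector b."""
    cx = a[1] * b[2] - a[2] * b[1]
    cy = a[2] * b[0] - a[0] * b[2]
    cz = a[0] * b[1] - a[1] * b[0]
    d = a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
    w = 1.0 + d
    n = math.sqrt(w * w + cx * cx + cy * cy + cz * cz)
    return (w / n, cx / n, cy / n, cz / n)

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q (v' = v + w*t + q.xyz x t)."""
    w, x, y, z = q
    tx = 2 * (y * v[2] - z * v[1])
    ty = 2 * (z * v[0] - x * v[2])
    tz = 2 * (x * v[1] - y * v[0])
    return (v[0] + w * tx + (y * tz - z * ty),
            v[1] + w * ty + (z * tx - x * tz),
            v[2] + w * tz + (x * ty - y * tx))

segments, radius = 10, 2.0
mul = (1.0 / (segments - 1)) * (math.pi * 2)
circle = [(math.sin(i * mul) * radius, math.cos(i * mul) * radius, 0.0)
          for i in range(segments)]

hit_loc = (1.0, 2.0, 3.0)                # pretend raycast hit location
n = (0.0, 1.0, 0.0)                      # pretend hit normal (unit length)
q = quat_between((0.0, 0.0, 1.0), n)     # rotate circle's +Z onto the normal
placed = [tuple(h + c for h, c in zip(hit_loc, quat_rotate(q, p)))
          for p in circle]
```

Every placed point ends up in the plane through hit_loc perpendicular to the normal, at radius distance from the hit, which is exactly what the hit matrix gives you in one multiplication.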


Yes, self.volume_snap is actually this, so I can get the volume:

    hit2, location2, normal, index, object, mat = \
        context.scene.ray_cast(context.view_layer.depsgraph,
                               location1 - 0.001 * normal, -normal)

    if hit2:
        self.volume_snap += (location1 + location2) / 2
        self.volume_snap /= 2

So for hit_normal, would it be just the same?

The hit normal is returned from scene.ray_cast, so in your case all you need is location2 and normal.



I would need to convert the rotation to matrix right?

Yeah, use rot.to_matrix().to_4x4(); I'm pretty sure it will complain about a 3x3 matrix, which is what a quaternion's to_matrix() returns.
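In case it helps to see what that conversion amounts to numerically, here is a pure-Python sketch (no mathutils; the helper names are invented): expand the quaternion into a 3x3 rotation, embed it in a 4x4 so it can carry a translation too, then apply it to a point.

```python
# Pure-Python sketch of what rot.to_matrix().to_4x4() does: a quaternion
# becomes a 3x3 rotation, which is embedded in a 4x4 alongside a translation.
import math

def quat_to_mat3(q):
    """Standard 3x3 rotation matrix from a unit quaternion (w, x, y, z)."""
    w, x, y, z = q
    return [[1 - 2 * (y * y + z * z), 2 * (x * y - w * z),     2 * (x * z + w * y)],
            [2 * (x * y + w * z),     1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
            [2 * (x * z - w * y),     2 * (y * z + w * x),     1 - 2 * (x * x + y * y)]]

def mat3_to_mat4(m, loc=(0.0, 0.0, 0.0)):
    """Embed a 3x3 rotation in a 4x4 and put a translation in the last column."""
    return [[m[0][0], m[0][1], m[0][2], loc[0]],
            [m[1][0], m[1][1], m[1][2], loc[1]],
            [m[2][0], m[2][1], m[2][2], loc[2]],
            [0.0, 0.0, 0.0, 1.0]]

def mat4_apply(m, p):
    """Transform point p = (x, y, z) by 4x4 matrix m, like Matrix @ Vector."""
    x, y, z = p
    return tuple(m[i][0] * x + m[i][1] * y + m[i][2] * z + m[i][3] for i in range(3))

# 90 degrees about Z as a quaternion (w, x, y, z)
q = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
hit_matrix = mat3_to_mat4(quat_to_mat3(q), loc=(10.0, 0.0, 0.0))
result = mat4_apply(hit_matrix, (1.0, 0.0, 0.0))  # -> approximately (10.0, 1.0, 0.0)
```

The 4x4 form is what lets `loc @ rot` work as a single multiplication: the extra row and column hold the translation that a plain 3x3 rotation (or a bare quaternion) can't carry.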

Okay, after trying your solution it didn't want to work, even after creating a copy of the points so it doesn't go all over the place. I was able to find a shorter way to rotate the circle, but why doesn't mixing the vector with the rotation work? They work better on their own:

Here is the raw code:

ob = bpy.context.active_object
matrix = ob.matrix_world

segments = 10
radius = self.radius
mul = (1.0 / (segments - 1)) * (pi * 2)

target = self.volume_snap  # Vector

loc = Matrix.Translation(target)

DirectionVector = mathutils.Vector(target)

rot = DirectionVector.to_track_quat('Z')

circle = [(sin(i * mul) * radius, cos(i * mul) * radius, 0)
          for i in range(segments)]

for i, p in enumerate(circle):
    circle[i] = target + (rot @ Vector(p))

self.circle = circle.copy()

                    

There has to be a way I can just include the rot and location in the circle itself, maybe by adding the target xyz to the circle's sin/cos values; I can already change the size with the radius. That way I could avoid the for loop entirely.
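For the translation half, at least, the target really can be folded straight into the comprehension; a pure-Python sketch (tx/ty/tz standing in for self.volume_snap, which is my naming). Rotation can't be folded into sin/cos the same way, since each point still has to be rotated individually:

```python
# Pure-Python sketch: the translation part of the transform folded directly
# into the circle comprehension. tx/ty/tz stand in for the target location.
import math

segments, radius = 10, 1.0
mul = (1.0 / (segments - 1)) * (math.pi * 2)
tx, ty, tz = 2.0, 3.0, 4.0   # stand-in for self.volume_snap

circle = [(tx + math.sin(i * mul) * radius,
           ty + math.cos(i * mul) * radius,
           tz)
          for i in range(segments)]
```

Every point ends up exactly radius away from (tx, ty, tz), so the loop that only adds the target becomes unnecessary; only the rotation still needs per-point work.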

I then use self.circle to draw it with the shader:

batch = batch_for_shader(shader, 'LINE_LOOP', {"pos": self.circle})
        

You are drawing the circle on the x,y plane using:

circle = [(sin(i * mul) * radius, cos(i * mul) * radius, 0) for i in range(segments)]

and then rotating the z axis of the circle onto a vector pointing towards the target. This is why your circle is pointing the wrong way. It's actually pointing the wrong way regardless of whether you add the target or not.

You should just be able to draw your circle on the z,y plane using:

circle = [(0, sin(i * mul) * radius, cos(i * mul) * radius) for i in range(segments)]

Alternatively, rotate the X axis to align with the DirectionVector, whilst keeping Y pointing up:

DirectionVector.to_track_quat(track='X', up='Y')

I haven't tested it, but I'm making the assumption that to_track_quat works like the "Track To" constraint: the track argument will point along the vector and the up argument will point up (in world space, if your vector represents world-space coordinates).
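A quick pure-Python sanity check of the z,y-plane suggestion (no mathutils; the normal is estimated from three points the same way your cross-product sampling does): every point of such a circle has x == 0, so its plane normal lies along the X axis, ready to be tracked toward a target.

```python
# Pure-Python check: a circle built on the z,y plane has its normal along
# the X axis (sign depends on winding), which is what 'X'-tracking wants.
import math

segments, radius = 10, 1.5
mul = (1.0 / (segments - 1)) * (math.pi * 2)
circle = [(0.0, math.sin(i * mul) * radius, math.cos(i * mul) * radius)
          for i in range(segments)]

# estimate the plane normal from the first three points:
# cross(p1 - p0, p2 - p0), as in the cross-product sampling earlier
p0, p1, p2 = circle[:3]
u = tuple(a - b for a, b in zip(p1, p0))
v = tuple(a - b for a, b in zip(p2, p0))
normal = (u[1] * v[2] - u[2] * v[1],
          u[2] * v[0] - u[0] * v[2],
          u[0] * v[1] - u[1] * v[0])
# normal's y and z components are exactly zero; only x is nonzero
```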