I want it to look as close to the first image as possible, this is more important than having the hair physically simulated properly. So how can I fix it?
My initial idea is to stop the object's scale from influencing the hair… but how?
I don’t have any modifiers other than the particle system. Changing the Rotation also didn’t affect what I described in the video, and neither did applying the transforms or scaling via an empty parent instead of the object itself.
If I render it frame by frame, it works; if I render it as an animation, it doesn’t. Disabling Motion Blur didn’t change the issue either, and neither did trying different output formats (PNG, MKV, BMP, etc.).
Workaround:
So I made a small script that renders the animation frame by frame manually and saves each frame to disk. I’ll share it here in case anyone needs it:
import bpy
import os

# Output directory for the rendered frames
output_dir = "C:/tmp/monkey/"

# Frame range to render
frame_start = 0
frame_end = 80

os.makedirs(output_dir, exist_ok=True)

# Render each frame individually and save it to disk
for frame_num in range(frame_start, frame_end + 1):
    print(f"Rendering frame {frame_num}...")
    bpy.context.scene.frame_set(frame_num)
    bpy.ops.render.render()
    # Save the render result as a zero-padded PNG (00.png, 01.png, ...)
    frame_num_str = str(frame_num).zfill(2)
    file_path = os.path.join(output_dir, f"{frame_num_str}.png")
    bpy.data.images['Render Result'].save_render(filepath=file_path)
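One caveat: the two-digit `zfill(2)` only keeps filenames in lexicographic order up to frame 99. A small sketch (the `frame_name` helper is my own, not part of the script above) that derives the padding width from `frame_end` instead:

```python
def frame_name(frame_num, frame_end):
    # Pad with enough zeros for the largest frame number, but at least 2 digits,
    # so files sort correctly (frame_end=80 -> "07.png", frame_end=250 -> "007.png").
    width = max(2, len(str(frame_end)))
    return str(frame_num).zfill(width) + ".png"

print(frame_name(7, 80))   # -> 07.png
print(frame_name(7, 250))  # -> 007.png
```

In the loop above you would then build the path as `os.path.join(output_dir, frame_name(frame_num, frame_end))`.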