Okay, I was able to modify my script to export the dust particles, but the Alembic workflow suggested by scorpion81 is a better, more streamlined solution.
Only problem is that the materials don't export along with it, or maybe the Alembic import does not support materials. Does anyone have a workaround for this?
For the materials to be exported, check the "Face Sets" option (and maybe "Vertex Colors" too) in the exporter. Textures are indeed not exported (or not re-imported), but the materials and their assignments to the faces should be. And there seems to be one serious issue with the UVs: they look broken after re-assigning the textures (or the re-assign failed somehow).
I noticed this after testing it myself, after you mentioned it might be troublesome… Hmm, I should have tested this before.
Okay, the materials get exported now, very cool, but the animation from the Alembic file does not get exported to FBX; only the first frame does. I tried applying the Mesh Sequence Cache modifier, but all that does is freeze the current frame of the animation.
That said, the Alembic workflow is definitely the more straightforward way to export this type of animation to external applications like game engines and the like; I only need to solve that last piece of the puzzle!
Hi,
did you set the end frame in the Alembic exporter? Setting it to frame x will make the FM (and all other simulations) run in the background while the exporter runs (may take a little while). Then, after importing the .abc file into Blender, just press Alt+A and the animation coming from the .abc file should play back.
Furthermore, the UV issue is a bug in the Alembic importer; I filed a bug report here: https://developer.blender.org/T53572 and also provided a possible fix. I will commit this to the FM branch later anyway.
Edit: aaah, I seem to have missed the part about the .abc -> .fbx conversion not wanting to work… hmmm.
Yes, I did set the end frame. When I import the .abc file back into Blender it plays perfectly; the problem is that when I export to FBX, only the first frame of animation is exported. It seems that the FBX exporter does not support the Mesh Sequence Cache modifier, so I am looking for a workaround, but I have found none so far!
It seems that FBX can only store object-based and shape-key-based animation, and OBJ doesn't support animation at all, so you are a bit stuck there. It looks like you have to go with my first idea: export the particles as shape keys and reanimate them in the target software. However, I have no idea whether that is even possible with this particular software.
Otherwise you need to convert the particles to real objects and export those. The bad thing is that while Blender can make particles "real", doing so drops the animation, so you still need a script to convert the animation curves onto the new particle objects. Blender can only handle a limited number of objects, so if you have more than 20k particles or so you are going to run into lag with this method.
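As a rough illustration of the "convert the animation onto real objects" idea, here is a plain-Python sketch (no bpy): sample each particle's transform once per frame into a per-object keyframe table that a script could later turn into real F-curves. The Particle class and bake_keyframes function are stand-ins for illustration, not a real Blender API.

```python
# Sketch: bake per-frame particle states into per-object keyframe tables.
# Particle is a hypothetical stand-in for Blender's particle data.

from dataclasses import dataclass

@dataclass
class Particle:
    location: tuple    # (x, y, z)
    rotation: tuple    # quaternion (w, x, y, z)
    size: float
    alive: bool

def bake_keyframes(frames):
    """frames: {frame_number: [Particle, ...]} ->
    {particle_index: {frame_number: property dict}}"""
    tables = {}
    for frame, particles in sorted(frames.items()):
        for i, p in enumerate(particles):
            tables.setdefault(i, {})[frame] = {
                "location": p.location,
                "rotation_quaternion": p.rotation,
                "scale": (p.size,) * 3,
                "hide": not p.alive,
            }
    return tables

# Two frames of a single particle falling along -Z:
baked = bake_keyframes({
    1: [Particle((0, 0, 2), (1, 0, 0, 0), 0.1, True)],
    2: [Particle((0, 0, 1), (1, 0, 0, 0), 0.1, True)],
})
print(baked[0][2]["location"])  # (0, 0, 1)
```

In a real script, each inner dict would become one `keyframe_insert` call per property at that frame.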
I can already convert regular particles and mesh with a script so they can be exported to FBX, but it does not work with the dust particles created by the Fracture build for some reason.
Hmm, what about the Particle Instance modifier? It duplicates its base mesh across the given particle system, so each particle becomes a copy of the original mesh inside one object. In case iClone can use .obj sequences, Blender's .obj exporter can produce such a sequence (one .obj / .mtl per frame) with its "Animation" option. You could do the same with the FM shards (exporting to .obj). It might be a bit unhandy to have many little files, but it could work as a workaround.
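To make the .obj-sequence idea concrete, here is a minimal pure-Python sketch of writing one .obj file per frame, similar in spirit to what the exporter's "Animation" option produces. write_obj_sequence, its naming pattern, and the sample point data are illustrative assumptions; in practice the vertex data would come from the evaluated mesh at each frame.

```python
# Sketch: write one minimal .obj per frame (vertices only, no faces/materials).
import os
import tempfile

def write_obj_sequence(basename, frames, out_dir="."):
    """frames: list of per-frame vertex lists.
    Writes basename_000001.obj, basename_000002.obj, ... and returns the paths."""
    paths = []
    for i, verts in enumerate(frames, start=1):
        path = os.path.join(out_dir, "{}_{:06d}.obj".format(basename, i))
        with open(path, "w") as f:
            for x, y, z in verts:
                f.write("v {} {} {}\n".format(x, y, z))
        paths.append(path)
    return paths

out_dir = tempfile.mkdtemp()
paths = write_obj_sequence("dust", [
    [(0.0, 0.0, 2.0)],   # frame 1: one vertex
    [(0.0, 0.0, 1.0)],   # frame 2: the same vertex moved down
], out_dir=out_dir)
print(os.path.basename(paths[0]))  # dust_000001.obj
```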
import bpy

# Set these to False if you don't want to key that property.
KEYFRAME_LOCATION = True
KEYFRAME_ROTATION = True
KEYFRAME_SCALE = True
KEYFRAME_VISIBILITY = True  # Viewport and render visibility.

def create_objects_for_particles(ps, obj):
    # Duplicate the given object for every particle and return the duplicates.
    # Use instances instead of full copies.
    obj_list = []
    mesh = obj.data
    for i, _ in enumerate(ps.particles):
        dupli = bpy.data.objects.new(
            name="particle.{:03d}".format(i),
            object_data=mesh)
        bpy.context.scene.objects.link(dupli)
        obj_list.append(dupli)
    return obj_list

def match_and_keyframe_objects(ps, obj_list, start_frame, end_frame):
    # Match and keyframe the objects to the particles for every frame in the
    # given range.
    for frame in range(start_frame, end_frame + 1):
        bpy.context.scene.frame_set(frame)
        for p, obj in zip(ps.particles, obj_list):
            match_object_to_particle(p, obj)
            keyframe_obj(obj)

def match_object_to_particle(p, obj):
    # Match the location, rotation, scale and visibility of the object to
    # the particle.
    loc = p.location
    rot = p.rotation
    size = p.size
    vis = (p.alive_state == 'ALIVE')
    obj.location = loc
    # Set rotation mode to quaternion to match particle rotation.
    obj.rotation_mode = 'QUATERNION'
    obj.rotation_quaternion = rot
    obj.scale = (size, size, size)
    obj.hide = not vis
    obj.hide_render = not vis

def keyframe_obj(obj):
    # Keyframe location, rotation, scale and visibility if specified.
    if KEYFRAME_LOCATION:
        obj.keyframe_insert("location")
    if KEYFRAME_ROTATION:
        obj.keyframe_insert("rotation_quaternion")
    if KEYFRAME_SCALE:
        obj.keyframe_insert("scale")
    if KEYFRAME_VISIBILITY:
        obj.keyframe_insert("hide")
        obj.keyframe_insert("hide_render")

def main():
    # Assume only 2 objects are selected.
    # The active object should be the one with the particle system.
    ps_obj = bpy.context.object
    obj = [obj for obj in bpy.context.selected_objects if obj != ps_obj][0]
    ps = ps_obj.particle_systems[0]  # Assume only 1 particle system is present.
    start_frame = bpy.context.scene.frame_start
    end_frame = bpy.context.scene.frame_end
    obj_list = create_objects_for_particles(ps, obj)
    match_and_keyframe_objects(ps, obj_list, start_frame, end_frame)

if __name__ == '__main__':
    main()
You first select the mesh used for the particles, then the object with the particle system (so it is active), then you run the script!
Scorpion81 and I have looked into it. You used only the first particle system on the object, but the FM helper generates more than one. The easiest fix was to pick the active particle system (the selected slot) from the object instead, so you only have to select the one you wish to convert beforehand.
# This script converts a particle system into keyframed, individually animated objects.
# Assume only 2 objects are selected; the active object should be the one
# with the particle system. The source object's active particle slot will be used.
import bpy

# Set these to False if you don't want to key that property.
KEYFRAME_LOCATION = True
KEYFRAME_ROTATION = True
KEYFRAME_SCALE = True
KEYFRAME_VISIBILITY = True  # Viewport and render visibility.

def create_objects_for_particles(ps, obj):
    # Duplicate the given object for every particle and return the duplicates.
    # Use instances instead of full copies.
    obj_list = []
    mesh = obj.data
    for i, _ in enumerate(ps.particles):
        dupli = bpy.data.objects.new(
            name="particle.{:03d}".format(i),
            object_data=mesh)
        bpy.context.scene.objects.link(dupli)
        obj_list.append(dupli)
    return obj_list

def match_and_keyframe_objects(ps, obj_list, start_frame, end_frame):
    # Match and keyframe the objects to the particles for every frame in the
    # given range.
    for frame in range(start_frame, end_frame + 1):
        bpy.context.scene.frame_set(frame)
        for p, obj in zip(ps.particles, obj_list):
            match_object_to_particle(p, obj)
            keyframe_obj(obj)

def match_object_to_particle(p, obj):
    # Match the location, rotation, scale and visibility of the object to
    # the particle.
    loc = p.location
    rot = p.rotation
    size = p.size
    vis = (p.alive_state == 'ALIVE')
    obj.location = loc
    # Set rotation mode to quaternion to match particle rotation.
    obj.rotation_mode = 'QUATERNION'
    obj.rotation_quaternion = rot
    obj.scale = (size, size, size)
    obj.hide = not vis
    obj.hide_render = not vis

def keyframe_obj(obj):
    # Keyframe location, rotation, scale and visibility if specified.
    if KEYFRAME_LOCATION:
        obj.keyframe_insert("location")
    if KEYFRAME_ROTATION:
        obj.keyframe_insert("rotation_quaternion")
    if KEYFRAME_SCALE:
        obj.keyframe_insert("scale")
    if KEYFRAME_VISIBILITY:
        obj.keyframe_insert("hide")
        obj.keyframe_insert("hide_render")

def main():
    # Assume only 2 objects are selected.
    # The active object should be the one with the particle system.
    ps_obj = bpy.context.object
    obj = [obj for obj in bpy.context.selected_objects if obj != ps_obj][0]
    ps = ps_obj.particle_systems.active  # Only convert the selected particle system.
    start_frame = bpy.context.scene.frame_start
    end_frame = bpy.context.scene.frame_end
    obj_list = create_objects_for_particles(ps, obj)
    match_and_keyframe_objects(ps, obj_list, start_frame, end_frame)

if __name__ == '__main__':
    main()
Thanks to both of you, it seems to be working, but exporting takes a while; I will get back to you as soon as the export finishes. Thanks again!
EDIT: It's been over an hour with 2200 particles and it's still not done; I will let it run for a bit and see!
EDIT: I left it running all night and it still hasn't saved, so there is definitely a problem here. No time to figure it out during the week, so I just might use Unreal to render the whole thing, or maybe Eevee, even though it is very early!
I ran into a problem rendering an FM animation over the network using either Deadline 10 or a Blender addon called Crowdrender.
If I render with Deadline, the fractured objects get "distorted".
If I render with Crowdrender, it does not get distorted; it actually does nothing at all. There are no shards flying around, no particles being emitted, no smoke, etc., just the 3 walls sitting there for 250 frames, but in the viewport everything works.
The simulation is completely baked. If I just hit F12 it renders fine too, and in the viewport it looks like it should.
Hmm, that looks like a serious issue. I would need to look at that blend file, and also investigate how Deadline 10 and Crowdrender work. I guess Deadline 10 has either some old FM build online (if any) or uses some kind of linking / appending / proxy mechanism. If it is rendered outside Blender, it needs to be converted somehow, or do you just upload your blend there? Not sure about Crowdrender either…
Can one maybe test it as a free trial version or so?
Or there is still some hidden issue with baking, converting to objects, etc… Hmm, maybe some loading issue, or a manual Execute Fracture may be missing in the automated process of rendering on a render farm.
I have not tested rendering FM data on render farms myself yet, but dafassi did with a few of them. We could not figure out back then what the issues were.
Edit: the distortion looks like an older baking problem, where the order of the shards inside the fractured mesh could get messed up. The culprit could be an old FM build. And Crowdrender somehow seems not to recognize the FM, possibly because the blend is passed to a render farm which only has an official build without FM (there, nothing will happen). These are only first guesses; I would need to look closer at this.
Hey Scorpion81, thanks for your reply. In Deadline you set up links to the blender.exe on each machine; it is not a service like SheepIt, so Deadline itself does not render at all, it just sends the blend file over the local network and the rendered frames back. The only part that could cause this is the plugin (submitter) you need to install inside Blender; this plugin is provided by Deadline and creates the job that gets submitted to Deadline and distributed over the network.
Crowdrender is more like V-Ray's DR plugin: it cuts the frame into different chunks, sends the chunks to the nodes, and after they are done rendering, it puts the frame back together (at least that's how it looks). I used the latest FM build for all nodes. Crowdrender is in early development and the developer told me that it doesn't work with "bleeding edge" Blender builds (the Mantaflow build, for example, doesn't work at all right now with Crowdrender).
I found a workaround by converting all fractured objects to keyframed objects, but messing around with all the generated particle systems is no fun, so I would love to be able to run it in a more direct way without the need for converting.
We are a bit confused about what the problem could be. I have rendered many videos in multi-system setups with my own network scripts and this usually works. In general we take care that our features are render-farm safe.
So I have to ask again for a .blend file causing this error. Please also provide the hash of the Fracture Modifier Blender build you are using, which can be found on the splash screen (just to make sure).
If nothing works you could export the mesh as an .mdd point cache and use a scene with the FM object replaced by the cached one. This definitely should work, but we would rather fix this in FM.
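For reference, the .mdd point-cache format mentioned above is quite simple, which is part of why it is render-farm safe. A rough pure-Python sketch of a writer, assuming the common NewTek MDD layout (big-endian: two ints for frame and point counts, one float timestamp per frame, then x/y/z floats for every point of every frame); write_mdd and the sample data are illustrative, not Blender's actual exporter code.

```python
# Sketch: write a minimal .mdd point cache (big-endian layout).
import os
import struct
import tempfile

def write_mdd(path, frames_of_points, fps=25.0):
    """frames_of_points: list of frames, each a list of (x, y, z) tuples.
    All frames must have the same number of points (deforming, not changing topology)."""
    num_frames = len(frames_of_points)
    num_points = len(frames_of_points[0])
    with open(path, "wb") as f:
        # Header: total frames, total points.
        f.write(struct.pack(">2i", num_frames, num_points))
        # One timestamp per frame, in seconds.
        for i in range(num_frames):
            f.write(struct.pack(">f", i / fps))
        # Per-frame vertex positions.
        for frame in frames_of_points:
            for x, y, z in frame:
                f.write(struct.pack(">3f", x, y, z))

path = os.path.join(tempfile.mkdtemp(), "cache.mdd")
write_mdd(path, [
    [(0.0, 0.0, 2.0)],   # frame 1: one point
    [(0.0, 0.0, 1.0)],   # frame 2: the same point moved down
])
```

A cache like this can then drive a static copy of the mesh, so the render nodes never need to run the simulation themselves.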