Alembic I/O.

If anyone’s lamenting the fact that this doesn’t work for volumes, it turns out that converting your volume to a point cloud and rendering it with the point density texture works pretty well:


Obviously, the recreation isn’t perfect, but it’s a Cycles volume in more or less the same shape as the pyro sim. For this test I used a fairly low number of points, jittered them, and added some turbulence in Cycles. It might be a bit more accurate to not use the noise (in both apps) and use a lot more points with a smaller radius. But consider this the “fast and pretty approximation” method.
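The volume-to-points conversion described above can be sketched in plain Python. This is only an illustration of the idea (jittered points per voxel, count scaled by density), not Houdini’s or Blender’s actual API; every name and number here is mine.

```python
import random

def volume_to_points(density, voxel_size=1.0, points_per_unit_density=8, seed=0):
    """Scatter jittered points inside occupied voxels, with the point count
    per voxel proportional to its density value.

    `density` maps integer voxel coordinates (i, j, k) -> float density.
    """
    rng = random.Random(seed)
    points = []
    for (i, j, k), d in density.items():
        n = int(round(d * points_per_unit_density))
        for _ in range(n):
            # Jitter uniformly inside the voxel so the cloud shows no grid pattern.
            points.append((
                (i + rng.random()) * voxel_size,
                (j + rng.random()) * voxel_size,
                (k + rng.random()) * voxel_size,
            ))
    return points

# Two voxels: the denser one receives proportionally more points.
cloud = volume_to_points({(0, 0, 0): 1.0, (1, 0, 0): 0.25})
```

Using more points with a smaller radius (and no jitter/noise) would trade speed for accuracy, exactly as described above.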

On a side note, does anyone more experienced with Houdini know of a way to get a mesh from the pyro container bounds? That way we don’t have to hand-make a new container mesh in Blender. My google-fu is struggling (lots of articles on meshing the volume plume itself, rather than making a cube mesh from the container bounds).

BUG EDIT: I spoke too soon! Tried it with a sequence, and it imports into Blender fine, but Cycles seems to want to use the bounding box from frame 1 for every frame. Here’s a test .blend with alembic file: https://dl.dropboxusercontent.com/u/1706676/cycles_point_pyro.zip

The crash when opening Blender was solved by resetting to factory settings, but curves still crash when exporting. https://www.sendspace.com/file/vhelxa

J_the_Ninja - this is a great workflow! I really hope it gets supported, as it is the last piece of the puzzle for transferring FX. Currently, in many cases you have to render in two applications and combine the passes; this would already solve a great part of that problem. Does it also read the color data (@Cd)? So fire and other sims would be possible?

@Bounds - there are a Volume Bound node and a Bound node which should do exactly what you need. You can also directly link your DOP_I/O node to a Box primitive, which will give it the bounds of the simulation. This is probably the simplest way. Lastly, you can use the bbox("…/node", …) expression.
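For anyone scripting this instead of using nodes, the container bounds are just a per-axis min/max over the cached points. A minimal stand-alone sketch (the helper name and the padding parameter are mine, not a Houdini or Blender API):

```python
def container_bounds(points, padding=0.0):
    """Axis-aligned bounding box of a point cloud; `padding` expands it
    slightly so the container safely encloses every jittered point."""
    xs, ys, zs = zip(*points)
    lo = (min(xs) - padding, min(ys) - padding, min(zs) - padding)
    hi = (max(xs) + padding, max(ys) + padding, max(zs) + padding)
    return lo, hi

lo, hi = container_bounds([(0, 0, 0), (2, 1, 3), (-1, 4, 0.5)], padding=0.1)
```

The two corners are enough to place a cube mesh as the container in Blender.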

Hey J, cool idea. Q: why don’t you just use OpenVDB?

OpenVDB would be the best option, of course (for volumes), but currently it seems very difficult, if not impossible, to export to a structure that Blender accepts. The point-cloud trick would enable things like foam on waves, particle debris, some smoke, etc. For big simulations it might end up too inefficient and inaccurate. It would be a massive deal to have VDB data read by Blender, as then practically any simulation data could be imported into Blender with no reliance on external rendering. I am not sure, though, if this is a topic for Alembic, as .vdb is the best-accepted method.

Here’s an interesting video (vdb + alembic): https://vimeo.com/108536087

As far as OpenVDB is concerned, I would like a volume object in Blender that can hold VDB volumes. I already tried implementing one, and I just made a video to showcase it (really simple, first implementation):

When saving the file, the volumes are written to .vdb files, which are then packed into the .blend file. So you can retrieve them externally by unpacking the .blend file, without the need for an exporter (though one can still be useful).

It could be interesting at some point to make a thread for this, with some test builds, to see if we can implement something decent in Blender. It would be unofficial though, i.e. it might never make it into an official Blender release (though I know a few developers who like the idea of a volume object in Blender, especially considering what you can do with OpenVDB).

So, does the example scene export curves, or is it my local problem?

@KWD this is VERY impressive! Your paradigms sound solid and I really hope to see it happen one day. It’s irreplaceable for FX work, as well as one of the best modeling tools available. People are just not familiar with its capabilities yet.

There were a couple of crashes, should be fixed now.

@cgstrive - Thanks, a Bound SOP attached to the cache seems to do the job perfectly.

I’ve made a new, better test scene for this, located here: https://dl.dropboxusercontent.com/u/1706676/Pyro_points.zip

For convenience it contains the .abc, although the zip is 168MB as a result (if anyone has a metered connection and a Houdini license and would rather have a version without the .abc, let me know). Houdini (Indie) and Blender scene files are included. I did not include the cache in H, so if you open the .hip you’ll need to find the filecache SOP in the campfire_dop_import object and either resave or disable it.

I tried importing the heat data as vertex colors, but I was unsuccessful. I’m not sure if this is a problem with the Cd data not importing, or if Cycles’ point density doesn’t want to load vertex colors not present in the base mesh. I did try making a dummy vcol layer called “Cd”, but it didn’t work. For whatever reason, though, the bounding box bug from my previous test wasn’t present. Not sure what was causing it, but using the mesh from the alembic (generated from the sim container) seems to work fine and gives the easiest and best-shaped container mesh anyhow.

Exporting the heat data as a second point cloud and then merging them in the Cycles shader (as you would the attributes from Blender’s smoke sim) works great, however:


I’m also trying out the Scatter SOP instead of pointsfromvolume to build the point cloud. Pointsfromvolume doesn’t seem to work on the density grid for some reason; plus it appears to just make a point if a voxel is occupied, whereas Scatter takes the volume density into account?
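The difference between the two approaches can be sketched as rejection sampling against the density field. This is just an illustrative stdlib sketch of density-weighted scattering, not Houdini’s actual Scatter algorithm, and the function name is made up:

```python
import random

def scatter_by_density(density, n_points, seed=0):
    """Rejection-sample `n_points` voxel coordinates so that denser voxels
    receive proportionally more points, unlike a fill that drops exactly
    one point into every occupied voxel."""
    rng = random.Random(seed)
    voxels = list(density.items())
    d_max = max(d for _, d in voxels)
    out = []
    while len(out) < n_points:
        coord, d = voxels[rng.randrange(len(voxels))]
        # Accept with probability proportional to the voxel's density.
        if rng.random() * d_max <= d:
            out.append(coord)
    return out

# A dense voxel ends up with roughly 10x as many points as a thin one.
pts = scatter_by_density({(0, 0, 0): 1.0, (1, 0, 0): 0.1}, 1000)
```

An occupancy-based fill would give both voxels the same single point, which is why the density-weighted version reads as a better smoke approximation.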

Actually, vertex colors are not read when importing a point cloud. I can add that functionality if you want, but for that to work the layer would need to be a color param (c3f); in your test file, however, the layer is a float param, so it would be ignored.

Maybe we could define some UI to list all the properties in an Alembic archive and let the user assign them to the right place on the Blender object. This is open to discussion; if you guys have any ideas on how that could work, or how the UI could be designed, be my guest.

We could also have some granular data reading exposed in the UI, if that’s interesting to you guys:

Depending on how Cycles allocates textures, storing things in the color field instead of in multiple point clouds might require Cycles to use significantly more memory - at least assuming point density nodes with no color data each spawn only a single float texture. I’m not sure if Cycles actually does that, though, or makes one vector4 texture for all the data. So this might not be the way to go anyway.
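A back-of-envelope version of that concern (the allocation model here is an assumption for illustration, not how Cycles is confirmed to work):

```python
def point_texture_bytes(n_points, channels, bytes_per_channel=4):
    """Rough memory for a point texture: one 32-bit float per channel per
    point. Whether the renderer stores a lone float grid or always pads to
    a 4-channel texture is exactly the open question in the thread."""
    return n_points * channels * bytes_per_channel

n = 1_000_000
separate = 2 * point_texture_bytes(n, 1)   # two single-float clouds (density + heat)
padded = point_texture_bytes(n, 4)         # one cloud padded to vector4
```

Under this model, padding to vector4 doubles the footprint of the two-cloud approach (16 MB vs 8 MB per million points), which is why the allocation detail matters.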

In general I am all for giving the user options. The MeshCache modifier would not exist in a big modifier stack, so bloating its UI should be fine. Furthermore, people who work with Alembic are a bit nitty-gritty, so that’s another argument for adding it. Regardless, it would be good to have @Cd, or even a custom attribute derived from the geo (if Cycles can see it with the Attribute* node). As the pyro example shows, they can be invaluable.

While on the topic, would fetching only vertex data (a point cloud) allow for faster viewport playback? Currently I am dealing with several 5-12GB Alembic caches in the scene (thanks to your amazing work!), but playback obviously takes a beating with even one of them showing. Perhaps there is an elegant way to skip some data reading that would translate into performance. Thanks.

Hi, updated Linux build with latest changes from the branch, hash bc44d31.

– PACKAGE-INFORMATION –
Branch: alembic_basic_io
Revision: bc44d31
Submodules: locale 4c06104 addons 84b69e6 addons_contrib 7347cec
OS: GNU/Linux, Architecture: x86_64, GLIBC: 2.19
Builddate: So 24. Jul 12:59:55 UTC 2016
Filesize: 73711092 byte
Shasum: 9c1176c0c837707a834d2ad9a1a4024d7d221eba
URL: http://www.jensverwiebe.de/Blender/blender_alembic_basic_io_linux64_latest.tar.xz

Jens

Nice update! Curve exporting now works perfectly! Thanks.
One question: does the Alembic exporter support animation of the spline bevel factor, as in the last example scene?

OSX updated as well.

– PACKAGE-INFORMATION –
Branch: alembic_basic_io
Revision: bc44d31
Shasum: dd8c520d0c435d52adaa761afd3e9a2f35861345
URL: http://www.linesofjasper.com/blender/blender_alembic_basic_io_osx64_latest.zip

jasper

Finally I got around to doing some tests myself. Unfortunately I have the following problem. I exported some animated shots from an old project done in Maya. As I still have a Maya 2013 license lying around, I exported with that old version. The Alembic files have HDF5 compression, so I converted them to Ogawa with abcconvert. When I import them in Blender, most of the shots (but not all) have a timing issue. The frame range is set to 1 - 5 and the animation plays way too fast (I can manually fix this by using ‘override frame’, and then the animation plays fine).

At work we use Maya 2015, so I was able to import the alembics into Maya 2013, 2015 and also Houdini. There the timing was fine, for both the HDF5 ones and the Ogawa ones. And in Maya the timeline was also set correctly (for one shot the timeline even started on a negative frame, which was no problem at all). If I can find the time I will also try exporting from Maya 2015 (directly with Ogawa compression) and see if the problem persists.

Here are two shots. One with the problem (shot 04) and one without (shot 35b).
Shot 04 should go from 1 - 13 and shot 35b from 1 - 95. (Don’t mind the inverted normals, that is a modeling issue… :o).
http://www.linesofjasper.com/blender/shot_04.abc
http://www.linesofjasper.com/blender/shot_35b.abc

No, it only stores basic data (control points (not the handles!), knots, order…). We could make it work by using some custom properties export, but that would be for another patch I guess.

I don’t see any major timing issues, though there are a couple of minor bugs I’d say. shot_04.abc is 1 frame off (its range is 0 - 12 instead of 1 - 13), and shot_35b.abc’s range is a few frames off (0 - 91 instead of 1 - 95 as noted; maybe the FPS is wrong?).
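Both symptoms fit a time-to-frame mapping issue: Alembic stores sample times in seconds, and an importer maps them to frames via the scene FPS. A toy sketch of that mapping (the function and its parameters are mine, not the actual importer code); note that 95 frames authored at 25 fps would land at about 95 × 24 / 25 ≈ 91 frames when read at 24 fps, which might explain the second range:

```python
def sample_times_to_frames(start_time, n_samples, sample_interval, fps, first_frame=1):
    """Map Alembic time samples (seconds) onto frame numbers at a given
    scene FPS. Mapping time 0 to frame 0 instead of `first_frame` would
    produce exactly the off-by-one range seen with shot_04."""
    frames = []
    for i in range(n_samples):
        t = start_time + i * sample_interval
        frames.append(round(t * fps) + first_frame)
    return frames

# 13 samples at 24 fps starting at t=0:
ok = sample_times_to_frames(0.0, 13, 1 / 24, 24)                  # frames 1..13
off = sample_times_to_frames(0.0, 13, 1 / 24, 24, first_frame=0)  # frames 0..12
```

An FPS mismatch between exporter and importer would similarly compress or stretch the range rather than shift it.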

Just to confirm, I see exactly what Kevin sees too. Imported in Modo as well and everything is the same. Maybe it’s your build?

EDIT: I’ll try your OSX build when I get home later.