Stop Motion OBJ (OBJ, STL, PLY sequence importer) - v2.1.1

After nearly 5 months of work, I’m proud to announce the latest release of Stop Motion OBJ, version 2.1.0!

New features include:

  • Streaming sequences (This lets you work with very large mesh sequences that wouldn’t fit entirely in memory)
  • Improved sequence import experience (FYI, it’s moved from “Add > Mesh” to “File > Import”)
  • Greatly expanded sequence import options (specify most OBJ and STL import options for the whole sequence)
  • Many reductions in memory usage
  • Automatic mesh sequence naming
  • Many bug fixes and lots of code cleanup

Thanks to all who donated to the project, submitted bugs, and suggested features.

If you find this add-on helpful, please consider donating to support development:
PayPal: https://www.paypal.me/justinj
Ko-fi: https://ko-fi.com/stopmotionobj


Hey guys - just discovered your plugin. Thanks for making this.

Does anyone have a documented workflow for getting from an OBJ sequence into Unreal Engine via Alembic? Right now I’m baking the sequence using the tool and attempting to export to Alembic. When I import into Unreal I get multiple single-frame Alembic animations rather than one continuous animation.


Hi David, see this section in the Stop Motion OBJ Wiki for some hints for exporting to Alembic: https://github.com/neverhood311/Stop-motion-OBJ/wiki#exporting-to-alembic-andor-fbx-doesnt-work

It’s not officially supported, but some have had luck using this workaround.


Thanks! I’ll have a look!

Hello everyone

I’m trying to export my sequence as an Alembic animation. I can successfully import the sequence and play it inside Blender, but when I export to Alembic, every mesh ends up in the first frame. I’m generating my OBJ sequence from Grasshopper, so I thought I should share the geometry in case the problem lies there.

Thank you.

https://drive.google.com/file/d/1kZ8A7pjKP0Jp_qZmiJ6q8jCN2wbnMYZB/view?usp=sharing

Hi @Accretence, see this section in the Stop Motion OBJ Wiki for some hints for exporting to Alembic: https://github.com/neverhood311/Stop-motion-OBJ/wiki#exporting-to-alembic-andor-fbx-doesnt-work

It’s not officially supported, but some have had luck using this workaround.


Hi @neverhood311, I’ve followed all the steps that others have described and I’m trying to pinpoint what is preventing me from reaching the same results.

I would greatly appreciate it if anyone could give it a try with the OBJ sequence that I have uploaded so that I can safely assume the problem is not with my source files.

@Accretence Interesting. I see what you mean.

However, I tried the same workaround with Blender 2.83.8 LTS and it worked. The Alembic file imported as a single mesh sequence, not as all of the meshes on frame 1. If possible, download 2.83.8 LTS and try the workaround there. They must have changed something about the Alembic exporter in Blender 2.90.


@neverhood311 It worked in 2.83.8 LTS !!! Thank you so much!


I love the way this add-on is developing. I have a use case and I’m hoping this will fit it:

I am working on a facial mocap product that will produce obj sequences in realtime. The polygon count is low enough and the topology is consistent, so I’m sure this won’t stress the system. The issue is the software also produces displacement maps each frame for details and wrinkles. I need a feature that allows me to stream these maps into a displacement shader for the geometry. I know it’s a long shot, but is it possible?

Hi Jimmy,

Just so I understand, let me describe the use-case that I think you’re trying to accomplish. You have a tool (somehow integrated into Blender?) which captures mocap data and generates meshes and displacement maps in real time. You want to feed those meshes and maps to Blender (via some sort of Stop Motion OBJ API?), then use Stop Motion OBJ to play the sequence back?
Is this accurate? Any bells or whistles?

I just want to get an idea of the scope before I commit to anything.

My tool is not in Blender. It is a machine learning data tool created in Python. It outputs head meshes as OBJ files in realtime. I would like to know if your app can read those files as quickly as the tool can output them along with the displacement map so I can render to Eevee. No bells or whistles. It’s for face motion capture.
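Reading frames as fast as the tool writes them boils down to a file-watching loop on the output folder. Below is a minimal polling sketch, pure Python and independent of Blender; the function name is hypothetical and not part of Stop Motion OBJ or the mocap tool:

```python
import os

def poll_new_frames(folder, seen, suffix=".obj"):
    """Return OBJ files that appeared in 'folder' since the last poll.

    'seen' is a set the caller keeps between polls. Hypothetical helper,
    not part of Stop Motion OBJ.
    """
    current = sorted(f for f in os.listdir(folder) if f.endswith(suffix))
    new = [f for f in current if f not in seen]
    seen.update(new)
    return new
```

Each poll returns only the frames written since the previous call, so a playback loop can import them in order as they arrive (a real implementation would also need to guard against reading a file the tool is still writing).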

I would like to know if your app can read those files as quickly as the tool can output them

In its current form, Stop Motion OBJ uses Blender’s built-in OBJ importer script to read OBJ files. It doesn’t make sense for me to write my own importer. So Stop Motion OBJ is currently only as fast as the built-in importer.
I’ve considered adding a quasi-API so that you can create mesh sequences in Python, rather than through the Blender GUI. In that case I would want the calling code to pass meshes as vertex and triangle data, rather than a filepath to an OBJ file, to make importing faster.
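The speed difference between the two paths comes down to what an OBJ file actually contains: every vertex is ASCII text that must be re-parsed into floats on each import, whereas an API call could hand over finished arrays. A minimal sketch of the parsing side, in pure Python, handling only `v` and triangular `f` lines (an illustration only, not the importer Blender or Stop Motion OBJ uses):

```python
def parse_obj(text):
    """Parse a minimal OBJ string into (vertices, faces).

    Handles only 'v x y z' and triangular 'f a b c' lines; ignores
    normals, UVs, materials, and negative indices for brevity.
    """
    verts, faces = [], []
    for line in text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":
            verts.append(tuple(float(p) for p in parts[1:4]))
        elif parts[0] == "f":
            # OBJ indices are 1-based; 'f 1/1/1 2/2/2 3/3/3' style
            # entries keep only the vertex index before the first slash
            faces.append(tuple(int(p.split("/")[0]) - 1 for p in parts[1:4]))
    return verts, faces

# One triangle, as a single frame of a low-poly sequence might store it
verts, faces = parse_obj("v 0 0 0\nv 1 0 0\nv 0 1 0\nf 1 2 3\n")
```

A vertex/triangle API would skip all of this text handling and pass `verts` and `faces` (plain float and index tuples) to Blender directly.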

…along with the displacement map…

Correct me if I’m wrong, but I don’t think you can import an OBJ file with a displacement map. You have to import the OBJ file, then manually (by clicking or with Python) create a displacement modifier or displacement shader. Since I’m using the built-in OBJ importer, this is not possible with Stop Motion OBJ in its current state. Again, it might be possible to add an API function which accepts a fully-formed Blender material and applies it to a given mesh in the sequence.

…so I can render to Eevee.

I’m still unclear here. Are you planning on keeping the meshes around after you’ve captured a sequence? Or do you just want to import a mesh, apply its displacement map, quickly render it to the screen with Eevee, then discard the mesh and move onto the next one?

That was the big question. I know I can describe a displacement map in the MTL file, but I don’t know enough about Blender shaders (I’m a Maya guy) to know whether that’s of any use to Eevee. So it’s a no-go on this, and basically a no-go for my idea.

My purpose for using your add-in was just to render the sequences in realtime. I can use the meshes as alembic or USD in my standard Maya pipeline, but I wanted to see if I could adapt it for real-time facial motion capture and rendering. Like I said, I’m not familiar with Blender but Maya doesn’t officially support mesh streaming to the viewport. Oh well, I knew it was a long shot.

Hi guys,

Not sure why, but one of my sequences isn’t loading into Blender correctly. I’m getting some empty frames.
Has anyone had a similar problem?

Really need some help …

Thanks

Hi @artPen, could you provide some more details?
Which version of Stop Motion OBJ are you using?
What operating system?
Which file type? (obj, ply, or stl?)
Are you doing a Cached or a Streaming sequence?
Maybe send a few screenshots?

It’s really hard to help without some of this information.

Hi @neverhood311
I’m exporting an .obj sequence from Rhino. I’m using Blender 2.91.2 and plug-in version 2.1.0.
I’m on macOS.
I’ve tried both the Cached and the Streaming options.

I have 8 different .obj sequence sets, each 200 frames long. Seven of them are fine; only the 8th has problems.

If I open the .obj files that don’t show up in the sequence individually, they load fine, so the files themselves are OK.

Here are some screenshots

Frame 39 - all good


Frame 40 - some mesh information is missing

Thanks for all the info.

That’s very strange. I’m assuming you’ve tried importing the sequence multiple times? Does it fail to load the same meshes each time?

Also, would it be possible to send a few meshes so I can take a look on my end? Maybe, for example, tree_38, tree_39, and tree_40?

Thank you.
Yes, I will. Let me try something different first; I might be having problems on the Rhino / Grasshopper side while exporting the .objs.
Will let you know in a bit…