100 Million Poly Skeleton to Internet -- Workflow suggestions?

I’m working with a university team that is digitizing a large skeleton, and we will be uploading it to Sketchfab when ready. I’m looking for suggestions on how to improve my workflow for the textures, normals, etc.

The scan consists of over 100 bones, all scanned at very high resolution.

The average SINGLE bone consists of around a million polygons; Sketchfab says it’s good to stick to a million polys TOTAL to stay viewable on a mobile device.
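To put numbers on that tension, here’s the rough budget arithmetic (my 1k-per-bone target below leaves headroom under the per-bone ceiling this implies):

```python
# Rough polygon-budget arithmetic for the numbers above.
bones = 100
high_polys_per_bone = 1_000_000   # per-bone scan density
mobile_budget = 1_000_000         # Sketchfab's suggested mobile total

high_total = bones * high_polys_per_bone               # 100 million source polys
per_bone_budget = mobile_budget // bones               # 10,000 polys per bone
keep_fraction = per_bone_budget / high_polys_per_bone  # keep only 1% of detail

print(high_total, per_bone_budget, keep_fraction)
```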

One successful workflow I’ve managed is to create a low-poly version on a per-bone basis (say 1k polys) and bake the high-poly detail to low-poly normals etc. in Substance Painter. When I bring that into Blender it looks really good, and I can then delete the high-res bone and replace it with the low-poly bone plus its baked normals.

The challenge, though, is that I’m afraid 100-plus materials will slow down Sketchfab. Is that correct? If my model was under 500k polys, would having 100 different PBR texture sets (normal / roughness etc. for each bone) be a problem? I’ve heard of texture atlases, but can one create a texture atlas from pre-baked normal / roughness maps? Some kind of post-combine step?

Given the density of the models, I have to bake the normals on a per-bone basis, but I don’t know how (or if) I can merge those 100+ texture sets to make the model more appropriate for real-time / internet environments.

Imagine you had a scan of the dinosaur in the embedded image. How might you go about making it look good in a real-time environment?

Do you have any suggestions on proper workflow when dealing with huge assemblies like this?


See if Sketchfab supports LOD. If they do, try to figure out how they stream the LODs.


I wouldn’t be surprised, but the best way to find out is to test.

Yup. For roughness, diffuse color, etc., atlasing is trivial; you almost can’t get it wrong.

For normals, you can still atlas; you just have to be careful that you don’t rotate the islands. Easy enough to do in Blender via Ctrl A / Pack Islands, which has a Rotate checkbox on the operator panel. Multiple-object editing makes texture atlasing a breeze; you don’t even have to join your objects if you don’t want to.

But of course, you can also rebake your original normals etc. directly to your packed UVs instead. Baking high-to-low directly to packed UVs is probably better than playing a game of image telephone: each bake tends to blur the texture slightly, and you don’t want to do 50 bakes-of-bakes.


Hey @ArtStudent!

I happen to work at Sketchfab, glad you asked :slight_smile:

We’re actually developing a streaming/LOD solution for our web viewer, you can learn more about it here:

100 million polygons will be no problem, and this could be a great dataset for us to test. Please note that this type of model can only contain color texture maps, no PBR; the model needs to be provided as an OBJ plus texture maps. When do you expect to have your model ready?

As for decimating/baking normals: yes, 100 materials is definitely too many. You need to pack them into a smaller number of atlases; most photogrammetry software will do this for you automatically.


Hey thanks for the reply @bandages
Just to clarify: if I’ve already baked the normals, and say I have 100 different normal texture files that I’ve imported one by one and set up in Blender, can I then recombine them into an atlas? Like one 4K map, or maybe something with the new UDIM support? The challenge with baking everything from high to low res at once is that there’s way too much computation / too many polygons to do it all in one pass, so I have to bake the million-polygon objects separately, see?
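To make concrete what I mean by recombining: as I understand it, each pre-baked 0–1 map would land in one cell of a grid, and that bone’s UVs would be scaled and offset the same way. A sketch of the arithmetic (helper names are mine, not Blender API):

```python
import math

def grid_size(n_maps: int) -> int:
    """Smallest square grid that fits n_maps cells (100 maps -> 10x10)."""
    return math.ceil(math.sqrt(n_maps))

def atlas_cell(index: int, cols: int) -> tuple:
    """(col, row) of the index-th map in the grid."""
    return index % cols, index // cols

def remap_uv(u: float, v: float, index: int, cols: int) -> tuple:
    """Map a 0-1 UV coordinate into its bone's cell of the atlas."""
    col, row = atlas_cell(index, cols)
    return (u + col) / cols, (v + row) / cols

cols = grid_size(100)               # 10
print(remap_uv(0.5, 0.5, 0, cols))  # centre of the first cell: (0.05, 0.05)
```

One caveat with 100 bones: a 10×10 grid of 1K bakes would be a 10240-px image, so in practice each bake would presumably get downsampled when pasted (e.g. to 512 px per cell for a 5K atlas, or ~409 px for 4K).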

@bartv thanks so much for chiming in. This is very helpful info!
The model will likely be ready within the month, but one of the main things it rests on is the workflow, which is really quite new to me.

Can you think of any immediate pros / cons of the new streaming tool with just textures vs. the more traditional PBR workflow? We’d like the best quality possible for the skeleton, while allowing as many people as possible to view it. In that sense I’m looking for the magic-bullet ratio of top quality to maximum lightness!

One of the big challenges we’re facing is the sheer density of the model. Given that each bone is a million polygons, I’m unable to process everything to one albedo / normal map; I have to do them one at a time, meaning I have hundreds of maps (hence my question above to @bandages).

I’m working from the guidelines of ‘max 1M polys, and as few maps as possible’ so far, but if you have any more specific suggestions I’d be grateful :smiley:

Neither the low poly nor the high poly needs to be a single object to atlas. It’s probably worth joining and separating sometimes just to simplify some operations, but that’s only to reduce the workload.

Here, consider this: join all your low-poly bones into a single object, add a new UV map, then separate by parts. That’s easier than adding a new UV map to each individual bone. Or you could script it instead.

You can get your atlased UVs with everything separate or with everything joined; it doesn’t matter. If separate: select all your low-poly bones, enter Edit Mode, select all in the UV editor, Ctrl A, Ctrl P, and on the operator panel disable rotation. (For normals, assuming they’re tangent-space.)
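Scripted, those steps look roughly like this (a sketch assuming Blender 2.8x; the bpy import is guarded so the snippet also parses outside Blender):

```python
try:
    import bpy
    IN_BLENDER = True

    # Select every mesh (all the low-poly bones) and edit them together.
    for obj in bpy.context.scene.objects:
        obj.select_set(obj.type == 'MESH')
    bpy.ops.object.mode_set(mode='EDIT')
    bpy.ops.mesh.select_all(action='SELECT')

    # Pack all islands into one 0-1 square. rotate=False keeps
    # tangent-space normal maps valid after the repack.
    bpy.ops.uv.pack_islands(rotate=False, margin=0.01)

    bpy.ops.object.mode_set(mode='OBJECT')
except ImportError:
    IN_BLENDER = False  # running outside Blender; nothing to do
```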

With your atlased UVs, you can bake your old colors to the new UV layout, or, if you don’t want a recording-of-a-recording, you can bake selected-to-active using your new atlased UVs. If you do the latter, you don’t have to care about the rotation of islands.
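For the selected-to-active route, the setup looks roughly like this (again a guarded bpy sketch; the cage extrusion value is an illustrative starting point, not a recommendation):

```python
try:
    import bpy
    IN_BLENDER = True

    scene = bpy.context.scene
    scene.render.engine = 'CYCLES'   # baking needs Cycles
    bake = scene.render.bake
    bake.use_selected_to_active = True
    bake.cage_extrusion = 0.02       # tune to bone size / ray distance

    # With a high-poly bone selected, its low-poly twin active, the
    # atlased UV map active, and an image texture node selected in the
    # low-poly material, this bakes straight onto the atlas:
    bpy.ops.object.bake(type='NORMAL')
except ImportError:
    IN_BLENDER = False  # running outside Blender; nothing to do
```

Done per bone pair (or scripted over all pairs), every bake lands in its own region of the shared atlas, so there is no second-generation rebake to blur the result.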