100 Million Poly Skeleton to Internet -- Workflow suggestions?

I’m working with a university team that is digitizing a large skeleton, and we will be uploading it to Sketchfab when ready. This post is looking for suggestions on how to improve my workflow for the textures / normals, etc.

The scan consists of over 100 bones, all scanned at very high resolution.

The average SINGLE bone consists of around a million polygons, while Sketchfab says it’s good to stick to a million polys TOTAL to be viewable on a mobile device.

One successful workflow I’ve managed is to create a low poly version on a per-bone basis (say 1k), and bake the high poly detail to low poly normals etc. in Substance Painter. When I bring that into Blender it looks really good, and I can then delete/replace the high res bone with the low poly bone with normals.

The challenge, though, is that I’m afraid 100-plus materials will slow down Sketchfab. Is that correct? If my model was under 500k, would having 100 different PBR textures (normal / roughness etc. for each bone) be a problem? I’ve heard of texture atlases, but can one create a texture atlas from pre-baked normal / roughness maps? A post-combine type step?

Given the density of the models, I have to bake the normals on a per-bone basis, but I don’t know how / if I can merge those 100+ texture sets somehow to make them more appropriate for real-time / internet environments.

Imagine you had a scan of the dinosaur in the embedded image. How might you approach the task of making it look good in a real-time environment?

Do you have any suggestions on proper workflow when dealing with huge assemblies like this?


See if Sketchfab supports LOD. If they do, try to figure out how they stream the LODs.


I wouldn’t be surprised, but the best way to find out is to test.

Yup. For roughness, diffuse color, etc., it’s trivial. You almost can’t get it wrong.

For normals, you can still atlas; you just have to be careful that you don’t rotate the islands. Easy enough to do in Blender via Ctrl+A (Average Islands Scale) followed by Pack Islands (Ctrl+P), which has a Rotate checkbox on the operator panel. Multi-object editing makes texture atlasing a breeze; you don’t even have to join your objects if you don’t want to.
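
With 100+ bones, that packing step can also be scripted. A rough, untested sketch (my addition, not from this thread), assuming Blender 2.8+ with all the low poly bones selected; the UV operators may need a UV/Image Editor area focused, or a context override, to run from a script:

```python
import bpy

# Assumption: all low poly bones are selected; UV operators may need a UV Editor context
bpy.ops.object.mode_set(mode='EDIT')          # multi-object edit mode
bpy.ops.mesh.select_all(action='SELECT')      # select all faces on every bone

# In the UV editor: select everything, equalize island scale, then pack
bpy.ops.uv.select_all(action='SELECT')
bpy.ops.uv.average_islands_scale()            # the Ctrl+A step
bpy.ops.uv.pack_islands(rotate=False,         # no rotation: keeps tangent-space normals valid
                        margin=0.002)         # small padding between islands

bpy.ops.object.mode_set(mode='OBJECT')
```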

But of course, you can also rebake your original normals etc. directly to your packed UVs instead. Baking high-to-low directly to packed UVs would probably be better than playing a game of image Telephone (each bake tends to blur the texture slightly; you don’t want to do 50 bakes-of-bakes).


Hey @ArtStudent!

I happen to work at Sketchfab, glad you asked :slight_smile:

We’re actually developing a streaming/LOD solution for our web viewer; you can learn more about it here:

100 million polygons will be no problem, and this could be a great dataset for us to test. Please note that this type of model can only contain color texture maps, no PBR - the model needs to be provided as an OBJ + texture maps. When do you expect to have your model ready?

As for decimating/baking normals: yes, 100 materials is definitely too much. You need to pack them into a smaller number of atlases - most photogrammetry software will do this for you automatically.


Hey thanks for the reply @bandages
Just to clarify: if I’ve already baked the normals, and say I have 100 different normal texture files that I’ve imported one by one and set up in Blender, can I then re-combine them into an atlas? Like one 4K map, or maybe something with the new UDIM support? The challenge with baking everything from high to low res at once is that there is way too much computation / too many polygons to do it all in one go, so I have to bake the 1-million-polygon geo objects separately, see?

@bartv thanks so much for chiming in. This is very helpful info!
The model will likely be ready within the month, but one of the main things it rests on is the workflow, which is really quite new to me.

Can you think of any immediate pros / cons between the new streaming tool with just textures vs. the more traditional PBR workflow? We’d like the best quality possible for the skeleton, while allowing as many people to view it as possible. In that sense I’m looking for the magic-bullet ratio of top quality with maximum lightness!

One of the big challenges we’re facing is the sheer density of the model. Given that each bone is a million polygons, I’m unable to process everything to one albedo / normal map… I have to do them one at a time, meaning I have 100s of maps (hence my question above to @bandages).

I’m working to the guidelines so far of ‘max 1M polys, and as few maps as possible’, but if you have any more specific suggestions I’d be grateful :smiley:

Neither the low nor the high needs to be one object to atlas. It’s probably worth joining and separating sometimes just to simplify some operations, but that’s only to reduce the workload.

Here, consider this: join all your low poly bones into a single object. Add a new UV map. Separate by parts. That’s easier than adding a new UV map to each individual bone. But you could script it instead.
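
For the scripted route, adding the extra UV map to every selected bone is a tiny loop; a minimal sketch (the “Atlas” name is just a placeholder of mine):

```python
import bpy

# Add a second UV map called "Atlas" (placeholder name) to every selected mesh,
# without joining anything
for obj in bpy.context.selected_objects:
    if obj.type == 'MESH' and "Atlas" not in obj.data.uv_layers:
        obj.data.uv_layers.new(name="Atlas")  # initialized as a copy of the active UV map
```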

You can get your atlassed UV with everything separate or with everything joined; it doesn’t matter. If separate, select all your low poly bones, enter edit mode, select all in the UV editor, Ctrl+A (Average Islands Scale), Ctrl+P (Pack Islands), and on the operator panel disable rotation. (For normals, assuming they’re tangent space.)

With your atlassed UV, you can bake your old colors to your new UV, or if you don’t want a recording-of-a-recording, you can bake selected-to-active from the high poly using your new atlassed UV. If you do the latter, you don’t have to care about the rotation of the islands.
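
For the selected-to-active route, here is a hedged Cycles sketch (my own, not tested on this dataset; the “Atlas” UV and image names are placeholders, and it assumes the low poly target is the active object, has a node-based material, and the high poly is selected):

```python
import bpy

# Assumptions: low poly is active with a node-based material, high poly is selected,
# and the packed UV map is called "Atlas" (placeholder name)
scene = bpy.context.scene
low = bpy.context.active_object                  # low poly bake target (active object)
low.data.uv_layers["Atlas"].active = True        # bake onto the packed atlas UVs

# Create the bake target image and make it the active image node in the low's material
img = bpy.data.images.new("skeleton_atlas_normal", 4096, 4096, float_buffer=True)
mat = low.active_material
mat.use_nodes = True
tex_node = mat.node_tree.nodes.new("ShaderNodeTexImage")
tex_node.image = img
mat.node_tree.nodes.active = tex_node            # Cycles bakes into the active image node

scene.render.engine = 'CYCLES'
scene.render.bake.use_selected_to_active = True  # selected high poly -> active low poly
scene.render.bake.cage_extrusion = 0.01          # tweak to the scale of your scans
scene.render.bake.margin = 8                     # pixel padding around islands

bpy.ops.object.bake(type='NORMAL')               # tangent-space by default

img.filepath_raw = "//skeleton_atlas_normal.exr"
img.file_format = 'OPEN_EXR'
img.save()
```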


@bandages thanks for taking the time to describe the workflow. I really appreciate it.
I’m also a bit slow and am not fully getting it yet, as I will need to read it a few times and do some searching before it fully sinks in.

One thing: I understand the idea of selecting all 100 low poly bones, and unwrapping them all to one UV atlas.

Here’s where I could use some clarification – if the way I was baking normals is by [importing the low poly to Substance Painter > pointing to the high res > baking > exporting maps], does that sound compatible with what you’re talking about? i.e. would it just bake to one small section of the large UV map, and I would then copy and paste from each of the normal bakes to the ‘master’ main UV map?

Sorry if I’m a little slow here :slight_smile: What I know how to do is get ONE good bone baked with normal / albedo etc… It’s the combining that I’m struggling with.

Well, it’s not necessarily the workflow, but it’s a workflow.

I have no idea what SP does. If you want to rebake normals, and prefer SP’s bakes to Blender’s bakes, then you might have to look into documentation (or maybe somebody else will happen to know.)

If you don’t want to rebake normals, and are fine with a copy-of-a-copy, then you can use Blender to bake from your old UV to your new UV. (It might be smart to renormalize your normals afterwards.)
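
Renormalizing after a bake-of-a-bake can also be done on the image itself; a small sketch (my addition, with placeholder file names) using Pillow and numpy on an 8-bit tangent-space normal map:

```python
import numpy as np
from PIL import Image

# Load the re-baked normal map (placeholder file name) and unpack [0,1] colors into [-1,1] vectors
img = Image.open("femur_left_normal.png").convert("RGB")
rgb = np.asarray(img).astype(np.float32) / 255.0
vec = rgb * 2.0 - 1.0

# Renormalize each pixel's vector (baking/filtering tends to shorten them slightly)
length = np.linalg.norm(vec, axis=-1, keepdims=True)
vec = vec / np.maximum(length, 1e-6)

# Pack back to [0,1] and save
out = ((vec + 1.0) * 0.5 * 255.0).round().astype(np.uint8)
Image.fromarray(out).save("femur_left_normal_renorm.png")
```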


Of course! I know it wasn’t the only workflow… I meant thanks for suggesting a possible workflow :slight_smile:

Imagine I had a Blender scene with 100 low poly objects, each of them having a material including the color and normal maps that had been baked from the high poly externally.

Would it make sense to re-bake (as you say) all of those normals onto one or two 4K normal / texture atlases? Does that sound like a realistic approach?

Yeah, of course. As long as your atlas is high enough resolution that you don’t lose too much detail.

I wonder what the interest of 1M polys per bone is…
For sure I don’t know your application, but for visualisation I think it’s useless.
Normal maps and height maps can do much more, visually speaking, than a million points.

If this final work needs to be shared with the public, no one will have the platform to show your model, and even if they did… they would close the viewer after 1 hour at 5% loaded ^^

Try to decimate your models and bake them onto lower res meshes. You’ll increase the texture space but will dramatically decrease the vertex buffers.

Low-poly and usability on entry-level platforms is my usual burden :wink: as a small-worlds VR designer.

Aside from this, IMHO Blender is the perfect tool for what you wanna do, but whatever LOD system you use, you’ll have to decimate your meshes.

happy blending :slight_smile:

EDIT: I wanna see this T-Rex move !!! :stuck_out_tongue:

Hey thanks for the reply @pitibonom

I agree 100%, I’m not trying to load 1m poly bones!

I have the high res scan data, and am trying to bake the high res scan normals to a low res mesh (as you say)… The challenge is, I can only do one bone at a time… I can’t bake 50-100 different objects at once, see? (Or I don’t know how.)

Sincerely, I don’t know of any way to auto-bake multiple objects in a row in Blender.
If you take a look at my project (Carcassonne Medieval city), I bake each wall and each tower (for light and dirt maps) one after another, and by hand. :confused:
I get full control, but each bake takes quite a while, to give this final result for the whole city:

The pic is downscaled and highly compressed, and is therefore a crappy JPG…

I think you gotta work on one bone (one of the most complex) till you get what you want, and then do the same for all the other bones…
Maybe you could ask for help in the Python scripting forum. I’m sure that what you wanna do can be automated with a script, but I’m too shitty a coder to help you with this :wink:

Hope it helps :smiley:
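
For what it’s worth, the kind of loop such a script could run is roughly this (an untested sketch, not from this thread; it assumes a “_low”/“_high” naming convention, Cycles, and that each low bone’s material already has an Image Texture node pointing at a shared atlas image and set as the active node):

```python
import bpy

# Assumptions: objects named <bone>_low / <bone>_high; each *_low material has an active
# Image Texture node targeting the shared atlas image
scene = bpy.context.scene
scene.render.engine = 'CYCLES'
scene.render.bake.use_selected_to_active = True
scene.render.bake.use_clear = False               # keep earlier bones' results in the image
scene.render.bake.margin = 8

low_objects = [o for o in scene.objects if o.name.endswith("_low")]

for low in low_objects:
    high = scene.objects.get(low.name.replace("_low", "_high"))
    if high is None:
        continue                                   # no matching high poly, skip
    bpy.ops.object.select_all(action='DESELECT')
    high.select_set(True)
    low.select_set(True)
    bpy.context.view_layer.objects.active = low    # active object is the bake target
    bpy.ops.object.bake(type='NORMAL')             # one bone pair per bake
```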

Gotcha – thanks so much. I didn’t know one could bake from one normal texture to another normal texture… I guess it makes sense like duplicating an image or taking a photo of a photo… just didn’t know. Appreciate it.

@pitibonom wow that’s a really cool image!! What resolution is that? It looks like you’re doing exactly the kind of workflow I need to do. Do you have any documentation of your workflow / process?
What res do you bake the light / dirt maps to, and how do you copy and paste them onto the main map there?

Ah what a really cool project. I’d love to be involved with something like this someday.

First I will describe the ideal (magic and sunshine with unlimited compute power) workflow, and then I will mention modifications and alternatives depending on performance/quality.

Since you’re using Substance Painter for your baking, you can bake multiple separate high poly files down onto a single low poly file. You do not need to have all of your high poly bones in the same file.

  1. What you would do is create your low poly bones, UV unwrap them (don’t worry about packing/organizing the UVs yet, just unwrap them), and put them all in one scene in your 3D program (Blender).
  2. Leave each low poly bone as its own separate mesh and give each bone mesh a unique name with “_low” as a suffix, like skull_low, femur_left_low, femur_right_low, rib_left_01_low, etc… (a rough Blender sketch for this renaming, and the single-file export in step 4, is included after this list).
  3. Pack the UVs of all*** the low poly bones together like a jigsaw puzzle/Tetris into the space for your texture, utilizing all space. Your UV islands will look something like pitibonom’s image, but with bones. (Leave a few pixels of padding between separate islands so that they’re not touching.)
  4. Export all the low poly bones in this scene as one*** fbx file (or maybe obj would work; just as long as the meshes stay separated and retain their unique names).
  5. Now for the high poly bones. For each high poly bone, open it in your 3D program and name the mesh the same as the corresponding low poly bone, using a suffix of “_high” like skull_high, femur_left_high, etc. and export each bone as fbx/obj. NOTE: Each corresponding high poly and low poly bone should have the same size, position, and rotation even though they’re in different files. So the high and low skulls should be in the same position as each other, the high and low left femur are in the same position as each other, etc… (In theory, the setting we’ll set in step 8 will mean that you can leave all objects overlapping at the 0,0,0 origin but in practice you might want to ‘explode’ the objects and move each pair to their own non-overlapping position.)
  6. In Substance Painter create a new project and import the fbx/obj file with all the low resolution bones in it.
  7. Hit the Bake Mesh Maps button like you’ve been doing and add all*** the high poly fbx/obj files into the High Definition meshes section of the baker window.
  8. Change the Match dropdown from “Always” to “By Mesh Name”. Matching by mesh name means that you won’t get bake bleeding from different meshes onto one another. Only the high res skull will affect the low res skull, etc… in theory… In practice it might not always work that way, but this setting should be enabled, regardless.
  9. Set whatever other baking settings you want to use (like resolution, antialiasing, etc), pick whichever maps you want to bake, and hit that bake button. (Note: if you are baking Ambient Occlusion and Thickness these have their own Self Occlusion dropdown that need to be changed to “Only Same Mesh Name” as well.)
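
As mentioned in step 2, the renaming and the single-file export (step 4) can be done with a small Blender script; a hedged sketch of mine, with a placeholder collection name and output path:

```python
import bpy

low_coll = bpy.data.collections["bones_low"]       # hypothetical collection of low poly bones

# Append the "_low" suffix that Painter's "Match By Mesh Name" relies on
for obj in low_coll.objects:
    if not obj.name.endswith("_low"):
        obj.name += "_low"
    obj.data.name = obj.name                        # keep the mesh datablock name in sync

# Export all the low poly bones as one FBX, keeping them as separate named meshes
bpy.ops.object.select_all(action='DESELECT')
for obj in low_coll.objects:
    obj.select_set(True)

bpy.ops.export_scene.fbx(
    filepath="//skeleton_low.fbx",                  # placeholder path, relative to the .blend
    use_selection=True,
    object_types={'MESH'},
)
```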

Voila! Here is the bake result of the above settings with one low poly scene (3 low poly meshes all in the same fbx file) and 3 separate high poly source fbx files (pretend they’re bones instead of primitives :stuck_out_tongue: ). Now you can texture.


(Note: this was baked with the objects exploded and separated instead of all at 0,0,0.)



***Okay so this is the part where your computer maybe melts. I wish I knew exactly how their baker works. IF it only loads one high res mesh at a time, sequentially, to bake its contribution to the texture then I don’t think you need to make any modifications here. In theory if there is no memory leak and if it unloads each high res after it’s done with it then you should be golden.

But if Painter does try to load all the high poly meshes into RAM/GPU at once and your computer can’t handle it then what you need to do is split your low bones into a few files instead of one file.

If, hypothetically, Painter can’t handle 100M polygons but it can handle 50M on your hardware, then you’ll just have to split the work in half and do two bakes instead of one. You only need to slightly change steps 3, 4, 6, and 7.

  • For steps 3 and 4, pack half of the bones (probably grouping them into a collection to stay organized) into one jigsaw utilizing all the space and export them as one low fbx/obj file, then pack the other half into another jigsaw utilizing all the space (and their own collection) and export them as a second fbx/obj file.
  • Then for steps 6 and 7 you’ll have to make two Painter scenes, one for each low file, and only import half of the high res files for each bake (the correct ones that match each low file, of course).
  • So at the end you’ll have, say, two 4k textures, each with half of the bones on it instead of one 4k texture with all the bones on it.

If Painter can’t handle 50M either, keep splitting the workload into more bakes (25M, 12M) as needed until it works. Put 1/4 of the bones into each of 4 low poly files and re-pack the UVs to utilize all the space of each texture, then do 4 bakes with only 1/4 of the high poly files at a time in Painter.

In this way you will end up with a much smaller number of textures and materials that won’t overwhelm Sketchfab than if you had one set of textures/material for each bone. Maybe you can do it in 1 bake, or maybe it will take 10 bakes (even 10 PBR materials would certainly be better than 100); it all depends on what Painter and your hardware can handle.

Any reasonably modern gaming/dev computer should be able to do at least 10 million polygons per bake in a baker like Substance Painter, Marmoset Toolbag, xNormal, MightyBake, Handplane, etc… But if your computer can’t handle more than ~1 million per bake in Painter then your only recourse is to do a transfer bake (keep reading).



“Okay, Chris, that’s cool but I already baked all of my textures for each bone and/or my computer just can’t handle more than ~1M polygons.”
Yeah, the above workflow means you’d have to bake everything again from scratch in Painter but it’s the correct way to do things for a small number of textures for a real-time asset. BUT the other posters are correct that you can transfer the existing bakes onto a new set of UVs and you won’t have to do a bunch of copy/pasting of pixels in an image editor. This will be like taking a photo of a photo but if your original bakes are large enough (like, 2k would probably be fine, 4k even better) there shouldn’t be too much quality loss, I hope.

  1. All you have to do is duplicate your existing low poly bones that you already UV’d and jigsaw puzzle/Tetris pack their UV islands together (preferably without altering their shape or rotation, just their size and location if possible) into a new atlas.
  2. and then do the transfer via whatever tool works (e.g. Blender or… uh, some other tools can do that too, like Maya and 3DS Max. Painter cannot. You might be able to do it with xNormal [see further below].)

Unfortunately I can’t help you here with Blender as I’m brand new to it (my experience is all in Maya) so someone else would have to tell you specific steps for how to transfer your textures from the old UV’d low poly bones to the new packed atlas low bones.

One thing I’d like to know is did you already texture everything in Painter? You mentioned roughness and color data which sounds like maybe you did.



There is another reason why you might split your bones into a few different texture atlases. (This applies regardless of if you’re baking from scratch or re-baking a transfer to new UVs.) More textures will allow each UV island to be a little bit larger on its own texture, which means higher quality. 25 bones on 4 textures instead of 100 bones on 1 texture would mean that each bone gets to have more pixels.



Which brings us to xNormal: https://xnormal.net

The UI looks a little bit silly but don’t be fooled, the program is extremely powerful and is widely used across the games and film VFX industries for baking extremely high polygon counts. And it’s 100% free.

xNormal, like Painter, can do your bakes from scratch. It has a section to load in one or more high poly source meshes and a section for your low poly meshes as well. The steps are essentially the same as what I wrote for Painter (so I won’t go over them here) with the differences being in the Baking Options (such as using the 3 dots next to Normal map to specify if you want it to be Tangent space or not).

Also, like Painter, I wish I knew if it tries to load all the high poly meshes at once or if it loads and unloads them sequentially because the latter would let you bake an absurd amount of polygons. Alas, I have not tested this but it might be per-mesh batching? Unlike Painter and a lot of other bakers it doesn’t display everything in a 3D viewport (although it does have an optional viewer) which means it doesn’t have to use any system resources to display the models; it just spits out textures with math.

It also has a much more powerful set of configuration options than Painter, including the ability to calculate vertex colors from your high poly into a texture for your low poly, several other map types and, importantly, re-calculate and bake existing normal maps (and other textures) from one model to another.

If you wanted to use it to re-bake existing textures (this will still be like taking a photo of a photo) you should be able to:

  1. Duplicate your existing low poly bones that you already UV’d and jigsaw puzzle/Tetris pack their UV islands together (with the power of xNormal it should[?] be okay for you to even change the shape or rotation of the islands) into a new atlas and export them as one fbx/obj file.
  2. In xNormal, load that new low poly file into the Low Definition Meshes section.
  3. In the High Definition Meshes section, load your old individual low poly bones meshes with the old UVs as your ‘high res’ meshes.
  4. Use the “Base Texture to Bake” column to load a texture** onto each old bone.
    4a. If it’s a normal map use the checkbox in the next column to tell xNormal that it’s a normal map.
    4b. (Side note: If you’re baking vertex colors instead of a texture you’d uncheck the Ignore Per-Vertex Color column for your vertex-colored high poly.)
  5. In the Baking Options choose “Bake Base Texture” and uncheck any other type of map to bake.
  6. Now bake.

In theory you should be able to cheat and use your old low poly as your high poly but I haven’t tried this technique. Bake Base Texture is technically intended to bake a texture (or vertex color) from an actual high poly to a low.

A big drawback here is that you can only bake one texture at a time per high poly mesh so you’ll have to do a whole bake pass for each map type (normal, albedo, roughness, etc.) you want to transfer which means individually loading each old texture for each old bone for every pass which would be super tedious.



I’m pretty sure Marmoset Toolbag could also bake your existing textures onto a new atlas with new UVs but it’s not free and this post is already an essay so I’ll hold my tongue despite my love for Toolbag.



Bonus section: MeshLab.
http://www.meshlab.net

Latest releases here:

(They all say beta but they’re fine; I’m running 2019-05-01; they haven’t posted a release binary to meshlab.net since version 2016 for some reason [note: the binaries on GitHub always complain about not being the latest version when you start the program; ignore it].)

MeshLab is another free and powerful utility, designed to work with very heavy meshes like yours.

Before you even bake from your raw scan data you might consider decimating the mesh down a bit. Beyond a certain threshold there is no perceptible quality gain to be had from millions of extra triangles and MeshLab is pretty good at retaining details and volume while only getting rid of triangles that aren’t contributing.

Reducing the triangle count of each raw bone scan can help you with your baking. Depending on how low you can go without losing quality you might be able to load more of your high poly bones per bake and save some baking time too. Even a drop from ~1M to ~800k triangles can really help if you’re doing it across 100+ bones (that would be a savings of 20M triangles).
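
MeshLab’s filters can also be driven from Python via pymeshlab, so that decimation pass over 100+ bones can be batched. A hedged sketch (the folder layout and target face count are my placeholders, and the filter name below is the one used in recent pymeshlab releases; older versions name it slightly differently):

```python
import glob
import pymeshlab

# Placeholder folder of raw bone scans; adjust the pattern to your naming scheme
for path in glob.glob("raw_scans/*_high.obj"):
    ms = pymeshlab.MeshSet()
    ms.load_new_mesh(path)
    # Quadric edge collapse decimation; filter name may differ in older pymeshlab versions
    ms.apply_filter(
        "meshing_decimation_quadric_edge_collapse",
        targetfacenum=800_000,                      # e.g. ~1M -> ~800k triangles per bone
        preservenormal=True,
        qualitythr=0.5,
    )
    ms.save_current_mesh(path.replace("_high.obj", "_high_decimated.obj"))
```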


OMG @ckohl_art OMG OMG
I’m just seeing this post now. I’m so sorry I didn’t answer last week.

This is a SUPER thorough, intensely valuable massive chunk of knowledge you’ve shared!!
I’m going to print it out and read it carefully.

Do you have a Patreon account or anything???

Nope. The knowledge is gratis.


Well thank you Sir.
You’re a gentleman and a scholar!