I’m working with a university team that is digitizing a large skeleton, which we’ll upload to Sketchfab when it’s ready. I’m looking for suggestions on how to improve my workflow around textures, normals, and so on.
The scan consists of over 100 bones, all scanned at very high resolution.
The average SINGLE bone is around a million polygons, while Sketchfab suggests staying around a million polys TOTAL for the model to remain viewable on a mobile device.
One workflow that has worked for me is to create a low-poly version of each bone (say 1k polys) and bake the high-poly detail down to normal maps etc. in Substance Painter. Back in Blender the result looks really good, and I can then delete the high-res bone and replace it with the low-poly bone plus its baked normals.
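To put numbers on it, here is the back-of-envelope budget math I’m working from (my own estimates, using the ~1M mobile guideline and ~100 bones from above):

```python
# Rough poly-budget arithmetic for the skeleton (my own estimates).
total_budget = 1_000_000   # Sketchfab's suggested mobile-friendly total
bone_count = 100           # roughly 100 bones in the scan

# Average budget available per bone if I spent the whole allowance.
per_bone_budget = total_budget // bone_count
print(per_bone_budget)     # 10000 polys per bone on average

# At my current 1k-per-bone decimation target, the whole skeleton is:
decimated_total = 1_000 * bone_count
print(decimated_total)     # 100000 polys, well under the 1M guideline
```

So the 1k-per-bone target leaves plenty of headroom; I could even afford ~10k per bone and still hit the total budget.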
The challenge, though, is that I’m afraid 100-plus materials will slow Sketchfab down. Is that correct? If my model were under 500k polys, would having 100 different PBR texture sets (normal, roughness, etc. for each bone) be a problem? I’ve heard of texture atlases, but can one build an atlas from pre-baked normal / roughness maps, as a post-bake combine step?
Given the density of the models, I have to bake the normals on a per-bone basis, but I don’t know how (or whether) I can merge those 100+ texture sets into something more appropriate for real-time / internet delivery.
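For what it’s worth, the pixel side of the post-combine step I have in mind could be done offline: tile each bone’s baked map into one big atlas and record each bone’s tile so its UVs can be scaled/offset into the shared layout (the UV remap itself would still happen in Blender). A rough sketch, assuming each baked map has been loaded as an equal-size square numpy array; the function name and grid size are my own invention:

```python
import numpy as np

def pack_atlas(maps, cols=10):
    """Tile per-bone texture maps (equal-size square arrays) into one atlas.

    Returns the atlas plus, for each map, the (u_off, v_off, u_scale, v_scale)
    needed to remap that bone's UVs into the shared layout, i.e.
    new_uv = old_uv * scale + offset (with V measured from the top row here).
    """
    tile = maps[0].shape[0]                      # tile size in pixels
    rows = -(-len(maps) // cols)                 # ceil division
    atlas = np.zeros((rows * tile, cols * tile, maps[0].shape[2]),
                     dtype=maps[0].dtype)
    transforms = []
    for i, m in enumerate(maps):
        col, row = i % cols, i // cols
        atlas[row * tile:(row + 1) * tile, col * tile:(col + 1) * tile] = m
        transforms.append((col / cols, row / rows, 1 / cols, 1 / rows))
    return atlas, transforms
```

With 100 bones at 512 px per tile this gives a 5120×5120 atlas, one atlas per map type (normal, roughness, …). One caveat I’m aware of: re-tiling normal maps like this is only safe for tangent-space maps, and only if each bone’s tangent basis survives the merge.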
Imagine you had a scan like the dinosaur in the embedded image. How would you approach making it look good in a real-time environment?
Do you have any suggestions on proper workflow when dealing with huge assemblies like this?