One way to implement blendshapes would be to create a separate copy of the mesh for each blendshape and have the GPU interpolate between those copies. Does anybody know if that's how Blender implements them? Based on how little memory gets allocated when I add a new blendshape while testing this out, though, I'd guess Blender uses a more optimized approach.
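Just to make the idea concrete, here's a minimal sketch of that naive approach in plain Python (standing in for what the GPU would do per vertex): one full vertex array per shape, and linear interpolation between the base mesh and the target copy. The function and data names are illustrative, not from any real engine.

```python
# Naive "full mesh copy per blendshape" sketch: every shape stores
# positions for ALL vertices, and blending is a per-vertex lerp.

def interpolate_mesh(base, target, t):
    """Lerp every vertex: result = (1 - t) * base + t * target."""
    return [
        tuple((1 - t) * b + t * s for b, s in zip(bv, tv))
        for bv, tv in zip(base, target)
    ]

# Tiny triangle mesh: base pose plus a full copy for the target shape.
base  = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
smile = [(0.0, 0.0, 0.0), (1.0, 0.5, 0.0), (0.0, 1.0, 0.5)]

halfway = interpolate_mesh(base, smile, 0.5)
print(halfway)  # every vertex sits midway between the two meshes
```

The memory cost is obvious here: each shape duplicates the whole vertex list even if it only moves a handful of vertices, which matches your observation that a smarter scheme should allocate much less.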
There are two types of blend shapes: relative and absolute. A relative shape stores only the vertices whose positions differ from the basis mesh, so it's more memory-efficient and also better for blending multiple shapes together. An absolute shape stores all the vertices, so blending multiple shapes can cancel each other out.
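A rough sketch of the relative scheme, assuming each shape is stored as a sparse map of vertex index to position delta (names and layout are mine, not from Blender or any specific tool): active shapes just add their weighted deltas onto the basis, so two shapes that move different vertices combine cleanly instead of overwriting each other.

```python
# Relative blend shapes sketch: each shape stores only deltas for the
# vertices it actually moves; blending sums weighted deltas onto the basis.

def apply_relative_shapes(basis, shapes, weights):
    """basis: list of (x, y, z); shapes: {name: {vertex_index: (dx, dy, dz)}}."""
    result = [list(v) for v in basis]
    for name, deltas in shapes.items():
        w = weights.get(name, 0.0)
        if w == 0.0:
            continue  # inactive shapes cost nothing
        for i, (dx, dy, dz) in deltas.items():
            result[i][0] += w * dx
            result[i][1] += w * dy
            result[i][2] += w * dz
    return [tuple(v) for v in result]

basis = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
shapes = {
    "smile": {1: (0.0, 0.5, 0.0)},   # moves only vertex 1
    "blink": {2: (0.0, -0.2, 0.0)},  # moves only vertex 2
}
out = apply_relative_shapes(basis, shapes, {"smile": 1.0, "blink": 0.5})
print(out)  # the two shapes' deltas add independently
```

With absolute shapes you'd instead blend full position arrays, so mixing "smile" and "blink" would average vertex positions they both store, which is where the cancelling-out behaviour comes from.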
Yeah, Blender does not use actual separate meshes that way. I'm not sure whether Maya still does it like that, but it used to. Blender just stores vertex position data per shape key; no other meshes are used or generated, to my knowledge.