I was recently thinking of an idea where you would UV-map an object, then paint it and see real-time displacement via a tessellation shader (like the existing ability to see bump maps directly as they are painted, but more sophisticated). I’m also thinking that in the future, if we get support for vector displacement, you could go into Sculpt Mode and have such a texture created as you sculpt. That would be a modern method which avoids the drawbacks of multires sculpting, not the least of which is the limited ability to edit the mesh afterward.
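To make the distinction concrete, here is a small illustrative sketch (plain Python, not Blender code, function names are mine): a scalar height map can only push a vertex in or out along its normal, while a vector-displacement map stores a full 3D offset, which is what lets sculpted detail like overhangs survive.

```python
# Illustrative only: contrast scalar displacement (a height along the
# normal, what a bump/height map stores) with vector displacement
# (an arbitrary 3D offset, what sculpted detail needs).

def displace_scalar(vertex, normal, height):
    """Offset a vertex along its normal by a painted height value."""
    return tuple(v + n * height for v, n in zip(vertex, normal))

def displace_vector(vertex, offset):
    """Offset a vertex by a full 3D vector from a vector-displacement map."""
    return tuple(v + o for v, o in zip(vertex, offset))

# A scalar map can only move the point along the normal...
print(displace_scalar((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), 0.5))  # (0.0, 0.0, 0.5)
# ...while a vector map can also shift it sideways.
print(displace_vector((0.0, 0.0, 0.0), (0.2, -0.1, 0.5)))      # (0.2, -0.1, 0.5)
```

In the painting workflow described above, a tessellation shader would evaluate something like `displace_scalar` per generated vertex on the GPU, sampling the painted texture for the height value.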
Perhaps (this is just wild thinking) we could avoid needing a UV map first, because the texture would be generated in the Ptex format.
I’ve talked to Campbell and Antony about this in the past. Both agreed that it’s great to have (as anyone who has used ZBrush knows), but it’s not a trivial change to make. Sculpting and painting use two completely different mesh representations and storage schemes, as well as different acceleration structures. Getting this off the ground with a good design would require a major refactor of one or the other.
Probably easier to merge sculpt with vertex paint than with UV textures, no? You can always bake the vertex paint to a texture after the fact. I don’t know if it makes merging easier code-wise, but the PBVH GSoC project seems to get Blender closer to ZBrush’s Polypaint.