How hard would it be to implement Ptex in Blender?

So, I am curious. Ptex was dropped by Blender some years back. If the devs were to implement it for Cycles and Eevee, would it be difficult to do?

There have been a lot of improvements to Ptex over the years. I think it supports triangles now, and some tools have been developed for it over time:

ptxtransfer: designed to transfer texel data when a model has changed. The change may include mesh topology, but the position and shape of the old and new versions should align roughly. The executable attempts to copy face textures for a single Ptex file.

ptxview: for viewing the contents of a single Ptex file. If the file contains geometry metadata, the textures can be visualized in 3D; otherwise a 2D layout of the faces is shown.

ptxconvert: can be used to convert image data to or from the Ptex format, including environment cube maps.

You can also do flat painting in ptex now.

I feel like Ptex is viable now that high poly counts are more mainstream, even in games (with Nanite, for example), and renderers are faster than they were a couple of years back. Here is an earlier thread I created some years back: Ptex and Blender?? - #14 by pitiwazou

Ptex seems to have addressed some of those issues. We have UDIMs now in Blender, and vertex painting is even faster now as color attributes. Maybe Ptex could be a nice addition, with Cycles X support for it, especially with the realistic film project the Blender Foundation is working on?

Ptex just doesn’t seem widely supported; I’ve never heard of anybody actually using it outside of playing around with it either. It seems like one of these in-house solutions developed for a very specific way of working that only got some attention because Pixar was behind it and because 3D texturing wasn’t that big at the time of Ptex’s introduction.

Being able to tweak a texture, filter it, view it with ease, use it as a base for variations, or make it fit another mesh merely by adjusting UV coordinates - I guess that’s all out the window with Ptex, and you have to convert or reproject using those utilities you linked - or start over?


I do not think that the problem is the difficulty of developing Ptex support itself. I think the main issue is that it is not as performant and efficient as texturing a UV-unwrapped project.

My impression is that the future lies in another direction: better automatic UV unwrapping. If Blender improved its Smart UV Project and its packing algorithm, something like Ptex would become mostly superfluous. For example, Substance Painter recently introduced a new automatic unwrapping algorithm, and from what I’ve seen it looks really interesting.

So, yeah, I think Ptex’s problem is not difficulty of implementation, but simply usefulness and relevancy.

Ptex is like a better version of vertex colors. Your texture resolution is not tied to polygon or vertex count, and there are no seam issues. All you need to do is allocate texture sizes to the faces and paint. What I particularly like is the absence of seam issues. It also stores files better: with UDIMs, your texture files and their size grow with the assets you use in a project, while Ptex needs just one file for storing all that data per asset, if I am correct.
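To make the per-face idea concrete, here is a minimal sketch in Python. This is not the real Ptex API; all names here are made up for illustration. The point is just that each face gets its own texel grid at an independent resolution, and everything lives in one object per asset.

```python
# Minimal sketch of per-face texture storage, in the spirit of Ptex.
# NOT the real Ptex API; only illustrates that each face carries its
# own independently sized texel grid inside a single per-asset store.

class PerFaceTexture:
    def __init__(self):
        self.faces = {}  # face_id -> (width, height, texel list)

    def allocate(self, face_id, width, height):
        # One flat RGB texel buffer per face; resolutions differ per face.
        self.faces[face_id] = (width, height, [(0, 0, 0)] * (width * height))

    def paint(self, face_id, u, v, color):
        # u, v in [0, 1) are local to the face, so painting never
        # crosses a UV seam between faces.
        w, h, texels = self.faces[face_id]
        x = min(int(u * w), w - 1)
        y = min(int(v * h), h - 1)
        texels[y * w + x] = color

tex = PerFaceTexture()
tex.allocate(0, 4, 4)    # a small face
tex.allocate(1, 64, 64)  # a face that needs more detail
tex.paint(0, 0.5, 0.5, (255, 0, 0))
```

One store per asset mirrors the "one file per asset" point above, and the per-face (width, height) pair mirrors Ptex letting every face have its own resolution.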

If I recall, in the previous Blender Ptex build, which is no longer available, you had a Ptex node in the shading editor: https://www.youtube.com/watch?v=zFRkwZbYNk8

You should be able to use filter nodes like Hue/Saturation, Color Ramp, and Brightness/Contrast on the Ptex node. If you need to create variations, paint a base Ptex mesh, then duplicate the mesh and paint the variations? I do understand some people prefer 2D painting or editing in Photoshop, but at the end of the day it comes down to personal preference.

“There are several challenges that a Ptex pipeline can throw up, but Disney Animation believes they have addressed these with their in-house authoring tool. Namely, autosize (computing the resolution of each per-face texture optimally using a target texels-per-unit in object space), transferring between models (location-based, texel-to-texel, or face-to-face), copy and paste/mirror and flat painting, something that traditional UV texturing clearly provides inherently.”
Source: https://www.fxguide.com/fxfeatured/ptex_the_other_side/

Seems you can now copy and paste Ptex textures, or mirror them, too.
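The "autosize" step from that quote (computing each per-face resolution from a target texels-per-unit in object space) can be sketched roughly like this. This is my own illustration, not Disney's tool; I'm assuming resolutions get rounded up to powers of two, since Ptex stores per-face resolutions as power-of-two log2 values.

```python
import math

def autosize(face_u_len, face_v_len, texels_per_unit):
    # Pick a per-face resolution from a world-space texel density target.
    # Ptex stores per-face resolutions as power-of-two (ulog2, vlog2)
    # pairs, so each axis is rounded up to the nearest power of two.
    def pow2_up(n):
        return 1 << max(0, math.ceil(math.log2(max(n, 1))))
    return (pow2_up(face_u_len * texels_per_unit),
            pow2_up(face_v_len * texels_per_unit))

# A 0.3 x 0.1 unit face at 100 texels/unit gets a 32 x 16 face texture.
print(autosize(0.3, 0.1, 100))  # prints (32, 16)
```

Because the density target is in object space, big faces automatically get big textures and small faces get small ones, which is the setup work UV texel-density tools make you do by hand.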

Why not stay with UVs?
For Disney Animation’s Brent Burley there are at least four issues with UV that would drive a TD or artist to make the switch (assuming the authoring tool removed many of the other issues):
UVs are a hassle or bottleneck, oftentimes requiring specialist stitching/seaming solutions
UV filter seams, especially with displacements can be a problem
UV distortion
IO difficulty with tiled UVs: separate file per tile = 100x number of textures
Given the nature of seams and UV tiles, Ptex can provide a much stronger filtering environment. Ptex can provide seamless filtering which alone can be a major help to texture artists. Burley also points to the lack of setup time and effort with Ptex, avoiding UV’s wasted space (empty texture areas), and arbitrary number of textures per file as just a few of the issues. Ptex also tends to be implemented in a way that is much more memory efficient not only for storage but also loading – with:
Constant faces optimization (disk, memory, filter)
Fine control over resolution, can differ per texture layer
File format optimized for rendering, reducing file counts, minimizing seeks. e.g. all the textures for a given mipmap level are stored together
No inherent per-face resolution limit; large face textures are tiled as needed.

On that last point, the Disney team currently goes up to 8k x 8k in their authoring tool, and is currently testing 16k.
Source: https://www.fxguide.com/fxfeatured/ptex_the_other_side/
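The "all the textures for a given mipmap level are stored together" point describes a disk layout choice, which can be sketched in Python like this. This is illustrative only, not the actual Ptex file format: write level 0 for every face, then level 1 for every face, and so on, so a renderer streaming one mip level reads one contiguous region instead of seeking per face.

```python
def layout_by_mip_level(face_mip_chains):
    # face_mip_chains: per-face lists of mip-level blobs, finest first.
    # Returns the write order as (level, face_id) pairs, grouping all
    # faces' data for each mip level together to minimize seeks when a
    # renderer streams a single level of detail.
    order = []
    max_levels = max(len(chain) for chain in face_mip_chains)
    for level in range(max_levels):
        for face_id, chain in enumerate(face_mip_chains):
            if level < len(chain):
                order.append((level, face_id))
    return order

faces = [[b"f0_l0", b"f0_l1"],  # two mip levels
         [b"f1_l0"],            # near-constant face, one level
         [b"f2_l0", b"f2_l1"]]
# All level-0 blocks come first, then all level-1 blocks.
```

Grouping by level is what keeps file counts and seek counts low even with thousands of per-face textures in one file.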

I personally handpaint my stuff most of the time, so maybe that’s why I prefer it. If you use photo textures, you would probably prefer UVs.
Pixar’s renderer RenderMan handles Ptex differently compared to other renderers like Arnold, for example:

According to Burley, “PRMan uses shading grids which access textures coherently with minimal contention between threads so a shared Ptex cache works fine; we’ve used a shared cache for all of our films up to Frozen. With ray-traced GI however, texture access can become incoherent and the shared cache may not perform well. In this case, a separate Ptex cache per thread can be used.” PRMan, for example, did not initially have a separate cache per thread, but for many years now it has had this important improvement. “When Ptex is used with a separate cache per thread,” explains Burley, “there is no multi-threading slowdown at all due to Ptex, just increased memory usage.”

So I think Cycles X needs a separate Ptex cache per thread for Ptex to run efficiently. I am not sure if that would be easy or hard to implement for Cycles X.
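The per-thread cache Burley describes is a generic concurrency pattern. Here is a minimal sketch using Python's thread-local storage; this is not actual Ptex or Cycles code, and the dict just stands in for a per-thread PtexCache.

```python
import threading

_local = threading.local()

def get_thread_cache():
    # Each render thread lazily creates its own cache, so lookups never
    # contend on a shared lock, at the cost of duplicating entries
    # across threads (the memory/speed trade-off Burley mentions).
    if not hasattr(_local, "cache"):
        _local.cache = {}  # stands in for a per-thread PtexCache
    return _local.cache

def lookup_face(face_id, load):
    cache = get_thread_cache()
    if face_id not in cache:
        # Incoherent (ray-traced GI style) access only touches this
        # thread's private cache, so there is no cross-thread slowdown.
        cache[face_id] = load(face_id)
    return cache[face_id]

caches = []
def worker():
    lookup_face(0, lambda f: f"texels-for-face-{f}")
    caches.append(get_thread_cache())

threads = [threading.Thread(target=worker) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# Each worker thread ends up with its own distinct cache object.
```

With a shared cache you would instead need a lock (or lock-free structure) around every lookup, which is exactly where incoherent ray-traced access causes contention.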

Sorry for the long text.


Also, with UDIMs, won’t the multiple texture files per asset, due to their size and number, take time to load and increase render time? With Ptex you get one file per asset, and with a separate cache per thread, rendering would be faster?

I may be wrong though.

Pretty sure that’s not correct. Ptex came from Disney Animation Studios, not Pixar.


When I used Mudbox about 10 years ago, it was very useful in some ways, mostly because you had something like an “infinite resolution” texture where the more you zoomed in, the more detail you could squeeze out when needed.

The most reasonable workflow in all of this is to get a quick start at the very first stage of creation, when you have no clue how to manage your resolutions effectively or don’t yet know what you want to do. Instead, you just lay down all your elements and focus on creating. Then, at a later time, you figure out the details (such as resolution, proper view distance, etc.), sort things out, and optimize the assets.

However, after Mudbox I didn’t see this tech again, not even for fun; it was like the ghost code of GitHub or something that nobody liked. :slight_smile:

In Mudbox, you could import a UV-unwrapped model and paint it, but during export you had to make a hard choice and decide on a fixed size. But say, for example, the whole point of everything is to forcefully export to 4096: it kinda looks like you beat the final boss of Ptex, and the entire meaning of the technology (adaptive resolution) becomes obsolete. P.S. Not that 4096 is the best or worst resolution; it’s only mentioned as an example.


There’s a lot of technology shared between the two. But it was originally Disney.

The main problem with Ptex is that I don’t know of it running on the GPU.


My point exactly. I see no reason why Ptex couldn’t have been used as the storage or raw format for textures, the same way a high poly sculpt is used as the raw format/storage of surface details for displacement, normals, and vertex colors.

It should have been adopted as some sort of PSD-like file format which eliminates UVs; you could then export textures at a chosen resolution or bake them to UVs.
I think part of the problem was that Disney should have released those Ptex tools I posted above when Ptex was first introduced. The lack of those tools made people not find the tech useful at its initial release. Also, hardware was very expensive back then (it still is), and we couldn’t use poly counts as high as we do now.