So, I am curious. Ptex got dropped by Blender some years back. I am wondering: if the devs were to implement it for Cycles and Eevee, would it be difficult to do?
There have been a lot of improvements to Ptex over the years. I think it supports tris now, and there are some tools that have been developed for it over time:
ptxtransfer - is designed to transfer texel data when a model has changed. The change may include mesh topology, but the position and shape of the old and new versions should align roughly. The executable attempts to copy face textures for a single Ptex file.
ptxview - for viewing the contents of a single Ptex file. If the file contains geometry metadata, the textures can be visualized in 3D; otherwise a 2D layout of the faces will be shown.
ptxconvert - can be used to convert image data to or from the Ptex format, including environment cube maps.
You can also do flat painting in Ptex now.
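For anyone curious what these tools actually work with, here is a minimal sketch using the open-source Ptex C++ library that opens a file and dumps the per-face resolutions, which is roughly the information a viewer like ptxview visualizes. The filename is just a placeholder, and this is only a reading example, nothing Blender-specific:

```cpp
// Minimal sketch with the open-source Ptex C++ API (wdas/ptex).
// "asset.ptx" is a placeholder filename for this example.
#include <Ptexture.h>
#include <cstdio>

int main()
{
    Ptex::String error;
    PtexTexture* tx = PtexTexture::open("asset.ptx", error);
    if (!tx) {
        std::printf("open failed: %s\n", error.c_str());
        return 1;
    }

    // Every face of the mesh carries its own small texture with its own resolution.
    for (int i = 0; i < tx->numFaces(); ++i) {
        const Ptex::FaceInfo& f = tx->getFaceInfo(i);
        std::printf("face %d: %d x %d texels\n", i, f.res.u(), f.res.v());
    }

    tx->release();
    return 0;
}
```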
I feel like Ptex is viable now that high poly counts are more mainstream, even in games (with Nanite, for example), and renderers are faster than they were a couple of years back. Here is a former thread I created some years back: Ptex and Blender?? - #14 by pitiwazou
Ptex seems to have addressed some of those issues. We have UDIMs now in Blender, and vertex paint is even faster now as color attributes. Maybe Ptex could be a nice addition, with Cycles X support for it, especially with the realistic film project the Blender Foundation is working on?
Ptex just doesn't seem widely supported; I've never heard of anybody actually using it outside of playing around with it either. It seems like one of these in-house solutions developed for a very specific way of working that only got some attention because Pixar was behind it and because 3D texturing wasn't that big at the time of Ptex's introduction.
Being able to tweak a texture, filter it, view it with ease, use it as a base for variations, make it fit another mesh merely by adjusting UV coordinates - I guess that's all out the window with Ptex, and you have to convert or reproject using these utilities you linked, or start over?
I do not think that the problem is the difficulty of developing Ptex support itself. I think the main issue is that it is not as performant and efficient as texturing a UV unwrapped project.
My impression is that the future goes in another direction: better automatic UV unwrapping. If Blender improved its Smart UV Unwrap and its packing algorithm, something like Ptex would become mostly superfluous. For example, Substance Painter recently introduced a new automatic unwrapping algorithm, and it seems really interesting from what I've seen.
So, yeah, I think Ptex is not a question of implementation difficulty, but simply of usefulness and relevance.
Ptex is a better version of vertex colors. You are not confined by polycount the way you are with vertices, and there are no seam issues. All you need to do is allocate texture sizes to the faces and paint. What I particularly like is the lack of seam issues. It also stores files better: with UDIMs, your texture files and their sizes grow with the assets you use in a project, while Ptex requires just one file for storing all that data per asset, if I am correct.
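To make the "allocate texture sizes to the faces and paint" part concrete, here is a hedged sketch with the Ptex C++ writer API: one output file per asset, with a resolution chosen per face. The face count, the fixed 32x32 resolution, and the filename are made-up example values:

```cpp
// Sketch: writing one Ptex file per asset, with a texture allocated per face.
// Face count, resolutions and the output name are illustrative only.
#include <Ptexture.h>
#include <vector>

int main()
{
    const int nFaces = 100;                // hypothetical quad count for the asset
    Ptex::String error;
    PtexWriter* w = PtexWriter::open("asset.ptx", Ptex::mt_quad, Ptex::dt_uint8,
                                     3 /*channels (RGB)*/, -1 /*no alpha channel*/,
                                     nFaces, error);
    if (!w) return 1;

    for (int face = 0; face < nFaces; ++face) {
        // Each face gets its own resolution; Res stores log2 sizes, so (5,5) = 32x32 texels.
        Ptex::Res res(5, 5);
        Ptex::FaceInfo info(res);          // adjacency can also be stored for seamless filtering
        std::vector<unsigned char> texels(res.u() * res.v() * 3, 128); // flat grey "paint"
        w->writeFace(face, info, texels.data());
    }

    if (!w->close(error)) return 1;        // everything ends up in a single .ptx file
    w->release();
    return 0;
}
```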
You should be able to use filter nodes like Hue/Saturation, Color Ramp, and Brightness/Contrast on a Ptex node. If you need to create variations, have a base Ptex-painted mesh, then duplicate the mesh and paint the variations? I do understand some people prefer 2D painting or editing in Photoshop, but at the end of the day it comes back to personal preference.
"There are several challenges that a Ptex pipeline can throw up, but Disney Animation believes they have addressed these with their in-house authoring tool. Namely, autosize (computing the resolution of each per-face texture optimally using a target texels-per-unit in object space), transferring between models (location-based, texel-to-texel, or face-to-face), copy and paste/mirror and flat painting, something that traditional UV texturing clearly provides inherently."
Source: https://www.fxguide.com/fxfeatured/ptex_the_other_side/
Seems you can now copy and paste Ptex textures, or mirror them, too.
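As a rough illustration of the "autosize" idea from that quote (computing each face's resolution from a target texels-per-unit density in object space), here is a small sketch. The density value, the equivalent-square heuristic, and the clamp range are my own assumptions for the example, not Disney's actual tool:

```cpp
// Sketch of an "autosize"-style heuristic: pick a per-face resolution from
// the face's object-space area and a target texel density.
// texelsPerUnit and the clamp range are arbitrary example values.
#include <algorithm>
#include <cmath>

int autosizeLog2(double faceArea /* object-space area of the face */,
                 double texelsPerUnit = 64.0)
{
    // Edge length of an "equivalent square" face, times the desired density.
    double edgeTexels = std::sqrt(faceArea) * texelsPerUnit;
    int log2res = (int)std::ceil(std::log2(std::max(edgeTexels, 1.0)));
    // Ptex stores per-face resolutions as log2 values; clamp to a sane range.
    return std::clamp(log2res, 0, 12);     // 1x1 up to 4096x4096 texels per face
}
```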
Why not stay with UVs? For Disney Animation's Brent Burley there are at least four issues with UV that would drive a TD or artist to make the switch (assuming the authoring tool removed many of the other issues):
- UVs are a hassle or bottleneck, oftentimes requiring specialist stitching/seaming solutions
- UV filter seams, especially with displacements, can be a problem
- UV distortion
- IO difficulty with tiled UVs: separate file per tile = 100x number of textures
Given the nature of seams and UV tiles, Ptex can provide a much stronger filtering environment. Ptex can provide seamless filtering, which alone can be a major help to texture artists. Burley also points to the lack of setup time and effort with Ptex, avoiding UV's wasted space (empty texture areas), and an arbitrary number of textures per file as just a few of the issues. Ptex also tends to be implemented in a way that is much more memory efficient, not only for storage but also loading, with:
- Constant faces optimization (disk, memory, filter)
- Fine control over resolution, can differ per texture layer
- File format optimized for rendering, reducing file counts, minimizing seeks, e.g. all the textures for a given mipmap level are stored together
- No inherent per-face resolution limit; large face textures are tiled as needed
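The "seamless filtering" point is the part that is easiest to show at the library level: because each face stores its adjacency, a filtered lookup near a face edge blends across into the neighbouring face, so there are no UV-style seams. A hedged sketch (the face id, UV coordinates and filter footprint are placeholder values):

```cpp
// Sketch: filtered Ptex lookup. Because each face stores its adjacency,
// samples near face edges blend into neighbouring faces (no UV seams).
// The face id, (u, v) and filter footprint below are placeholder values.
#include <Ptexture.h>

void sampleExample(PtexTexture* tx)
{
    PtexFilter::Options opts(PtexFilter::f_bilinear);
    PtexFilter* filter = PtexFilter::getFilter(tx, opts);

    float rgb[3];
    int faceid = 0;
    float u = 0.99f, v = 0.5f;     // right next to a face edge
    float uw = 0.05f, vw = 0.05f;  // filter footprint
    filter->eval(rgb, 0 /*first channel*/, 3 /*num channels*/,
                 faceid, u, v, uw, 0.0f, 0.0f, vw);

    filter->release();
}
```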
I personally hand-paint my stuff most of the time, so maybe that's why I prefer it. If you use photo textures, you would probably prefer UVs.
Pixar's RenderMan handles Ptex differently compared to other renderers like Arnold, for example:
According to Burley, "PRMan uses shading grids which access textures coherently with minimal contention between threads so a shared Ptex cache works fine; we've used a shared cache for all of our films up to Frozen. With ray-traced GI however, texture access can become incoherent and the shared cache may not perform well. In this case, a separate Ptex cache per thread can be used." PRMan, for example, did not initially have a separate cache per thread, but for many years now it has had this important improvement. "When Ptex is used with a separate cache per thread," explains Burley, "there is no multi-threading slowdown at all due to Ptex, just increased memory usage."
So I think Cycles X would need a separate Ptex cache per thread for Ptex to run efficiently. I am not sure if that will be easy or hard to implement for Cycles X.
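As a rough idea of what "a separate Ptex cache per thread" can look like at the library level (not how Cycles X would actually wire it up), one option is a thread_local PtexCache per render thread. The cache limits below are made-up numbers:

```cpp
// Sketch: one Ptex cache per render thread, so threads never contend on a
// shared cache. The cache limits are arbitrary example values.
#include <Ptexture.h>

PtexTexture* lookupPerThread(const char* path)
{
    // Each thread lazily creates its own cache the first time it touches Ptex.
    thread_local PtexCache* cache =
        PtexCache::create(100 /*max open files*/, 256 * 1024 * 1024 /*max bytes*/);

    Ptex::String error;
    PtexTexture* tx = cache->get(path, error);   // may be null on failure
    return tx;                                   // caller releases with tx->release()
}
```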
Also, with UDIMs, won't the multiple texture files per asset (due to their size and number) take time to load and increase render time? With Ptex you get one file per asset, and with a separate cache per thread, rendering could be faster?
When I used Mudbox about 10 years ago, it was quite useful in some respects, mostly because you had something like an "infinite resolution" texture, where the more you zoomed in, the more detail you could squeeze out when needed.
The most reasonable workflow in all of this is to get a quick kickstart at the very first stage of creation, when you have no clue how to manage your resolutions effectively or you don't know what you want to do yet. Instead you just lay down all your elements and focus on creating. Then, at a later time, you figure out the details (such as resolution, proper viewing distance, etc.), sort things out, and optimize the assets.
However, after Mudbox I didn't see this tech again, not even for fun; it was like the ghost code of GitHub or something that nobody liked.
In Mudbox, you could import a UV-unwrapped model and paint it, but during export you had to make a hard choice and decide on a fixed size. But say, for example, the whole point in the end is to forcefully export to 4096: it kind of feels like you beat the final boss of Ptex, and the entire meaning of the technology (adaptive resolution) becomes obsolete. P.S. Not that 4096 is the best or worst resolution; it is only mentioned as an example.
My point exactly. I see no reason why Ptex couldn't have been used as the storage or raw format for textures, the same way high-poly sculpting is used as the raw format/storage of surface details for displacement, normals, and vertex colors.
It should have been adopted as some sort of PSD-like file format that eliminates UVs, from which you can then export textures at a chosen resolution or bake them to UVs.
I think part of the problem was that Disney should have released those Ptex tools I posted above when Ptex was first introduced. I think the lack of those tools made people not find the tech useful at its initial release. Also, hardware was very expensive back then (it still is), and we couldn't use as much high-poly geometry as we do now.
I'm guessing here since I have no experience with all this, but from what I understand:
Ptex is not good for game engines.
It's overkill for architecture work. Mostly boxes?
It's overkill for motion graphics.
It's overkill for still pics.
So it's only useful in movies and VFX, which is a very small sector. This may be why there is no pressure to add it?
The most direct equivalent of Ptex, you would assume, is the MegaTexture technology as used in id Software's game Rage, and many others.
Perhaps you would say that Ptex as Ptex was not popular, but the general design behind the code is the same concept. http://silverspaceship.com/src/svt/
In the same way, Blender can use a "texture" composed of multiple tiles of sub-textures, following the same concept.
Now that I think of it, the actual point in all of this is that Blender forces 1000% manual management of UDIM textures, which automatically translates to more time spent and more effort from the user.
What actually made Ptex awesome is that it would hide and abstract all of the texture management away from the user.