UV projection mapping

I’m not sure what to call this. Rhino is actually the only program I’ve seen that has this:
http://docs.mcneel.com/rhino/5/help/en-us/index.htm#properties/texturemapping.htm#ApplyCustomMapping%3FTocPath%3DCommands|Alphabetical|A|_____15

3ds Max has the reverse: you model something super detailed, then model a simplified version of it, and the simplified object looks to the detailed one to pick up the detail. The simplified object has to be unwrapped properly.

What the Rhino command is doing, and it works across multiple NURBS surfaces, which is super awesome, is projecting onto an object that's been unwrapped. It then takes those UVs and applies them back to the original object.

It’s basically the same as camera mapping, only it goes from one object to another. You can have one object that's crazy detailed, then model a simplified version and UV unwrap that, then tell the complicated model to project along its normals to find its UV positions. It’s pretty genius. As with camera mapping, your complicated object needs pretty tight, even tessellation or you get some odd results. But this isn’t for game models; these meshes are crazy tight, and it would be terrible to try to UV unwrap them by hand.

I don’t know what they might be calling it, or whether it exists at all in Blender.
Anyone have an idea? It’s a really powerful tool, and it’s useful both for going simple to complex and complex to simple. You can also project multiple objects onto a single object, which is also useful. Mapping this way also helps keep your UV islands sized appropriately, which is something I’ve noticed automatic mapping methods usually fall a little short on.
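To make the idea concrete for Blender users, here is a rough Python sketch of what this kind of projection boils down to: cast a ray from each vertex of the dense mesh along its normal onto the unwrapped proxy, then interpolate the proxy's UVs at the hit point. The object names, shared transforms, and ray directions are assumptions for illustration only; the Data Transfer modifier linked below does the real work.

```python
# Conceptual sketch only: project a dense mesh onto an unwrapped proxy along
# vertex normals and copy the interpolated UVs back. Object names ("Dense",
# "Proxy"), matching object transforms, and a modifier-free proxy (so face
# indices line up) are assumptions.
import bpy
from mathutils import Vector
from mathutils.bvhtree import BVHTree
from mathutils.interpolate import poly_3d_calc

depsgraph = bpy.context.evaluated_depsgraph_get()
dense = bpy.data.objects["Dense"]   # crazy-detailed mesh that needs UVs
proxy = bpy.data.objects["Proxy"]   # simplified, properly unwrapped mesh

dense_me = dense.data
proxy_me = proxy.data
if not dense_me.uv_layers:
    dense_me.uv_layers.new(name="ProjectedUV")
dense_uv = dense_me.uv_layers.active.data
proxy_uv = proxy_me.uv_layers.active.data

bvh = BVHTree.FromObject(proxy, depsgraph)

for loop in dense_me.loops:
    v = dense_me.vertices[loop.vertex_index]
    # Cast along the vertex normal; try the opposite direction if nothing is hit.
    location, normal, face_index, dist = bvh.ray_cast(v.co, v.normal)
    if location is None:
        location, normal, face_index, dist = bvh.ray_cast(v.co, -v.normal)
    if location is None:
        continue  # this vertex never reaches the proxy
    poly = proxy_me.polygons[face_index]
    corners = [proxy_me.vertices[i].co for i in poly.vertices]
    weights = poly_3d_calc(corners, location)  # interpolation weights on the hit face
    uv = Vector((0.0, 0.0))
    for li, w in zip(poly.loop_indices, weights):
        uv += proxy_uv[li].uv * w
    dense_uv[loop.index].uv = uv
```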

https://docs.blender.org/manual/en/latest/modeling/modifiers/modify/data_transfer.html

Edit: don’t actually watch this video, watch the one @kalamazandy linked below.


Well, that’s a LOT of information. But it’s the correct direction, so thanks.
That video, however, wasn’t very helpful.
I’d use this one instead: head to 22:10 of the video for how to use the Data Transfer modifier for UV maps.

My mesh was very different from the source geometry, so I had to switch the mapping to Projected Face Interpolated.
And you do have to have really dense geometry. Mine came from Rhino, so when I exported I reduced the maximum edge length until the tessellation was tight enough to actually capture the form.

So for those of you who are interested, try the video I linked to; the presenter is much easier to understand. The one SterlingRoth sent was really just someone stumbling through all of the settings while breathing heavily into the microphone, so you’ll go crazy if you try to listen to it.

So in my case, the settings were:
Face Corner Data, with UVs enabled.
Mapping switched to Projected Face Interpolated. And yay, projected maps.
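For reference, those settings map onto Blender's Python API roughly like this (a sketch assuming a 2.8x build; the object names are placeholders, and “Projected Face Interpolated” is the POLYINTERP_LNORPROJ loop mapping):

```python
# Rough scripted equivalent of the settings above, using the Data Transfer
# modifier. Object names are placeholders; the simplified, unwrapped mesh is
# the source and the dense mesh receives the projected UVs.
import bpy

dense = bpy.data.objects["Dense"]   # dense mesh that needs UVs
proxy = bpy.data.objects["Proxy"]   # simplified, unwrapped mesh

# The dense mesh needs a UV layer to receive the transferred data.
if not dense.data.uv_layers:
    dense.data.uv_layers.new(name="UVMap")

mod = dense.modifiers.new(name="UVTransfer", type='DATA_TRANSFER')
mod.object = proxy
mod.use_loop_data = True                  # Face Corner Data
mod.data_types_loops = {'UV'}             # enable UVs
mod.loop_mapping = 'POLYINTERP_LNORPROJ'  # Projected Face Interpolated

# Apply the modifier to bake the UVs into the dense mesh.
bpy.context.view_layer.objects.active = dense
bpy.ops.object.modifier_apply(modifier=mod.name)
```

Setting it up in the modifier panel and pressing Apply does the same thing; the script is only there to spell out which option is which.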

Haha, sorry for the bad video, I just wanted to find something to point you in the right direction. I didn’t vet it very carefully.

Hey, no problem. One of the biggest troubles when trying to learn a new package is always “what the heck are they calling this?”
And that’s especially challenging with Blender.

I’ve switched to 2.8, and things are constantly kicking my butt, and I have very little idea how to even search for things. For example, when modeling I used to build things by Ctrl+clicking in Edit Mode to create lines, then F to fill (or E to extrude if they don’t connect). That’s gone now!?!? So instead I’m rotate-snap aligning a plane (it was also hard to find where they moved that option), then extruding repeatedly.
I saw the Poly Build tool, but I’m not sure why it would be better than the previous workflow. It definitely seems worse.

Anyway, I’m getting there… slowly. Every time I start to grasp a workflow that would work, either something changes or I have to work on other things for a while, and then I come back and I’ve forgotten the hotkeys and terms.