new composite node request

hi, i hope i’m posting this in the correct section of the forums?

i’m working full time on a series of psy ambient trance music videos for a very popular youtube and reverbnation musician, and there are a couple of effects i’d like to do in blender that i don’t believe are currently available.

the first one would be a composite node which would pinch and magnify the input image’s pixels based on a factor image or animation. white values in the factor would pinch/pull the image pixels closer together, and black values would magnify/push them apart.
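to be clearer about the math i have in mind, here’s a rough python sketch. all the names are my own and the sign convention might need flipping, it’s just to show the idea:

```python
# hypothetical sketch of the pinch/magnify node i'm requesting (names are mine).
# each output pixel samples the input image at an offset position; the factor
# image decides which way, and by how much, the sample point moves.

def sample_offset(x, y, cx, cy, factor, strength=0.5):
    """return the (sx, sy) source coordinate for output pixel (x, y).

    factor = 1.0 (white) moves the sample toward the centre (cx, cy) -> pinch
    factor = 0.0 (black) moves it away from the centre -> magnify
    factor = 0.5 (mid grey) leaves the pixel untouched
    """
    # signed amount: +strength at white, -strength at black, 0 at mid grey
    amount = (factor - 0.5) * 2.0 * strength
    sx = x + (cx - x) * amount
    sy = y + (cy - y) * amount
    return sx, sy
```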

if anyone here would be so kind as to code this, i would be using it a lot, and would credit them in the video’s outro.

here’s a couple of test renders as examples of the type of work i do. please note they are not a mock-up of the effect i’m requesting, but they do use some compositor displace nodes.

http://i2.ytimg.com/vi/egl80hHeGUo/mqdefault.jpg

http://i1.ytimg.com/vi/DwbhXrAnX0o/mqdefault.jpg

you can use UV distort for effects like this

cool, thanks very much. i’ll look into that.

i can’t find any documentation on the ‘uv distort’ node, and it doesn’t appear to be in 2.63a. is this node in 2.64a?

i’ve been too busy rendering constantly over the last few days to try 2.64, but does anyone know what ideasman was referring to?

Yes, you can unwrap a plane that is subdivided, then use that UV projection as a render layer from the 3D view. This provides distortion co-ordinates in the compositor; use the Map UV node (in the Distort group) to alter an input image.
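If it helps, here is a toy Python version of what the Map UV node does conceptually. This is my own simplification; the real node also carries alpha and does proper filtering:

```python
# toy model of the Map UV node: the UV pass stores, per output pixel, the
# normalised (u, v) coordinate at which to sample the input image.

def map_uv(image, uv_pass):
    """image: 2D list of pixel values (rows of columns);
    uv_pass: 2D list of (u, v) tuples in [0, 1]."""
    h, w = len(image), len(image[0])
    out = []
    for row in uv_pass:
        out_row = []
        for (u, v) in row:
            # nearest-neighbour lookup here; the real node interpolates
            x = min(int(u * w), w - 1)
            y = min(int(v * h), h - 1)
            out_row.append(image[y][x])
        out.append(out_row)
    return out
```

Deforming the unwrapped plane in the 3D view changes the (u, v) values the pass holds, which is what drags the input image around.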

Here I knocked up a quick tutorial.

thanks very much. i had read a little about the “map uv” node, i just hadn’t realized it could be used the way you did. it looks cool.

to quote from the wiki:

“Thread the new UV Texture to the Image socket, and the UV Map from the rendered scene to the UV input socket. The resulting image is the input image texture distorted to match the UV coordinates.”

can this be animated? for example, can i render an animation from one sphere mesh with its uv map enabled in the render layers, and use another, deformed sphere to distort the original render?

i’ve re-watched your video tutorial in 720p and i think i understand it better now. it looks like you’re only using one set of uv coordinates? and those just need to match the dimensions of the original render, yes?

Yes, I just unwrapped the target plane once, and that became the only UV set in the render. You should set the size of the plane first, otherwise the unwrap may not work correctly. I also scaled the image first so that it would fit the plane correctly; I’m not sure if this is strictly necessary.

Keep in mind that the image may be very coarse along the distortion. I’m not sure how to improve the image sampling to fix that; it seems quite badly aliased. Perhaps more subdivision would help.

EDIT:
The answer to the aliasing is to add another Subdivision Surface modifier, set to Catmull-Clark. Then crease the edges (Shift-E) of the outside ring so that the corners remain square (not rounded).

thanks, i think i understand all that now. so it seems this map uv node can be used as an alternative to the compositor’s displace node? if so, i could certainly use it in my line of work.

the problem, though, is that unless we use a very dense or highly subdivided distortion mesh, the per-pixel precision of the node i was requesting can’t be reached? and i would still have the problem of matching the distortion to the 3d objects i had originally rendered animations for, without having rendered their uv coordinates at the time.

You can also paint into an image and feed that into the Map UV node with a Mix node. Here is an example using a Blend-type texture.


Looks like the Grid Warp feature in Nuke

And Photoshop :wink: You could do this just as easily on an actual mesh in the 3D view, but I like the idea of mixing other values into the distortion too.

I think that with tracking and UV displace you could create similar effects to the head resizing in the latest Alice in Wonderland movie…

wow, that is very cool. so are the render layer uv outputs similar to normal maps? if so, we could do some mind-bending distortion effects with these nodes.

and could they also be used for doing 2d morphs? i guess if we combined them with cross-fading the source and destination images, that might work.
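just to sketch what i mean by combining them, in python. all the names are hypothetical, this is only the idea:

```python
# sketch of a 2d morph: interpolate the uv coordinates between the source and
# destination warps, then cross-fade the two warped images by the same t.

def lerp(a, b, t):
    """linear interpolation between a and b, t in [0, 1]."""
    return a * (1.0 - t) + b * t

def morph_uv(uv_a, uv_b, t):
    """blend two (u, v) coordinates; drives both warps at time t."""
    return (lerp(uv_a[0], uv_b[0], t), lerp(uv_a[1], uv_b[1], t))

def morph_pixel(src_warped, dst_warped, t):
    """cross-fade the already-warped source and destination pixel values."""
    return lerp(src_warped, dst_warped, t)
```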

i’ve started experimenting with the map uv node, and i’ve noticed there is a thin black border on my composites after using it. is there a way to remove the border?

also i don’t mind if the admins want to move this topic into another forum section.

It depends on where the black border is coming from. How are you deriving the UV? Are you shooting a plane? Does it match the dimensions of the camera’s field of view? Are you mapping the UV correctly? Perhaps show us a screenshot or two of the setup.

And yes this would be better in the compositing sub board :wink: .

my original uv input is from an image sequence, saved after rendering my 3d background mesh. i’m then mixing it with a greyscale displace image, again using an image sequence. all of those use the same dimensions except the map uv image input, although the black border appeared with every image and aspect ratio i tested with.

i’m using a fast gaussian blur on the uv images to smooth out the very pixelated results i was getting from the map uv node, but the black border is still there in the final output regardless of whether the blur node is muted or not.

the second image below shows the original 3d render before compositing. the third image shows the map uv results without the displace channel mixed into the uv images, and the last image shows the distortion. notice that both it and the second image have the black border.

http://th04.deviantart.net/fs71/200H/f/2012/325/6/3/150_by_rattyredemption-d5lr4pl.png

http://th05.deviantart.net/fs71/200H/f/2012/325/c/3/151_by_rattyredemption-d5lr506.png

http://th06.deviantart.net/fs71/200H/f/2012/325/d/1/152_by_rattyredemption-d5lr520.png

http://th07.deviantart.net/fs70/200H/f/2012/325/f/a/153_by_rattyredemption-d5lr53j.png

My guess is the blur is not wrapping pixels correctly and is therefore shrinking the mapping values around the edges. Not sure of the best solution there. Why are you blurring the UV values? I added a subdivision modifier to smooth that out.
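To illustrate that guess, here is a tiny Python box blur that pads with zeros outside the frame, the way many blurs do. On a flat, full-value UV channel, the values at the ends drop below full value, which is exactly a shrink toward black at the border:

```python
# demo: blurring with implicit zero padding pulls border values toward 0,
# so a remap through the blurred uvs samples outside the intended range.

def box_blur_1d(values, radius=1):
    """blur a 1D signal, treating out-of-range samples as 0 (zero padding)."""
    n = len(values)
    out = []
    for i in range(n):
        total = 0.0
        for j in range(i - radius, i + radius + 1):
            total += values[j] if 0 <= j < n else 0.0  # zero outside the frame
        out.append(total / (2 * radius + 1))
    return out

print(box_blur_1d([1.0, 1.0, 1.0, 1.0]))  # the two end values drop below 1.0
```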

i did explain this in my last post, but i’ll elaborate: i was getting the black border before i added the blur node, and muting that node makes no apparent difference to the border. if i mute the map uv node, the border disappears.

i’m blurring because without it the map uv results are very pixelated. i’m not subdividing my mesh because it would change its shape, and it would slow down the animation render much more than smoothing out the uvs in the compositor does.

my conclusion is that the map uv node is buggy, as i can see no logical reason for the border, or for the vast reduction in quality without taking extra steps to compensate.

Hmmm, I just tried the Map UV node and noticed the border as well. Very odd. I believe others use this for relighting, so I am surprised that no one else has noticed the mis-registration. Did you submit a bug report? They are being cleared quickly these days.