(The upper-left frame should say “Proy Y” instead of “Proy X”, sorry.)
This is the setup:
I’m using the Displace modifier on a 512x424 flat grid. The displacement map for the modifier is a sequence of depth frames of the same resolution, so every face of the grid corresponds to a depth pixel.
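Conceptually, the Displace setup boils down to a per-vertex depth lookup. Here is a minimal pure-Python sketch of that idea (not Blender API code — the `STRENGTH` value stands in for the modifier's Strength setting and is an assumption):

```python
# Sketch: each vertex of the 512x424 grid is offset along Z by the
# grayscale value of the matching depth-map pixel.

WIDTH, HEIGHT = 512, 424
STRENGTH = 1.0  # assumed Displace "Strength" setting

def displace_grid(depth_frame, strength=STRENGTH):
    """depth_frame: HEIGHT x WIDTH nested list of values in [0, 1].
    Returns a list of (x, y, z) vertex positions."""
    verts = []
    for y in range(HEIGHT):
        for x in range(WIDTH):
            # one depth pixel drives one vertex's height
            verts.append((x, y, depth_frame[y][x] * strength))
    return verts
```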
Then I set the grid's material to Halo, so only the vertices of the displaced grid are visible.
Next I added a texture to the base material and mapped it to be projected only in the X and Y directions of the grid.
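Projecting the texture "in X and Y only" amounts to planar UV coordinates computed from the vertex position while ignoring Z. A rough sketch (the grid extents are assumptions matching the 512x424 grid):

```python
def planar_uv(x, y, width=512, height=424):
    """Project a vertex's X/Y position to UV space, ignoring depth (Z).
    Vertices at any depth get the same texture sample as their
    undisplaced position would."""
    return (x / (width - 1), y / (height - 1))
```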
So far so good:
Now, I only want to show certain sections of the point cloud, let's say only the torso and a bit of the background. So I added a texture to the same material to be used as a Z-projected mask:
The problem is, I don't know how to "combine" the color texture and the mask. I need to have a texture on the point cloud and at the same time only show the desired segments of that textured point cloud.
I thought that using two different textures in the same material (one for color influence, the other for alpha influence) would do the trick. Am I missing something?
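What I'm after is essentially this per-point combination — one texture drives color, the other only drives alpha. A plain-Python sketch of the intent (not Blender's actual texture-stack math):

```python
def shade_point(color_rgb, mask_value):
    """color_rgb: (r, g, b) sampled from the XY-projected color texture.
    mask_value: scalar in [0, 1] from the Z-projected mask texture.
    Returns an RGBA tuple: the mask touches only alpha, never color."""
    r, g, b = color_rgb
    return (r, g, b, mask_value)
```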
I also tried some alternatives:
- Intersecting "dot killer" cubes with the Boolean modifier messes up the geometry.
- Using black planes to hide the undesired dots limits camera movement.
- Particle systems set fire to my PC.
- Nodes won't work on Halo materials, and that makes me sad.
Interesting. I tried doing it in a slightly simpler way: a plane or a box, subdivided and turned to points, covering a character, then using a raycast to project the points onto the character.
But I run into performance issues after a few subdivision levels, since it is on an animated mesh.
This was done in 3.1, I believe. Back in 2.93 I was using the Halo material in the old Blender renderer; this new one was made in Eevee.
The result is not quite Radiohead's, but it was close enough for the client.
I used geometry nodes to distribute quad meshes on a 512x424 grid. This grid is displaced by the depth texture captured from a Kinect for Xbox One, and the quads act like billboards, always facing the camera.
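The billboard part reduces to orienting each quad toward the camera. A minimal sketch of that per-quad direction (illustrative math only, not the actual geometry-nodes tree; the positions are made-up inputs):

```python
import math

def billboard_direction(quad_pos, camera_pos):
    """Unit vector from a quad's position toward the camera — the
    direction each billboard quad is rotated to face."""
    dx = camera_pos[0] - quad_pos[0]
    dy = camera_pos[1] - quad_pos[1]
    dz = camera_pos[2] - quad_pos[2]
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    return (dx / length, dy / length, dz / length)
```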
The animation of the particles flying out of the drums was also made in geometry nodes. Each actor was recorded separately with their respective instrument, except for the drummer: we made a makeshift cardboard drum kit so the Kinect could see his entire body while he performed, which means the floating drum kit seen in the video is modeled.
My original problem was solved with shader nodes. I used a spherical gradient texture as a depth cut connected to the alpha of the material, so that only the depth region in which the actors perform is visible.
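A hard-edged version of that spherical depth cut looks like this in plain Python (a sketch of the idea only — the center and radius are assumptions standing in for the gradient texture's mapping in the actual shader, and the real gradient falls off smoothly rather than stepping):

```python
import math

def depth_cut_alpha(point, center, radius):
    """Spherical-gradient mask: alpha 1 inside the sphere, 0 outside.
    Points near the actors stay visible; distant background points
    are cut away via the material's alpha."""
    return 1.0 if math.dist(point, center) <= radius else 0.0
```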
If I can find the original files I will upload the node trees! Also, RIP YouTube bitrate.