Deadmau5 pulsating visuals?

Hi folks,

Please take a look at these Deadmau5 visuals, at around 1:00

You've got these arrays of spheres, and each line of spheres is pulsating along its length in terms of sphere size.

I have no idea how to achieve that sort of geometric animation, other than actually painstakingly animating it by hand.

But I feel there must be some faster way. I know this stuff is possible to do in Houdini (if you know how to use it), and probably C4D, because it's very MoGraph-like, but how could one achieve similar sorts of effects in Blender?

I don't expect a "do XYZ" explanation, I just need some pointers to general techniques and workflows to set me off in the right direction!

Thanks!

Animation Nodes would be a good tool for effects like this.

So what you have here is stuff that's frame-delayed. Basically, you're looking at the amplitude of some frequency band of the song, where stuff at the periphery is looking at the amplitude right now, while stuff at the center is looking at the amplitude from, say, 10 frames ago.

Blender’s not great at that stuff, because it requires remembering what happened last frame, two frames ago, etc. Blender’s not designed to do that.

That doesn't mean it's impossible: you can bake animation data and make ten copies, then in the graph editor, shift each copy over one frame to the right.

It's not really all that hard. Let's say my estimate of ten frames is right: you divide your spheres into ten concentric circular areas. Each set copies its scale from one of ten empties. The scale of the first empty is baked from keyframes generated by the song, then copied, pasted, and offset onto each of the other empties in turn.
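If you'd rather script the copy/paste/offset step than do it by hand in the Graph Editor, a minimal Python sketch might look like the following. The empty names, the number of rings, and the one-frame offset are placeholders, and it assumes the song has already been baked onto the first empty's scale (e.g. with Bake Sound to F-Curves) and converted to regular keyframes:

```python
import bpy

# Hypothetical names: "Empty.000" already holds the scale keyframes baked from the
# song; "Empty.001" .. "Empty.009" will drive the other nine rings of spheres.
source = bpy.data.objects["Empty.000"]
src_action = source.animation_data.action

FRAME_OFFSET = 1  # how many frames later each successive ring reacts (a guess)

for i in range(1, 10):
    target = bpy.data.objects[f"Empty.{i:03d}"]

    # Give each empty its own copy of the baked action...
    action_copy = src_action.copy()
    target.animation_data_create()
    target.animation_data.action = action_copy

    # ...then slide every keyframe (and its handles) to the right by i frames.
    for fcurve in action_copy.fcurves:
        for kp in fcurve.keyframe_points:
            kp.co.x += FRAME_OFFSET * i
            kp.handle_left.x += FRAME_OFFSET * i
            kp.handle_right.x += FRAME_OFFSET * i
```

Each ring of spheres then just needs a Copy Scale constraint (or a driver) pointing at its own empty.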

The other way to do frame-delayed effects, in vanilla Blender, is via physics, which would be more tedious and much slower to render.

As Lumpengnom mentioned, you could also do it via Animation Nodes, which would be more non-destructive, more tunable, but probably not any easier.

@Lumpengnom, thanks, I'm going to look into Animation Nodes; never dabbled with that before. It looks very powerful but also very tweaky! :grimacing:

@bandages, also thanks! To be honest, I don't think the music is influencing it as much as you think. Obviously the four-to-the-floor kick drum is driving it, and maybe also the snare hit, but I think that's it.
Could it be the case that it's actually just one array of spheres (about 20, in a straight line, stretching away from the camera) that's been copied across the screen with the timing slightly offset (some more than others) on each one?

Another question for you guys: what about the fact that the spheres get darker as they recede into the distance? How might one go about this? I'd really like to be able to do this in the viewport (as opposed to in the compositor or with a render pass) so I could design my graphics around it.

I found this way: https://blender.stackexchange.com/questions/27639/how-to-mix-2-shaders-depending-on-distance-from-camera
But then you get a Fresnel effect which ruins the "flat" look… so I guess one could use flat disks and track those to the camera, but maybe there's a better way?

Here is a setup for materials like this. Explanations are in Node frames. You can move the camera as well.

Looking at the video again, you might want to limit the falloff to a single axis.
You would do that by using a "Separate XYZ" node after the "Vector Math - Distance" node and feeding only the X, Y, or Z axis into the "Map Range - Control Distance" node.

Thanks dude, I'll check that out later today.

In the blend were you using spheres or disks?

If spheres, does your setup keep a solid color over the entire disk (so, no Fresnel or gradient on each sphere)?

Yes, it is spheres and it keeps the color.
The "Object Info" node gets the position of every individual object, in this case every individual sphere.
The "Distance" node is a "Vector Math" node set to "Distance", and it calculates the distance from the sphere's center to the camera.
Because the color is based on the distance of the object's center rather than the geometry's distance, the color is solid over the whole object.

If you decide you want a gradient after all you can replace the “object info” node with a “Geometry” node and use its “Position” output instead.
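In case it helps to see the whole chain in one place, here is a rough Python sketch that builds this kind of node tree from scratch. The material name, the Emission shader (to keep the flat, Fresnel-free look), the colours, and the 0-20 falloff range are my own assumptions; it also assumes the scene has an active camera, and where this sketch only snapshots the camera position once, the original setup feeds it in live via drivers:

```python
import bpy

# Build a material that fades each sphere towards black with distance from the camera.
mat = bpy.data.materials.new("SphereFalloff")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links
nodes.clear()

obj_info = nodes.new("ShaderNodeObjectInfo")     # per-object location (the sphere's origin)

cam_pos = nodes.new("ShaderNodeCombineXYZ")      # the node labelled "driver to camera positions"
cam = bpy.context.scene.camera
cam_pos.inputs["X"].default_value = cam.location.x  # static snapshot; add drivers on these
cam_pos.inputs["Y"].default_value = cam.location.y  # sockets if the camera moves, as in
cam_pos.inputs["Z"].default_value = cam.location.z  # the original setup

dist = nodes.new("ShaderNodeVectorMath")         # the "Distance" node
dist.operation = 'DISTANCE'

map_range = nodes.new("ShaderNodeMapRange")      # the "Control Distance" node
map_range.inputs["From Min"].default_value = 0.0
map_range.inputs["From Max"].default_value = 20.0   # distance at which spheres go fully dark

mix = nodes.new("ShaderNodeMixRGB")              # near colour fades to black as distance grows
mix.inputs["Color1"].default_value = (1.0, 0.05, 0.05, 1.0)
mix.inputs["Color2"].default_value = (0.0, 0.0, 0.0, 1.0)

emission = nodes.new("ShaderNodeEmission")       # emission keeps the colour flat (no Fresnel)
output = nodes.new("ShaderNodeOutputMaterial")

links.new(obj_info.outputs["Location"], dist.inputs[0])
links.new(cam_pos.outputs["Vector"], dist.inputs[1])
links.new(dist.outputs["Value"], map_range.inputs["Value"])
links.new(map_range.outputs["Result"], mix.inputs["Fac"])
links.new(mix.outputs["Color"], emission.inputs["Color"])
links.new(emission.outputs["Emission"], output.inputs["Surface"])
```

For the single-axis falloff mentioned earlier, one way is to swap the Distance node for a Vector Math "Subtract", run the result through a "Separate XYZ", and feed the absolute value of one component into the Map Range.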

BTW, with the “Curve Node” you can insert some variance into the falloff. For example you could create a wave pattern or something like that:

Try this…
https://youtu.be/l8RHRNZQEuE

What about rendering the first layer zoomed out at 2x resolution (to capture all the extra data), then simply zooming in and blending it with only the frame before it (slightly darkened and zoomed out) in the compositor?
The first frame will only have one layer, but the second one will have 2, the third will have 3, and so on, until it becomes too dark to see.
Edit: Not perfect, but here is the result:
out.mkv (520.4 KB)

Edit 2: I fixed the issues. It was way simpler than I thought.

deadmau5.zip (365.9 KB)
If you download the blend files, you just have to open each file in order (spheres, compositing part 1, compositing part 2) and render the animation. No other action is needed.

How it works
1- (spheres.blend) creates the first layer of circles; this has to be black and white. You can control it manually, through Animation Nodes, Python scripts, or any other method, as long as it outputs a white circle on a black background.
2- (compositing_part_1.blend) takes the current frame (which was rendered in the first step) and blends it with the previous frame: the previous frame is slightly darkened, scaled down, and mixed behind the current frame. The result then overwrites the current frame so this whole step can be repeated on the next frame (see the sketch after this list).
3- (compositing_part_2.blend) just makes it red
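For anyone who wants to see the feedback logic of step 2 spelled out, here is a rough standalone sketch of the same idea using Pillow on a folder of rendered frames. The file paths, frame count, and the darken/scale factors are invented for illustration and are not taken from the .blend files:

```python
import os
from PIL import Image, ImageEnhance, ImageChops

NUM_FRAMES = 100
DARKEN = 0.8   # how much the trail fades per frame
SCALE = 0.95   # how much the trail shrinks per frame (the zoom-out)

os.makedirs("output", exist_ok=True)
previous = None

for f in range(1, NUM_FRAMES + 1):
    current = Image.open(f"render/{f:04d}.png").convert("RGB")

    if previous is not None:
        # Darken and shrink last frame's *output* (not last frame's render),
        # so the trails accumulate over time.
        trail = ImageEnhance.Brightness(previous).enhance(DARKEN)
        w, h = previous.size
        trail = trail.resize((int(w * SCALE), int(h * SCALE)))
        canvas = Image.new("RGB", (w, h), (0, 0, 0))
        canvas.paste(trail, ((w - trail.size[0]) // 2, (h - trail.size[1]) // 2))
        # "Mixed behind the current frame": a lighten mix works here because the
        # circles are white on a black background.
        current = ImageChops.lighter(canvas, current)

    current.save(f"output/{f:04d}.png")
    previous = current
```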

It should serve as a good and simple starting point :slight_smile:


@Lumpengnom, great info, many thanks!

@RSEhlers, see here!: Midi driven animation?

@abdoubouam: nice, that works great! … Doing it with compositing is less flexible though; I'd really like to be able to see it "live" in the viewport. I think Blender should be capable of that, right?!

@Lumpengnom, sorry to bug you, but could you tell me (in your example) what node names I need to search for to find the nodes you labelled "driver to camera positions" & "control distance"?

(I think this is something that could be improved in Blender: often, when looking at a node tree example image, one doesn't know what some of the nodes are as they are not named!)

I don't know Animation Nodes well enough to know how to delay the reaction, but for the lights and brightness it should be fairly easy to do in shaders.

The "driver to camera positions" node is a "Vector Math" node. After creating it, you have to set it to "Distance" in the drop-down menu.

The “Control Distance” is a “Map Range” node.