Method to distort coordinates according to fake velocity field?

I’ll preface this by saying I have a basic understanding of OSL and a solid understanding of vector math.

I am attempting to create a turbulent fluid texture that I will use for gas giant planet atmospheres (such as Jupiter’s cloud belts and zones) and other fluid-y uses. I found a method implemented as part of a game. Here are some example images: http://imgur.com/a/9LipP

The author of the images, Steve Cameron, has his code posted on GitHub and he includes a slideshow where he describes a method by Bridson for generating a realistic, but fake, turbulent fluid velocity field: http://smcameron.github.io/space-nerds-in-space/gaseous-giganticus-slides/slideshow.html#1

Cameron’s code is some combination of OpenGL and/or C++. I’m not familiar with either of those languages, but I understand the math concepts behind Bridson’s method.

I’m trying to implement this technique in OSL for use in Blender. I have created a procedural velocity vector field based on Bridson’s method and a function to generate a distribution of particles to move in the texture space according to the velocity field. I currently visualize these particles as white dots with user-defined radii.

I am looking for some guidance in OSL programming at this point in my project.

In a test script I’ve written, I am able to modify the positions of the particles according to a simple function like sin(x), which introduces a regular wave pattern in the particles’ positions. However, when I use the same approach to sample my procedural velocity field (vx, vy), the particles are distorted in shape and do not seem to trace the local velocity at each particle’s position. Instead, I get a disconnected sequence of irregular spots.

I’ve read Michel Anders’ excellent book on OSL and the OSL documentation, but I find no reference to sampling a point-like variable at a specific (x, y). I need a function call, something like VelocityField(particle_x, particle_y).
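To make that concrete, inside the shader I want to be able to write something like the following (purely hypothetical – VelocityField and dt are placeholder names, not real OSL calls):

// What I wish I could do: evaluate the field at each particle's
// own coordinates, then advect the particle along it.
vector vel = VelocityField(particle_x, particle_y);
particle_x += vel[0] * dt;
particle_y += vel[1] * dt;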

I fear this type of call isn’t possible in OSL, but I hope there is some creative way to accomplish the same thing. As I understand it, the sampling of the texture is left up to the renderer, so the specific coordinates within the texture aren’t known until the renderer executes the shader. But, as I said, I’m still learning the “OSL” way of programming.

I’m attaching a zip file that contains my blend file, my osl file, a copy of Anders’ “haltonsequence.h” file, and an image that illustrates the velocity-tracing effect I’m currently trying to achieve (as a step along the way to distorted texture coordinates).

Perhaps I’m approaching the idea of warping the texture coordinates naively. I’m certainly open to other methods that mimic the appearance of Cameron’s gas giants, or any other approach to achieving the look of our Solar System’s gas giants.

Thanks in advance!

fluid-trace.zip (463 KB)

Hi. I’m the Steve Cameron you’re referring to. If your aim is just to make textures, there is a standalone program specifically for that, gaseous-giganticus, which is in my space-nerds-in-space repository on GitHub. The textures aren’t actually created in the game – they’re generated separately, but the code that does it is in the same repo as the game. Here’s a video showing how to use it (it’s Linux-only, though): https://www.youtube.com/watch?v=8nx5yPpQh2M

From what you’re describing, I expect you want to encode your velocity field(s) as a texture; that is, take the 3D vector at each point in your (presumably) 2D velocity field(s) and encode it into the pixel values of the texture, with RGB mapped to XYZ. Then you can sample your texture to get your velocity field values. I would strongly suggest getting this working outside the GPU first, as debugging GPU code is pretty tough. Porting this to the GPU is beyond my current abilities, and I wrote the damn thing.
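Reading that back inside an OSL shader would presumably be something like the following sketch (the filename is a placeholder, and the [0,1] to [-1,1] remapping is just one convention you could pick):

// Sketch: decode a velocity vector from a texture in which
// RGB in [0,1] encodes XYZ in [-1,1]. The filename is a placeholder.
color c = texture("velocity_field.png", u, v);
vector vel = vector(2.0 * c[0] - 1.0,
                    2.0 * c[1] - 1.0,
                    2.0 * c[2] - 1.0);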

Now one thing you might not have realized about my program is that I am not really just distorting the texture with the velocity field. Instead, there are a bunch of particles (typically around 8 million) whose colors are determined by some arbitrary mapping to some arbitrary image (typically a blurred vertical stripe of earth tones works best). A particle’s color never changes. Then those particles are turned loose in the velocity field and move around (slowly) while – and this is important – leaving a slowly fading alpha-blended trail. That is to say, each iteration of the thing is something like:

  1. Move particles
  2. Alpha blend each particle into an accumulated image with some opacity (0.5, or something).
  3. Fade the entire image towards black, or towards some neutral color, a bit.
  4. Go to 1.

(I say “the image”, there are actually 6 images making the cube map).
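In per-pixel terms, steps 2 and 3 are just a couple of blends. Here’s a sketch in OSL-style syntax (my actual code is C; the 0.5 opacity is the value mentioned above, and the 0.01 fade rate is purely illustrative):

// Per-pixel update for steps 2 and 3. accum is one pixel of the
// accumulated image; pcol is the color of a particle covering
// that pixel this iteration.
color update_pixel(color accum, color pcol, int covered)
{
    color result = accum;
    if (covered)
        result = mix(result, pcol, 0.5);  // step 2: alpha blend the particle in
    return mix(result, color(0), 0.01);   // step 3: fade a bit toward black
}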

So it is not the case that you have a starting texture, apply a distortion via a velocity field in one step, and bam, get a nice texture out. No, you run a little simulation with particles roaming around according to the velocity field, leaving alpha-blended trails over time. That’s what gets you a nice texture. And it eats up a bunch of time. Also, the textures tend to start out looking like crap, then they get better and better for a while, and then after a while they start looking weird and terrible. There’s a sweet spot where they look nice. Probably not what you wanted to hear, but that’s how it is.

To get a velocity field on a flat texture is pretty easy (see my “curly vortex” project on github). To do it on a sphere, and to do it seamlessly, is a bit trickier.
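The flat version boils down to “take the gradient of a scalar noise field and rotate it 90 degrees.” Here is a sketch in OSL syntax (not my actual code, which is C; eps and the noise scale are whatever looks good):

// 2D curl noise: rotate the gradient of a scalar noise field by
// 90 degrees. The rotated gradient is divergence-free, so
// particles swirl around instead of piling up.
vector flat_velocity(float x, float y)
{
    float eps = 0.001;  // illustrative step for central differences
    point p = point(x, y, 0);
    float nxp = noise("perlin", p + vector(eps, 0, 0));
    float nxm = noise("perlin", p - vector(eps, 0, 0));
    float nyp = noise("perlin", p + vector(0, eps, 0));
    float nym = noise("perlin", p - vector(0, eps, 0));
    float dndx = (nxp - nxm) / (2 * eps);
    float dndy = (nyp - nym) / (2 * eps);
    return vector(dndy, -dndx, 0);  // the gradient, rotated 90 degrees
}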

I have 6 velocity fields in a cube-map arrangement, and the “curl of the noise gradient”, as it were, is computed by the following method:

For every point in my velocity field (each element of my 6 2D arrays) do the following:

  1. Sample noise values in a 3D noise field at six axis-aligned offsets of the location represented by that cell, and determine the gradient in x, y, z. That is to say, sample noise at:
    (x-dx,y,z) and at (x+dx,y,z); subtract, and that’s the noise gradient in x.
    (x,y-dy,z) and at (x,y+dy,z); subtract, and that’s the noise gradient in y.
    (x,y,z-dz) and at (x,y,z+dz); subtract, and that’s the noise gradient in z.
    where dx, dy, dz are some small epsilon value.
    Combine those components into a 3D vector, and that’s your noise gradient.
  2. Project that noise gradient vector onto a plane tangent to the sphere at that point.
  3. Rotate the projection 90 degrees clockwise (or counterclockwise – it doesn’t matter, just be consistent) around an axis passing through the center of the sphere and the point of interest (that is, rotate 90 degrees within the plane tangent to the sphere at that point). Store this 90-degree-rotated vector in the cell in your velocity field – that is your velocity field value (you might need to scale it a bit). I think in the slide deck I may have said that if it’s uphill you rotate one way and downhill the other, but this is not right – just rotate one way and be consistent about which way. (A sketch of all three steps follows below.)
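Put together, steps 1 through 3 come out to something like this in OSL syntax (a sketch only – my real code is C and does the rotation with quaternions; for a vector already in the tangent plane, the 90-degree rotation about the radial axis reduces to a cross product with the surface normal):

// Steps 1-3 for one point sp on the unit sphere. eps is illustrative.
vector sphere_velocity(point sp)
{
    float eps = 0.01;
    // Step 1: central-difference gradient of 3D noise.
    float nxp = noise("perlin", sp + vector(eps, 0, 0));
    float nxm = noise("perlin", sp - vector(eps, 0, 0));
    float nyp = noise("perlin", sp + vector(0, eps, 0));
    float nym = noise("perlin", sp - vector(0, eps, 0));
    float nzp = noise("perlin", sp + vector(0, 0, eps));
    float nzm = noise("perlin", sp - vector(0, 0, eps));
    vector g = vector(nxp - nxm, nyp - nym, nzp - nzm);
    // Step 2: project the gradient onto the tangent plane.
    vector n = normalize(vector(sp));
    vector gt = g - dot(g, n) * n;
    // Step 3: rotate 90 degrees about the radial axis; for a tangent
    // vector that is just cross(n, gt). Scale to taste.
    return cross(n, gt);
}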

The usual method of computing the curl as it is typically described in math books won’t work, as it is axis-aligned, and for each point on your sphere the tangent plane you need the curl in is different. Not to say there isn’t some (unknown to me) hotshot mathematician’s way of computing the curl of a spherical gradient with respect to the surface of the sphere that is better (more efficient) than what I’m doing (which is some crude vector manipulation using quaternions). My way has the advantage of being comprehensible to me, which is a huge advantage. Debugging this kind of thing is pretty hard, because when you get something wrong, it’s really hard to tell from the images what went wrong – only that something is wrong. I spent about a month chasing down a single bug whose symptom was huge seams showing at the edges of each of the six velocity fields.

Good luck.

Hello,

I would like to help, but since I’m a bit pressed for time at the moment, I’ll just suggest an approach that might work:

shader Vec(
    output closure color CL = holdout()
){
    // You have to distort the u and v vars
    u = u * noise("perlin", length(P) * 10);
    CL = emission() * u;
}

With only u being distorted, I already start to get some results:

And with some more distortion:

shader Vec(
    output closure color CL = holdout()
){
    // You have to distort the u and v vars
    P = noise("uperlin", I - P * 20);                  // warp the position itself first
    u = noise("uperlin", pow(I * 5, 2 * length(P)));   // then derive new u and v from noise
    v = noise("uperlin", pow(I * 6, 2 * length(P)));
    CL = diffuse(N) * (u * v) * 0.9 * color(0.5, 0.2, 0.1);
}

You get something like this:

I hope I’ve helped a bit,

Cheers!

Hi tree3d! That was exactly the blind spot I was suffering from… it didn’t occur to me to use P to modify u and v. This gives me a whole new perspective and a new way to approach my problem. Thanks!

I’ll post here again as my experimentation progresses… and I’ll be happy to see what else you invent as your time allows.

Hello, interesting thread.

Sorry to the OP if this doesn’t belong here, but I have tried to use the standalone program gaseous-giganticus without any luck.
Do I have to compile the program, or does it work out of the box?

I am using Linux Mint 17.1, and when I try to compile the program, the console gives me the following error: fatal error: png.h: missing file

What am I doing wrong?

Greetings

Ah, my post finally got approved.

You have to compile the program. There are some dependencies. For png.h, it’s likely that you need “libpng12-dev”:

apt-get install libpng12-dev

You can “apt-cache search blah” to search for what packages match “blah”, then “apt-get install blah” to install them.

I don’t know of a nice way to go from “blah.h not found” to “apt-get install package-containing-blah.h” other than a bit of educated guessing about what the name of the package might be and searching with “apt-cache search”.

You probably also want to do “apt-get install build-essential” to get the compiler and headers and so on installed, if you haven’t already.

gaseous-giganticus has few dependencies; libpng may be the only sort of non-standard one (i.e. not part of libc). If you want to view what the program creates as a sphere (and not just as 6 square images), then you’ll need some program to view them on a sphere. mesh_viewer is also in the space-nerds-in-space codebase – it has a lot more dependencies than gaseous-giganticus does. The following is a (possibly incomplete) list of dependencies:

apt-get install build-essential
apt-get install portaudio19-dev
apt-get install libvorbis-dev
apt-get install libgtk2.0-dev
apt-get install git
apt-get install stgit
apt-get install openscad
    (or get it from http://www.openscad.org/downloads.html)
apt-get install libgtkglext1-dev
apt-get install liblua5.2-dev
apt-get install libglew1.5-dev
apt-get install libsdl2-2.0-0 # version may differ for your distro
apt-get install libsdl2-dev # version may differ for your distro
apt-get install libssl-dev
apt-get install libttspico-utils # for text to speech

Some of those aren’t strictly needed for mesh_viewer (e.g. openscad, stgit, libttspico-utils, probably)

So the 6 textures that gaseous-giganticus produces for a cubemap are arranged in a particular way (by a convention that I, uh, just made up). mesh_viewer is aware of this convention, but other programs, not so much. So to use those textures with other programs may require some image manipulation to get things in the orientations such programs expect (i.e. rotations by multiples of 90 degrees).

When I build gaseous-giganticus, this is how it compiles:


$ make V=1 gaseous-giganticus
gcc -DPREFIX=. -g   --pedantic -Wall  -pthread -std=gnu99 -rdynamic -I/usr/include/lua5.2   -c -o mtwist.o mtwist.c && true '  COMPILE' mtwist.c
gcc -DPREFIX=. -g   --pedantic -Wall  -pthread -std=gnu99 -rdynamic -I/usr/include/lua5.2   -c -o mathutils.o mathutils.c && true '  COMPILE' mathutils.c
./gather_build_info > build_info.h
gcc -DPREFIX=. -g   --pedantic -Wall  -pthread -std=gnu99 -rdynamic -I/usr/include/lua5.2   -c -o open-simplex-noise.o open-simplex-noise.c && true '  COMPILE' open-simplex-noise.c
gcc -DPREFIX=. -g   --pedantic -Wall  -pthread -std=gnu99 -rdynamic -I/usr/include/lua5.2   -c -o quat.o quat.c && true '  COMPILE' quat.c
gcc -DPREFIX=. -g   --pedantic -Wall  -pthread -std=gnu99 -rdynamic -I/usr/include/lua5.2   -c -o png_utils.o png_utils.c && true '  COMPILE' png_utils.c
gcc -DPREFIX=. -g   --pedantic -Wall  -pthread -std=gnu99 -rdynamic -I/usr/include/lua5.2   -c -o gaseous-giganticus.o gaseous-giganticus.c && true '  COMPILE' gaseous-giganticus.c
gcc -DPREFIX=. -g   --pedantic -Wall  -pthread -std=gnu99 -rdynamic -o gaseous-giganticus -pthread -isystem /usr/include/gtk-2.0 -isystem /usr/lib/x86_64-linux-gnu/gtk-2.0/include -isystem /usr/include/atk-1.0 -isystem /usr/include/cairo -isystem /usr/include/gdk-pixbuf-2.0 -isystem /usr/include/pango-1.0 -isystem /usr/include/gio-unix-2.0/ -isystem /usr/include/freetype2 -isystem /usr/include/glib-2.0 -isystem /usr/lib/x86_64-linux-gnu/glib-2.0/include -isystem /usr/include/pixman-1 -isystem /usr/include/libpng12 -isystem /usr/include/harfbuzz   gaseous-giganticus.o mtwist.o mathutils.o open-simplex-noise.o quat.o png_utils.o -lm -lrt -lpng && true '  LINK' gaseous-giganticus


Then don’t run it. Your question is kind of strange, like asking how to go scuba diving without carrying the big annoying tank of air. :)

That being said, a couple of solutions spring to mind, though they may be worse than the problem they avoid: 1) buy a second computer, and run it on that; 2) run it in a virtual machine to isolate it.

I am looking for some guidance in OSL programming at this point in my project.

I have a mesh with triangular faces, and at the center of each face I have a speed value. My goal is to create water ripples consistent with the calculated speeds. These values are saved in a text file. The resulting texture will be used as a normal map. Can you give me some suggestions?
Thanks very much