Blendgraph v02 (release)

I have added my official test driver PNG sequence to ShareCG.com.

Use this sequence with the attached BLEND file above to see Blendgraph in action!

I’ll try it then… Thanks very much! :wink:

I got distribution over a mesh working. Blendgraph v03 is on the horizon… :slight_smile:

Attachments


You’re making great progress. I look forward to the next release.

I don’t get it, but I like it :smiley:

!!! distribution over mesh looks GREAT!!!

Cool update Atom!

This looks like a great script! Can’t wait to have time to really check it out. Keep it up!

I have integrated the distribution mechanism into the GUI. Now you can specify a mesh in the GUI as the distribution mesh without having to edit any code.

Presently I have implemented a simple per-vertex distribution system, and I am wondering if anyone else has ideas on how to proceed.

Consider this:
I am sampling an image, say, 64x64 times. This means I can have a maximum of 4,096 Blendgraph units being managed by the script. Out of those 4,096 possible units, only a small portion may actually be linked in on any given frame, determined by the alpha channel of the sampled image.

The distribution mesh can have any number of vertices. In the image I have posted below, the Suzanne mesh has 2,012 vertices. Currently, I just blast through the vertex array, treat it like a grid, and exit the loop when I run out of vertices. But this can leave pixels in the image unsampled.
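For anyone curious what that pass looks like, here is a rough sketch of the idea in Blender 2.4x Python. This is not the actual Blendgraph code: the loop order, the threshold value and the bookkeeping are placeholders I made up for illustration.

```python
import Blender

def sample_onto_mesh(image_name, mesh_name, samples_x=64, samples_y=64, threshold=0.1):
    """Walk the sample grid and hand each pixel the next vertex in the array.
    When the vertices run out we bail, which is why trailing pixels can go unsampled."""
    img = Blender.Image.Get(image_name)          # the driver image
    me = Blender.Mesh.Get(mesh_name)             # the distribution mesh, e.g. Suzanne
    width, height = img.getSize()
    verts = me.verts
    placements = []                              # (vertex location, sampled pixel) pairs
    vi = 0                                       # next vertex to consume
    for sy in range(samples_y):
        for sx in range(samples_x):
            if vi >= len(verts):
                return placements                # out of vertices: remaining pixels go unsampled
            # Map the sample grid onto the image resolution.
            px = int(sx * (width / float(samples_x)))
            py = int(sy * (height / float(samples_y)))
            r, g, b, a = img.getPixelF(px, py)
            if a >= threshold:                   # alpha decides whether a unit is linked in
                placements.append((verts[vi].co, (r, g, b, a)))
            vi += 1
    return placements
```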

I am wondering if anyone has any “mapping” ideas. I want to avoid a UV-based solution simply because not every mesh has UV coordinates, and it is nice to simply pick a mesh and get instant gratification.

Attachments


New Video Produced with Blendgraph.
http://www.vimeo.com/2862002

I am finally getting a chance to play around with Blendgraph v02. This video was produced with an altered version of v02. The only alteration was the way the “wiggle” function handles the LFO inputs.

The animated sequence that is driving the output is not featured as an overlay, but is a 256x256 image that was sampled with an X/Y rate of 48.

I discovered that if I premultiply the alpha of the texture in Blender, Blendgraph picks up on that and produces a smooth falloff of objects along the alpha blur. This is a great discovery and boosts the object count quite a bit.

Featured music by RNC, an excerpt from “The Gift Of The God”.

Here is a still from the video…

Attachments


Cool new vid, this script is darn cool!

Looks great! But can we get a cleaner project file? Currently, there’s so much to undo before you can get started with experimenting. Is it possible to deliver this as separate .py file(s) that can be dropped into the scripts directory, plus a simple demo blend file and a short tutorial? It’s a great script with a lot of potential, but it needs a better delivery so people don’t feel like there’s so much to come to grips with before they can experiment with their own ideas. Thanks.

Thanks for the feedback.

Here is a quick description to help you get up and running!

Remember to download the driver image sequence that is listed higher up in this thread. You will browse to this later, so go ahead and unzip it in a semi-permanent location, or wherever you keep your image maps.

THE SCRIPTS:
The .py scripts are in the BLEND file. Blendgraph is made up of three Python files (Blendgraph_LIB_v02, Blendgraph_GUI_v02, Blendgraph_FRAME_v02) and a single INI file (Blendgraph_INI_v02). The INI file holds the internal settings of the GUI; if the INI is deleted, the GUI will recreate the file with defaults. Also notice that the INI file has an F in front of it. Blendgraph uses the INI to store information rather than the registry, and the F means the Fake User flag is set. With this flag set, Blendgraph can save the GUI settings along with your scene when you save it. I was not able to achieve this using a registry/dictionary approach for storage.
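For the curious, the idea behind the INI text block is roughly the sketch below. This is not the shipped code: the key=value format and the function name are placeholders, and the fakeUser assignment assumes the 2.4x Text datablock exposes that flag from Python.

```python
import Blender

INI_NAME = 'Blendgraph_INI_v02'

def load_settings(defaults):
    """Read 'key=value' lines from the INI text block, recreating it with
    defaults if it has been deleted."""
    found = [t for t in Blender.Text.Get() if t.name == INI_NAME]
    if found:
        txt = found[0]
    else:
        txt = Blender.Text.New(INI_NAME)         # INI was deleted: recreate it
        for key in defaults:
            txt.write('%s=%s\n' % (key, defaults[key]))
        try:
            txt.fakeUser = True                  # the 'F': keep the block when saving the blend
        except AttributeError:
            pass                                 # assumption: Text may not expose fakeUser
    settings = dict(defaults)
    for line in txt.asLines():
        if '=' in line:
            key, value = line.split('=', 1)
            settings[key.strip()] = value.strip()
    return settings
```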

The .py file you execute is Blendgraph_GUI_v02. The FRAME py is hooked to the frame change event and is already set up in the Blendgraph_v02.blend posted at the top of this thread.

INSTALLATION:
I actually built a pseudo installer for Windows right into the GUI file. After you launch the GUI, it will attempt to access a variable that lives in the Blendgraph_LIB.py file. If this access fails, the GUI determines that the library is not installed, writes the contents of the Blendgraph_LIB_v02 text window out to the Python scripts path, and then tries to import the Blendgraph library module. If this also fails, Blendgraph will not work and you will have to manually save the contents of the Blendgraph_LIB_v02 text window to the PYTHONSCRIPTSPATH on your machine/OS.
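Roughly, the pseudo installer does something like the sketch below. Again, this is an illustration rather than the shipped code: it assumes Blender.Get('scriptsdir') points at your scripts path and that the library lives in a text window named Blendgraph_LIB_v02.

```python
import os
import sys
import Blender

LIB_NAME = 'Blendgraph_LIB_v02'

def ensure_library():
    """Import the Blendgraph library, writing it out from the blend's text
    window first if it is not installed yet."""
    try:
        return __import__(LIB_NAME)               # already on the Python path
    except ImportError:
        pass
    found = [t for t in Blender.Text.Get() if t.name == LIB_NAME]
    if not found:
        raise RuntimeError('text window %s not found in this blend' % LIB_NAME)
    scripts_dir = Blender.Get('scriptsdir')        # Blender's Python scripts path
    path = os.path.join(scripts_dir, LIB_NAME + '.py')
    out = open(path, 'w')
    out.write('\n'.join(found[0].asLines()))       # dump the text window to disk
    out.close()
    if scripts_dir not in sys.path:
        sys.path.append(scripts_dir)
    return __import__(LIB_NAME)                    # if this fails, install manually
```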

I am new to OSes other than Windows and to scripting for multiple platforms, so please feel free to post any console error messages that are generated by this BLEND file.

THE BLEND:
The contents of the blend file are clean; I spent some time on that before the release. If you are on Windows, simply double-clicking the BLEND file should bring up the screen grab I am attaching. If you do not see the GUI, then the Blendgraph_LIB_v02.py file is not in the PYTHONPATH; correct that and re-open the blend file. While the layout may seem cluttered, there is a reason for this specific layout.

THE BUG:
As I mentioned early on, Blendgraph exposes a bug in Blender. The bug is simply this: if a plane has an image sequence or movie mapped to it, the frames will only update if the plane is selected AND the texture window is open. That is why the bottom window of the Scripting layout is divided in two.

WORKFLOW:
Open the Blend file you downloaded from the top post in this thread. If the GUI does not show up, then you need to revisit the installation notes above.

1st, assign the bg_driver PNG sequence as the texture image of the bgControl object (the pink square). Browse out to the bg_driver animated PNG sequence you downloaded and simply pick the first image in the folder. Then remember to change the image from a still to a sequence (or no layout animation will occur) and click the AutoRefresh button. The sequence is 360 frames long. For more detailed output, click the PreMul button.

2nd, rewind to frame #1. Click Delete in the GUI, then click Generate. The progress bar will move along, informing you as the Blendgraph units are generated for use. Blendgraph only creates objects one time; it uses linking and unlinking commands to bring them in and out of the scene as you scrub the timeline (see the sketch after the workflow steps below). You can review the console window for information about how many units are generated per frame.

After generation, you will have exactly X Sample Rate * Y Sample Rate objects in your scene. This blend file generates 1,024 objects with its release settings. If this is too much for your system, try smaller sample rates. I have successfully had Blendgraph update more than 800 objects per frame on my 3Gb Windows XP system. As the object count goes up, each frame takes longer to process. Mesh density will also affect refresh time.

3rd, click the next frame button in the timeline to move to frame #2. Blink! All my objects disappeared. What happened? Blendgraph has updated the frame and determined (by sampling frame #2 of the image sequence) that no objects are required in the scene for this frame.

4th, continue to scrub the timeline and observe the results as Blendgraph samples the image sequence and links objects in and out of the scene. Their overall distribution should mimic the bg_driver PNG image when observed from the top view.

5th, select the camera.

6th, scrub the timeline and you will notice that the object distribution no longer changes. Yes, you will see the LFOs affecting the existing Blendgraph units, but no linking/unlinking is occurring. THIS IS THE BLENDER BUG. Because the camera is selected, and not the plane that carries the material with the animated PNG sequence, no new image is loaded from the image sequence, so the distribution remains the same.
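As promised in step 2, here is a minimal sketch of the link/unlink idea. The unit bookkeeping here is made up for illustration; Blendgraph's internals are more involved.

```python
import Blender

def update_links(unit_names, wanted_names):
    """Link the units the current frame needs and unlink the rest,
    without ever deleting or re-creating the objects themselves."""
    scn = Blender.Scene.GetCurrent()
    in_scene = set(ob.name for ob in scn.objects)
    for name in unit_names:
        ob = Blender.Object.Get(name)
        if name in wanted_names and name not in in_scene:
            scn.objects.link(ob)                 # bring the unit into the scene
        elif name not in wanted_names and name in in_scene:
            scn.objects.unlink(ob)               # take it out; the object data stays in the blend
    Blender.Redraw()
```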

TWEAKING:
To remove the influence of an LFO, simply set its Amplitude to 0.0. The same goes for pixel mapping: set the Multiplier to 0.0.

GROUPS:
The Blendgraph units are derived from the meshes in a group. To qualify for a Blendgraph unit, an object in a group must be a mesh without any modifiers.
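In code terms, the qualification rule amounts to something like this (a sketch only, assuming the 2.4x Group and Object APIs):

```python
import Blender

def qualifying_objects(group_name):
    """Return the objects in a group that can become Blendgraph units:
    meshes with no modifiers."""
    group = Blender.Group.Get(group_name)
    good = []
    for ob in group.objects:
        if ob.getType() == 'Mesh' and len(ob.modifiers) == 0:
            good.append(ob)
    return good
```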

HERO:
The hero object is activated when the color CYAN is detected during the image pixel sampling process. If the color is detected, the hero object is used instead of the object from the group.
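The cyan test could look roughly like this; the tolerance value is just an illustration, not the one Blendgraph actually uses:

```python
def is_hero_pixel(r, g, b, tol=0.1):
    """True when the sampled pixel is close enough to pure cyan (0, 1, 1),
    which tells Blendgraph to use the hero object instead of one from the group."""
    return (r <= tol) and (g >= 1.0 - tol) and (b >= 1.0 - tol)
```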

TRACK TO:
You can make Blendgraph units track to another object in the scene. This is a one-way street, however. Once you track to an object in a scene, you cannot un-track, even with a Delete; I have not figured out how to remove constraints from an object after they have been added. A quick fix for this is to Delete all the objects in the scene, save the scene, exit, and re-open it. Then you can re-Generate and the new Blendgraph units will not have the constraint. (Remember to clear the Track-To name before you re-Generate.)

THRESHOLD:
The threshold is a way to filter the object count. If the alpha of the sampled pixel is below the threshold, that pixel will not generate a unit.

SPREAD:
Spread is a way to put space between the generated units or to crowd them together. The GUI is interactive; simply wait for the frame refresh if you have a large object count.

RGBA:
These buttons allow you to map any pixel channel to any axis of translation, rotation and scale. Mapping alpha to Z-Scale is the most useful, in my opinion, but play around with it! This is what makes Blendgraph feel like After Effects' Card Dance.

MULTIPLIER and SCALE TYPE:
The multiplier does just what its name implies: it multiplies the effect of a channel on a given axis value. Try out the different scale types.
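Put together, the channel-to-axis mapping and the multiplier boil down to something like the sketch below. The mapping format and the attribute name (SizeZ, from the 2.4x Object API) are my own illustration, and the scale types are not modeled here.

```python
# Channel indices into the sampled pixel: 0=R, 1=G, 2=B, 3=A.
def apply_pixel_mapping(ob, pixel, mappings, multiplier=1.0):
    """Write pixel channels onto object transform attributes,
    e.g. alpha onto Z scale ('SizeZ' in the 2.4x Object API)."""
    for channel, attr in mappings:
        setattr(ob, attr, pixel[channel] * multiplier)   # the multiplier scales the channel's effect

# usage sketch: map alpha (channel 3) to Z scale with a multiplier of 2.0
# apply_pixel_mapping(unit, (r, g, b, a), [(3, 'SizeZ')], 2.0)
```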

LFO:
The LFO is an automatic way to animate Blendgraph units. The LFO information is applied after the pixel mapping occurs. Frequency and amplitude are applied to an axis value. The RND function tries to add some randomness to the LFO output.
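Conceptually the LFO is just an oscillation added on top of the pixel-mapped value, something like the sketch below. The sine shape and the simple jitter are my assumptions, not necessarily what the script uses.

```python
import math
import random

def apply_lfo(base_value, frame, frequency, amplitude, use_rnd=False):
    """Add a low-frequency oscillation on top of the pixel-mapped axis value."""
    value = base_value + amplitude * math.sin(frequency * frame)
    if use_rnd:
        value += amplitude * (random.random() - 0.5)     # RND: a little jitter on the output
    return value
```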

PLAYBACK and RELOAD:
You cannot simply press the play button on the timeline and expect Blendgraph to run at full frame rate; that is a pipe dream. I have found that simple scrubbing of the timeline, even with the image plane selected, can cause Blender to lose track of which image in the sequence to load, so I find myself repeatedly clicking the RELOAD button on the image sequence. That is another reason why I have a divided window along the bottom of my interface. Even after you press the RELOAD button, you have to move to another frame before Blendgraph will update the scene.

RENDERING:
The same rules apply for rendering as for scrubbing. You must have the image-mapped plane object selected and on a visible layer before you begin rendering, or the PNG sequence will remain on the last frame it loaded. What I do to make sure it is working is watch the output of the console as I render; the console prints which frame number was loaded from the PNG sequence. If this number does not change, then you probably forgot to select the plane with the image sequence before you issued the render. Also of note: during rendering, Blendgraph always lags behind by two frames. I call this another Blender bug related to image map refresh, but the gist of it is that you need to let at least 3 or 4 frames roll by before you decide whether the image sequence is indeed incrementing.

Well, I hope this helps…

Attachments



Thanks for the info!

the new tutorial images are MUCH appreciated, as is your hard work :slight_smile:

Wow, this is pretty neat. I’ll have to play with it sometime.

Just seen your videos. Very impressive. This script is great! Keep working on it.

Great man, great script. Will try it when I can.

Thanx!

Just a bump to ask whether this is being ported to 2.5, or whether anyone has it in mind to do it… This is a great piece of code! (Thanks Atom!)

Impressive. An update to 2.5 will be great. :wink: