I need a bit of help finding a way to implement a technique in Blender that I originally developed in max.
The context is wanting to put grass on a large terrain mesh. In max I used particle flow to scatter little grass billboards over the terrain automatically. The problem I ran into (predictably) was that the amount of grass required for nice coverage, when applied to the entire terrain, was too much for a non-supercomputer.
The solution I developed in max (without getting into the gory details) was to put - and link - a large invisible sphere in front of my camera. Then I did a few tricky things to tell particle flow to only put grass billboards on the terrain where the sphere was intersecting. So grass was only spawned within a certain range of the camera (determined by the size of the sphere).
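The core of that trick is just a point-in-sphere test on the scatter points. Here's a minimal pure-Python sketch of the idea (the function name and data layout are mine, not anything from max or Blender):

```python
import math

def points_in_range(points, camera_pos, radius):
    """Keep only the scatter points that fall inside an invisible
    sphere of the given radius centred on the camera - the same
    culling job the linked sphere does in max."""
    def dist(a, b):
        return math.sqrt(sum((ax - bx) ** 2 for ax, bx in zip(a, b)))
    return [p for p in points if dist(p, camera_pos) <= radius]

# Only the points within 5 units of the camera survive:
near = points_in_range([(0, 0, 0), (10, 0, 0), (3, 4, 0)],
                       camera_pos=(0, 0, 0), radius=5.0)
```

Because the sphere is linked to the camera, re-running this test every frame is what makes the grass "follow" you around.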
Now the blender bit. First of all, the new hair stuff is UTTERLY amazing, I couldn’t express enough thanks to the guys who have made this happen. You’ve set me free man :o)
So the thing with the new particle stuff is that it also does great looking grass, but runs into the same issues max did. When I saw a page at the Blender site talking about texture-based emission of particles, I remembered a tute by theeth about doing decals (using an empty to control the placement of an image on another object's surface) and got all excited about using an empty linked to a camera to control a blend (sphere) texture on my terrain that would control the emission of my grass. Alas, texture-based emission appears to not be implemented for static particles.
So, how to make it work? Either it will be somehow making Blender build a vertex group in an object based on one of its textures, or it will be making Blender update an object's vertex group based on how much another object is intersecting it. But I don't know enough about Blender yet to figure this out - can anyone offer any suggestions?
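For the second option, the per-vertex weights themselves are simple to compute: full density at the empty's position, fading to zero at the sphere's radius, like a spherical blend texture. A rough sketch of that math (names and falloff are my assumptions - the open question is how to feed these into a Blender vertex group each frame):

```python
import math

def grass_weights(vertices, empty_pos, radius):
    """Per-vertex weight in 0..1: 1.0 at the empty's position,
    falling off linearly to 0.0 at `radius`. These weights would
    seed the vertex group that drives grass emission density."""
    weights = []
    for v in vertices:
        d = math.sqrt(sum((a - b) ** 2 for a, b in zip(v, empty_pos)))
        weights.append(max(0.0, 1.0 - d / radius))
    return weights

# Vertices at the centre, halfway out, and beyond the radius:
w = grass_weights([(0, 0, 0), (5, 0, 0), (20, 0, 0)],
                  empty_pos=(0, 0, 0), radius=10.0)
```

Any smooth falloff would do instead of the linear one; the point is just that the weight tracks distance from the camera-linked empty.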
P.S. If someone can help me figure this out, I’ll be more than happy to do a big tutorial showing everyone how to do it.