The New Particles … let's figure them out

For a really good look to your hair/fur, and to keep it looking right whether it's close to or far from the camera, in the Strands material popup (a scripted version of these settings is sketched after the list):

  • enable Use Blender Units
  • choose an appropriately small size (something like 0.02)
  • set the minimum to the smallest size you’ve set for start/end
  • for extra fuzzy softness, double that minimum value so it’s bigger than the start/end size; that gives an “always a little blurry” fur.
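For anyone who’d rather script this than click through the panels, here’s a rough sketch. It assumes a 2.7x-style bpy build where Blender Internal materials expose a strand block (the panel names in this thread are from the older UI), and the material name is just a placeholder:

    import bpy

    # New material for the fur; "FurMaterial" is a placeholder name.
    mat = bpy.data.materials.new("FurMaterial")
    strand = mat.strand

    strand.use_blender_units = True  # the "Use Blender Units" toggle
    strand.root_size = 0.02          # start (root) thickness
    strand.tip_size = 0.02           # end (tip) thickness
    strand.size_min = 0.04           # minimum size; doubling it past the
                                     # root/tip size gives the "always a
                                     # little blurry" fuzzy look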

I’m struggling with the colour of the hairs right now. What are the options? First I had a yellow material, and all the hairs came up yellow. Hurray! Then I textured the eye of my chick model and used the TexFace button to make it visible. It seems the hair rendering now uses the TexFace colour space, which in my case meant the hairs came up grey instead of yellow (although the material’s colour is still yellow). Also, when I use vertex painting, the hairs don’t take on the colour of the vertex paint. In fact, I need my emitter model pink, the hairs yellow, and the eye textured. How do I accomplish this? And how does the material widget in the particle settings work? It shows a particle number? Why can’t I just select one of my named materials?

questions questions questions …

Paleajed: DO NOT USE TEXFACE. TexFace is really for the game engine. Map using texture channels instead.

Papasmurf: True, but is there a feature or script that converts UV-unwrap texturing into texture channels? This is a part of Blender I never fully understood.
If I’m correct, I unwrap and texture in the UV editor, give a name to the UVTex layer, then set up a texture channel with mapping set to UV using that named UVTex, and then load in the picture I used as the texture again. Is this correct?
Why isn’t there a feature that takes all the settings I made during unwrapping and kind of bakes them into a texture channel?

Really never understood, never understood.

Getting a bit off topic (hair). I do apologize.

What I also don’t understand is why I can’t go back to the UV editor, select a texture picture, and get the UV-mapped faces that use that picture displayed again (or is there a way?).
Also, suppose I load a new picture in the UV editor to replace the old one; it should update the image in the texture channel too. Doesn’t that seem logical? (It does to me.)

Please someone, recode.

Or please, paleajed, do the official tutorial on UV mapping and using the tools correctly.

http://wiki.blender.org/index.php/Manual/Unwrapping_a_Mesh

You’ve been using the “quick way.” Now learn the “proper way,” and for goodness sake don’t ask for a recode when you don’t even know how to use the tool.

Can y’all try this blend out and see if it reproduces this behaviour: select the cube, under ‘Visualization’ set it to Object, and then hit Alt-A. On my PC, the particles start wigging out and flying all over the place. Now set it to ‘Point’ and everything is nice and orderly.

If so, is this a bug?

@paleajed: in the MapInput panel, click UV and enter the name of the texture in the field. Then load your texture image.
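If it helps, the same channel setup can also be done in Python. This is only a sketch against a 2.7x-style bpy build; the texture name, image path, and the "UVTex" layer name are placeholders, so use whatever you named yours:

    import bpy

    obj = bpy.context.object
    mat = obj.active_material

    # Image texture; name and path are placeholders.
    tex = bpy.data.textures.new("EyeTexture", type='IMAGE')
    tex.image = bpy.data.images.load("//textures/eye.png")

    # New texture channel, mapped by the named UV layer.
    slot = mat.texture_slots.add()
    slot.texture = tex
    slot.texture_coords = 'UV'          # the "UV" button in Map Input
    slot.uv_layer = "UVTex"             # must match the UV layer you unwrapped to
    slot.use_map_color_diffuse = True   # affect diffuse colour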

Definitely something wrong with that boids blend. But I did not get things looking nice and orderly by setting your simulation to Point; it’s still screwed up.

I tried to set this system up from scratch but it seemed to be working okay when I did that. How did you get into this weird state?

Thanks for testing.

I’m using an SVN Build from graphicall.

The steps were: create a UVSphere, create a boids particle system on the default cube, set some parameters like maximum velocity and lifespan, run the simulation, change Visualization to Object, set the object to the previously created sphere, and run the simulation again. Craziness ensues. I find that in general, when trying to parent objects to boid particles, it works for one, but when you start adding others it gets screwy.
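For reference, here’s roughly what those steps look like scripted. This is only a sketch against a 2.7x-style bpy build (the SVN builds in this thread are older, so names may differ), and the lifespan/velocity values are just examples:

    import bpy

    # Sphere that will be instanced on each boid.
    bpy.ops.mesh.primitive_uv_sphere_add(location=(0.0, 0.0, 0.0))
    sphere = bpy.context.object

    # Boids system on the default cube.
    cube = bpy.data.objects["Cube"]
    psys_mod = cube.modifiers.new("Boids", type='PARTICLE_SYSTEM')
    settings = psys_mod.particle_system.settings

    settings.physics_type = 'BOIDS'
    settings.lifetime = 100                # example lifespan
    settings.boids.air_speed_max = 5.0     # example maximum velocity
    settings.render_type = 'OBJECT'        # Visualization: Object
    settings.dupli_object = sphere         # instance the sphere on each particle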

I’m using SVN rev 1369M on Linux.

Can you post the blend you used that works? Did you use boids or newtonian?

Here you go. I did this also with a graphicall build. Fairly recent but not sure when.

http://uploader.polorix.net//files/421/boidstest.blend

I would normally… uh… hang on.

I was going to say that I would normally use Emit From Random, but I just set it to Emit From Random and it suddenly got weird on me.

Say, try using an Icosphere as your emitter. Do you have the same problems? I’ve always used an Icosphere as a boids emitter and never had a problem. It seems to be working okay now, too.

elam:

Select the Sphere and clear its location [Alt] + [g].

ah, thanks Jarell!

For what it’s worth, elam’s file still had problems for me with the sphere’s location cleared. I think it’s a caching issue, but I’m not sure.

Is there a way to disable particle caching temporarily? In some cases it’s a bit annoying. For instance, whenever you change or shift a force field around, you have to hunt for the particle panel to clear the cache. Fine-tuning of particle paths is hard work this way.
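The closest thing to a quick reset I know of is scripting it: newer builds expose a point-cache operator in Python that frees every bake in one go, so a tiny snippet in the text editor saves the panel hunt. A sketch, assuming the bpy operator API:

    import bpy

    # Frees all baked point caches in the scene (particles, soft body, cloth).
    bpy.ops.ptcache.free_bake_all()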

I agree with Kai; there are still a lot of bugs IMO with the caching. I’m not a fan of it, and I don’t see it as a benefit.

I wish there were more discussions and development on using particles as particles, not just hair/fur (though I know that’s the focus because of Peach). I tried doing a full project with the new particles for a client, but I had to revert to the old system due to Cache problems (as well as deflector problems).

Feature request: Make “Halo” materials raytraceable!!! It would be sooo handy if they could use RayTransp with IOR values, as well as show up properly when combined with other raytraced materials. I shouldn’t have to mimic the halo with a tiny little cube/sphere to get my effects.

I wish there were more hair and fuzzy things in the oilfield and engineering/manufacturing sectors!!

I’ve found Janne extremely cooperative, and he reacts quickly to issues with the particle system. So go ahead and post your problems to the bug tracker.

And yes, there are very many issues with the new particle system, and I wouldn’t recommend anybody use it for paid work with a deadline. But the more people use it, the more errors are found and the better it will get.

And everybody is invited to read and comment on the particles documentation:
http://wiki.blender.org/index.php/Sandbox

I am a bit torn in my opinion on the point cache.
You might want to look at this
http://mosebjorn.altervista.org/more/
for my workaround to get a ‘global cache reset’ - well, at least it works for soft bodies.
BM

@SoylentGreen: feel free to use that example for whatever you want.

Alright.

So I take a plane and UV map it, right?

Then I add an image onto that plane, like a poster, alright?

Then I subdivide that plane and add an Explode modifier, ya follow?

I do the vertex-group stuff and fiddle with the settings till I get the plane shattering how I want, still listening?

Then I render it…

In the 3D viewport everything goes as planned, mkay?

But in the renders, as the animation goes on and the pieces of the plane turn, the image stays almost the same.

Some of it rotates and some of it acts like magical glass, you see?

What I’m saying is: the exploded parts simply don’t show the texture right.

Like if I tried to use it to make a picture of someone shatter, it would not do it right.

Anyone know if this is a glitch, or is the Explode modifier just not that in-depth?
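In case someone wants to try to reproduce it, here’s roughly the setup as a script. Only a sketch, assuming a 2.7x-style bpy build, with the cut count and particle count as arbitrary examples (the image/TexFace assignment is left out):

    import bpy

    # UV-mapped, subdivided plane.
    bpy.ops.mesh.primitive_plane_add()
    plane = bpy.context.object
    bpy.ops.object.mode_set(mode='EDIT')
    bpy.ops.mesh.subdivide(number_cuts=20)
    bpy.ops.uv.unwrap()
    bpy.ops.object.mode_set(mode='OBJECT')

    # Particle system that drives the explosion.
    psys_mod = plane.modifiers.new("Shatter", type='PARTICLE_SYSTEM')
    psys_mod.particle_system.settings.count = 200

    # Explode modifier must come after the particle system in the stack.
    explode = plane.modifiers.new("Explode", type='EXPLODE')
    explode.use_edge_cut = True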

Currently I’m trying to apply soft-body settings to a parent-only hair system so it will have some movement during animation, but there seems to be an issue with using the soft-body Goal option. With a soft-body mesh, weight painting determines the influence of a parent bone on the soft body, so parts of the mesh can be made to follow the parent exactly while others act as fully soft-body.

There doesn’t seem to be a corresponding means to weight a hair system’s parent strands. Use Goal is necessary to animate a soft body (mesh or particle strands) with an armature, but the goal weighting is uniform and seems to be based on the minimum value. This makes the soft-body hair system react either completely stiffly (goal weight = 1.0), or become more and more disconnected from the armature movement as the weight is reduced.
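For comparison, this is the mesh case described above done in Python (a sketch assuming 2.7x-style bpy names; "GoalWeights" is a placeholder vertex-group name). Per-vertex goal strength comes from a weight-painted group, which is exactly the hook that seems to be missing for parent strands:

    import bpy

    obj = bpy.context.object
    sb = obj.modifiers.new("Softbody", type='SOFT_BODY').settings

    sb.use_goal = True
    sb.vertex_group_goal = "GoalWeights"   # weight paint controls per-vertex goal
    sb.goal_spring = 0.5                   # stiffness pulling toward the goal
    sb.goal_friction = 5.0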

Is there a way to assign variable soft-body Goal weights along the length of the strands?