Blender's Particle System Is Terrible

Thank you for a response that wasn’t immediately defensive. The community so far gives me this contradictory attitude of “all of our ideas count, but don’t criticize!”
I’ve read about the particle nodes feature that was in development, then mysteriously died over a year ago. It’s things like that I do not understand: why are they not prioritized when they so desperately need to be? If I could describe Blender’s development philosophy in a single phrase, it would be putting the cart before the horse.
However, I read the recent “future of Blender” post, which suggests particle nodes may still be on their way a few versions from now, and I cannot wait for it. I mean that half literally.

I’ve scoured these forums and the internet to no avail. And how do you do any of that?

I’ve never even heard of it, but after seeing this (and yes the updated version) it only marginally solves these problems, and has made the standard workflow even messier and more tedious.

Thanks, but this is for Blender 2.4, which has useful features shown here that didn’t even make it to 2.5, and still is nowhere near a solution.

You mean I can donate money to the dev team for these specific features to happen? Where, how, and what amount?

This isn’t a rhetorical question. I want to know. You said nothing will happen unless I put some real money forward, right? Well now is your chance to back up your criticism. What do I do to make these features happen?

Email Ton or Brecht and try to get a conversation started and see what they say - but I do think that Psy-Fi will be working on particles for the movie project.

Good to know. Is there anywhere this is said or documented? I’d like to read about it.

As far as Psy-Fi’s workload, it was mentioned in the Gooseberry notes in the Sunday meeting reports a few Sundays ago. Since they will be working on an animated character with particle hair, as well as other vfx particles, we might see something that helps your desired workflow - but it won’t hurt to talk to some of these devs to find out.

I completely understand the OP’s frustration. Did quite a bit of fighting with particles today. Hope to see the whole thing switch over to a node system.

Of course what I want more than anything is truly nested particle systems: The ability to emit objects that have their own emitters. All without static workarounds, excessive geometry, etc.

One of the things that floors me with Blender’s current system is how it’s not possible to create the ‘hello world’ of particle system effects, a simple firework (is Blender the only software available for the PC right now that can’t do this?). It’s good for covering surfaces with instances, making swarms of objects, making streams of objects, making emission effects that don’t require multiple stages like steam and wispy smoke, and a bit in the way of hair effects, but there’s not too much beyond that really.

On top of that, for the past few years, no one in the dev scene has really shown any sort of interest in it outside of Lukas Tonne (Jahka, the old particle system developer, raised the white flag years ago when he found out just how much work would be needed to get the code to a modern and clean state). If Atom is right and the foundation does employ Lukas to develop the advanced particle node functionality needed for Gooseberry, then we should start seeing some major advances in master later this year. Until then, you’ll find that even a number of free and cheap game engines have better particle systems for various things (not to mention that even Blender 2.49 has particle functionality that never came back).

I know, you might be wondering where everyone’s priorities are, but the idea of having a lot of powerful options and few, if any, basic ones to connect everything together is something almost exclusive to FOSS, though in Blender it’s tempered a bit thanks to having a core team on paid contracts (which means you actually have the power to put them on code maintenance, bug fixing, basic functions, and other boring stuff).

Yeah. Particles ftw, only thing that is important in VFX and 3D, why did the devs drop the ball? Oh, yeah, because they have been working on everything else all this time. It ain’t gonna fix itself, and the devs aren’t exactly sitting on their hands waiting for stuff to do. 2.70 is almost here, and they have plenty of Gooseberry targets and regular targets for 2.71, 2.72, etc. so I don’t get the gimme attitude that seems to be running in some of the threads here.

This tutorial seems to do this. I haven’t gone through it all, but it mentions particles that emit other emitters.

Truth.

Game engines from 10 years ago had better particle control. Seems there are basic things that should go into any particle system design that I cannot do in Blender without strange, convoluted workarounds.

Sometimes, I wonder if the lack of attention is because of the lack of use or the lack of use is because of the lack of attention.

Hopefully Gooseberry will be the catalyst for improvement (total redesign).

I have a question on the particle system myself. Whenever I watch a tutorial on fire and smoke … they use Blender Render. They then go on using non-surface (volume) materials, etc. I find the system to be fairly confusing, with setup being done on the domain cube, the particles, the material…

But anyway: I use Octane Render. That works fine with cloth & fluid simulation … Blender only renders meshes, and I can render each frame normally with any render engine.

Not so for fire and smoke, it seems? Am I stuck with Blender Render for this, and the physics engine cannot make use of 3rd-party renderers here?

Yeah, particles are definitely one of Blender’s Achilles’ heels. Far as I’m concerned, Softimage ICE and Houdini are the only two worth anything. Maya has a lot of control as well, but that’s with a lot of expression work behind it. Honestly, I’d be happy with Blender’s particles if it even had the expression editor.

Here’s hoping to some more movement with particles!

Blender is an excellent choice for small studios, even with its flaws. But if a small studio one day grows bigger and cannot afford to live with Blender’s flaws, it would be wise to just spend the money: either hire developers to fix the flaws or buy full-featured software.

Has anyone tried RE:ticular with 2.70? I’m going to give it a shot this weekend and it’d be nice to know if there’s any serious trouble I should expect…

Well… sadly I agree that particles are awful and I have spent SOOO much time trying all those workarounds… uuh… something really should be done about this. Particle nodes and so on… really needed.

Freelancers need those things just as badly.

Wasn’t there some rumor or suggestion that somebody was working on node-based particles? I think that’ll be the thing we have to wait for. Whether or not that comes as a development from the Gooseberry project, who knows? But node-generated behavior is some pretty cool stuff - it would be great to see it applied to particles.

Many valid points in the original post (despite the attitude), but I’m not sure if he realized some properties for particles can be driven by texture/material data: fade in/out with a particle-time texture input and gradients, etc. Not readily obvious though, and they do take a bit of digging to get at.
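
For anyone digging, the setup in the 2.7x Python API looks roughly like this. This is only a sketch from memory, so double-check the property names against your build; the object name “Emitter” is just a placeholder:

```python
import bpy

# Placeholder object name; adjust to your scene.
obj = bpy.data.objects["Emitter"]
psettings = obj.particle_systems[0].settings

# A blend texture with a colour ramp, running black -> white over the particle's lifetime.
tex = bpy.data.textures.new("ParticleFade", type='BLEND')
tex.use_color_ramp = True

slot = psettings.texture_slots.add()
slot.texture = tex
slot.texture_coords = 'STRAND'   # "Strand / Particle" mapping: particle time runs along the ramp
slot.use_map_size = True         # let the ramp drive particle size over its lifetime
slot.use_map_density = True      # and/or fade particles out via density
```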

Freelancers need those things even more so. A large team can handle inconveniences in a workflow. Individuals cannot afford issues like this.

I mentioned the particle nodes system that quietly disappeared nearly two years ago (which SHOULD be a priority development). And yes, I am aware of using ramps for color and alpha over time, but not only is it unintuitively implemented, it doesn’t even scratch the surface of what is needed for particle animation.

The mailing list had an indication of a particle refactor branch being worked on by Lukas, so he seems to be making progress towards getting Blender a much better particle system (and create a better justification for this forum to exist).

@Mr_Lange I couldn’t agree more with your points. Personally, I believe Blender needs two small improvements with big consequences that would increase its usability and user base dramatically:

  1. An ‘Age’ driver variable, which would give the time in frames since the instance/object became visible (hide_render = False), so it could be used in size/color/transparency/etc. calculations. For example, when an object is instanced by the particle system, the formula would be evaluated per instance. This would be useful not only for particles, but certainly for them too.

  2. Text-to-texture. Why isn’t it possible to just add text to a texture without all those (scary) baking voodoo dances? But that is just a side note.

Now back to particles. I was trying all sorts of things in Python to make a Size-over-Age feature, and I got very interesting results… to say the least.

In Python I can change particle size during the frame-change phase and/or before the render phase: I can get a particle’s age and lifetime, divide one by the other, and use the result as a direct multiplier (or via a gray gradient color) to map the age to the size. That’s what I did in Python. And when I started the animation I saw this:
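
Roughly, the idea boils down to something like this (a simplified sketch, not the exact script; “Emitter” is a placeholder object name, and it assumes the 2.7x frame_change_post handler signature and that particle.size is writable from a handler):

```python
import bpy

def scale_particles_by_age(scene):
    """Map each particle's age/lifetime ratio onto its size."""
    obj = bpy.data.objects.get("Emitter")        # placeholder object name
    if obj is None or not obj.particle_systems:
        return
    psys = obj.particle_systems[0]
    frame = scene.frame_current
    for p in psys.particles:
        if p.lifetime <= 0.0:
            continue
        age = frame - p.birth_time               # frames since the particle was born
        factor = min(max(age / p.lifetime, 0.0), 1.0)
        p.size = 0.05 + factor * 0.2             # grow from small to an arbitrary maximum

# Re-evaluate on every frame change so the viewport / OpenGL render picks it up.
bpy.app.handlers.frame_change_post.append(scale_particles_by_age)
```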

I was nearly jumping out of my shoes! But… when I rendered it in BI I got this:

Cycles maybe? Nope, no chance.

So, no hope? But… wait, what about OpenGL render?

A-ha! Yafaray?..

Cool! Mitsuba:

The conclusion is obvious: the internal engines parse the primitives in a DIFFERENT way compared to the exporters. So, I believe any external renderer will (most likely) work correctly, but not the internal ones.

I don’t know what exactly they are doing right before firing up BI or Cycles (maybe some sort of extra pass or “optimization”), but please, guys, if you are reading this: change it, remove that extra pass, and at least for now we would have a temporary solution. It would cool things down and give all of us some peace of mind. It would also be more logical for all renderers to behave the same way anyway.

Thanks

P.S. I’m in the middle of finalizing the script, so it will be provided in the form of an add-on.