Let's talk about Mantaflow

Mantaflow in its current state is simply nowhere near usable for production work, and probably unusable for any kind of hobby work either. Messing with it is mostly a waste of time. It makes much more sense to spend 25 bucks on a monthly EmberGen subscription and make all your simulations there.

I’d go as far as to argue that Mantaflow, in its current state, should not be part of Blender at all, because it misleads users into thinking Blender actually has a solution for fluid simulation. Blender has fluid simulation only on paper, not in reality.


Does EmberGen work for liquids too? What other standalone program would be a good solution for liquid sims?

I’ll give the FLIP Fluids addon another try, though when I bought it in 2019 it was also very frustrating to work with :unamused:

I think at this point, with OpenVDB support, it's more productive to look at external tools that are specialized in simulation and make better use of your hardware.

An example would be Storm. There is a free demo version (no idea about its limits). The indie license is 149 bucks. I have no experience with it myself, but with Blender having better I/O, I think it's definitely a good road to explore.

https://effectivetds.com/resources/fx-tools/storm/


Thanks, I’ll take a look at it :nerd_face:

That is not what I said. I said that you could upscale the mesh resolution to facilitate its smoothing, absolutely not to add detail.
What I meant is that you can obtain decent fluid dynamics at a resolution of 64, 96, or 128, and generate the mesh at a higher resolution to obtain smooth little drops instead of big, sharp-edged ones.
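
In script form, that split looks roughly like this (a minimal sketch; the object name "Domain" and the modifier name "Fluid" are assumptions about your scene, and the properties are the usual FluidDomainSettings ones):

```python
import bpy

# Hypothetical object/modifier names; adjust to your scene.
settings = bpy.data.objects["Domain"].modifiers["Fluid"].domain_settings

# Keep the simulation itself affordable: the dynamics run on this grid.
settings.resolution_max = 96   # 64-128 is usually enough for decent motion

# Generate the surface mesh at a higher resolution so small drops come out
# smooth instead of as big sharp-edged blobs. This only refines the surface;
# it does not add detail to the underlying simulation.
settings.use_mesh = True
settings.mesh_scale = 2        # mesh upres factor: 96 * 2 = 192 effective mesh grid
```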

I don't expect you to get a decent result at 32. I think it is hard to obtain something satisfying for liquids under 48. The default resolution was increased to 64 in 2.82, when Mantaflow was introduced.
In 2.79, the preview resolution for the old fluid simulation was 45 and the final resolution was 65.
But after people complained about speed, comparing resolutions between the old smoke system and Mantaflow, the default was changed to 32. Which is a bad default for liquids.

The old simulation resolutions (preview and final) were not really meant to give a preview faithful to the final result. Both simulations were computed at the same time, so when baking was finished, both were available. The role of the preview simulation was just to provide a mesh animation that was lighter to display.

That is why the old idea of a preview resolution was, and still is, irrelevant.
If you want a preview, just think about displaying only the particles.

You should deal with only one base resolution for both the preview and the end result.
I would like obtaining pertinent values for particle radius, randomness, maximum, minimum, and narrow band width to be automated according to the resolution setting.
We don't have such problems for smoke.
But I am not sure if sebbas knows how to do that.
He gave up on adding a Real World Size setting to adapt the Diffusion settings.
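
Nothing like that exists today, but just to illustrate the idea, an add-on could derive those values from the resolution with a simple heuristic. The formula and numbers below are purely hypothetical; only the property names come from FluidDomainSettings:

```python
import bpy

def apply_liquid_defaults(settings, resolution):
    """Hypothetical heuristic: derive liquid particle settings from resolution."""
    settings.resolution_max = resolution
    # Finer grids can afford a slightly smaller radius before leaking.
    settings.particle_radius = 1.0 + 32.0 / resolution  # e.g. 1.5 at 64, 1.25 at 128
    settings.particle_randomness = 0.1
    settings.particle_minimum = 8
    settings.particle_maximum = 16
    settings.particle_band_width = 3.0

settings = bpy.data.objects["Domain"].modifiers["Fluid"].domain_settings
apply_liquid_defaults(settings, 96)
```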

Mantaflow is not RealFlow. It simply does not work the same way.
With Mantaflow inside Blender, we are currently limited to one narrow-band FLIP method, whereas RealFlow offers lots of particle solvers that can be closer to the old particle system's SPH fluid type.

The idea was to stick with a term that was familiar and already used for smoke before Mantaflow.
I am not sure if "Grid Size" would evoke divisions of the grid as it should.
Increasing that setting does not scale the domain. It really subdivides it into more, smaller cells.
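
Concretely, the value is the number of cells along the domain's longest side, so raising it shrinks the cell size. A quick back-of-the-envelope check (the "longest side" interpretation is my understanding of the tooltip):

```python
# Resolution Divisions ~ number of cells along the longest domain axis.
# For a domain that is 2 m on its longest side:
domain_size = 2.0
for divisions in (32, 64, 128):
    cell_size = domain_size / divisions
    print(divisions, "divisions ->", round(cell_size * 100, 2), "cm cells")
# 32 divisions -> 6.25 cm cells
# 64 divisions -> 3.12 cm cells
# 128 divisions -> 1.56 cm cells
```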

If you don't generate the mesh at the same time, but only bake particles during your tests, baking will just take dozens of minutes or a few hours instead of a dozen hours.
When the dynamics are validated and the particles are baked, you can try only the first frames of the simulation to figure out correct mesh settings. And when you are OK with them, you can bake the mesh data during the night.
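
In script form, that two-stage workflow looks something like this (a sketch; it assumes the domain is the active object, uses the standard fluid bake operators, and needs the cache set to Modular so data and mesh bake separately):

```python
import bpy

domain_obj = bpy.data.objects["Domain"]  # hypothetical name
settings = domain_obj.modifiers["Fluid"].domain_settings
settings.cache_type = 'MODULAR'          # lets data and mesh be baked as separate steps

bpy.context.view_layer.objects.active = domain_obj

# Stage 1: bake only the simulation data (the FLIP particles) to validate the dynamics.
bpy.ops.fluid.bake_data()

# ... inspect the particles, tweak mesh settings on the first frames ...

# Stage 2: once the motion is right, bake the (much slower) surface mesh, e.g. overnight.
bpy.ops.fluid.bake_mesh()
```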

That is far from being as easy and fast as a RealFlow workflow. But the baking process is faster, for better quality, than what was available in Blender before Mantaflow.
At its beginning, Mantaflow was supposed to be accelerated by CUDA, but it quickly stopped supporting CUDA.
In the early state of the Blender branch, you were able to export a Mantaflow script that could be analyzed by a simulation farm.
I have hope that once sebbas solves the basic behavior bugs, he can work on better performance.
Many programs accelerate FLIP or Eulerian simulations through the GPU.

That would be a different problem for smoke.
The Mantaflow smoke sim is supposed to be close to the old smoke simulation (added in 2.5, 10 years ago).
Mantaflow noise is supposed to look better because it produces a more coherent distortion of the flow. If the noise scale is small, the result should stay close to the overall shape without noise.
But it is true that if the noise scale is high, the distortion may be significant.

I think that the best practice to keep the result predictable is to use a particle system as a smoke emitter and/or a guide for liquids and smoke. But that is more work, too.

I will not argue that other, better solutions exist. That is obvious.
But Mantaflow is better than the old Blender liquid simulation, which had been present since 2.40, 14 years ago.
When the Volume Object can be used as a domain, the workflow should become more natural, and OpenVDB grids easier to manage.
Sebbas tried to do something that was familiar to old users. It looks like it did not work.
Anyway, an evolution of the Mantaflow workflow is scheduled.

Right, it doesn't scale the domain, I never said that. But it creates more liquid inside the same space, which is counter-intuitive and not what a user would expect when changing the resolution.
Just try it out with a simple scene of some liquid filling a bowl and you'll see what I mean. The amount of liquid changes drastically when you change resolutions.

Anyway, I won't argue the technical details, because you clearly understand this better than me. But so far what I've seen is that the Mantaflow implementation is not meant to be used in production. It may be working fine and as expected from a development point of view, but it's absolutely not an artist-friendly tool.

When you're doing this for work and deadlines are tight, you need predictable and fast results, and that's currently not possible with Mantaflow.
Especially if you're a freelancer, as in my case, where I'm expected to deliver a finished result from A to Z and there are no departments with specialists at every step of the process. And Blender is supposed to be a tool aimed mainly at freelancers and small studios… go figure :man_facepalming: :man_facepalming:

OK, I tried the quick liquid preset on the default cube at 32 and 64. I think I get it.
You are talking about collisions with the domain increasing the volume of the liquid.
That is an issue that can be eliminated by disabling the Adaptive Time Steps option.
This option is supposed to provide a faster bake of the simulation by skipping substeps, and it sometimes gives better results with constant inflows leaving the domain.
Again, a default that may be pertinent for smoke but is not for liquids.
The counterpart is that it increases simulation instability and, clearly, raises the liquid volume at each collision with the domain or a collider.
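
For reference, the option lives on the domain settings; if my reading of the API is right, turning it off from Python is just:

```python
import bpy

settings = bpy.data.objects["Domain"].modifiers["Fluid"].domain_settings
# "Use Adaptive Time Steps" in the UI; disabling it forces a fixed substep count,
# trading bake speed for stability (and, here, for volume consistency).
settings.use_adaptive_timesteps = False
settings.timesteps_min = 4  # pin min and max to the same value
settings.timesteps_max = 4
```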

Yes, clearly, I agree. You have to play a lot with Mantaflow to have a chance of obtaining the desired behavior. And it is far from being a fast workflow, anyway.

Unfortunately, it happens whether this option is enabled or disabled. There's still a very big difference in the amount of fluid being generated. Even worse, with adaptive steps enabled, the fluid starts to disappear after a while.

Mantaflow may pale in comparison to what you find in Houdini, but it is usable enough to create decent results for still images and simple animations. It is definitely a lot better than Elbeem was, at least. It should also be noted that improvements will take time, as devs who can work with these libraries don't come around often.

To believe no one should be allowed to use a feature because it has not met your personal needs is a very dangerous sentiment to have. If the devs listened to that, then we would have to purchase Maya, because the BF would remove features until nothing was left.

I am serious about that last part; there are some out there who still believe Blender is one of the worst things to ever hit a desktop (i.e. not only a toy, but a broken one that can't make anything).

Actually, I didn't know that. I jumped to Blender with the 2.8 release.

I wasn't referring to smoke noise per se, but I get what you are saying. The thing is, this is not laid out clearly in the manual, not to mention the lack of any visual examples. Without good documentation, and with a poorly named UI, knowing exactly what each parameter does means shooting in the dark until, by trial and error, the user learns how the entire system behaves. This is extremely frustrating for a user and shows inefficient, if not stupid, design choices (sorry if that sounds harsh). There is no predictable workflow out of the box. I don't mind one function changing two things, but in those cases the description and good, clear examples should be provided in the user manual.

If I have to change the noise to adjust the resolution of the smoke, that is a clear problem with the design. Yes, I tried setting the noise to a very small value, and it does in fact do what you are saying. But in that example, when I'm using noise to de facto set the resolution, how am I supposed to distort the smoke and give it some visible noise? This is the problem with one setting changing two things at the same time.

I didn't try that approach. I was afraid it would make an already complex scene heavier, to the point where it wouldn't work at all.

That is very good to hear. If there are things I can voice my support for, real-life workflow research, fixing existing bugs, and good documentation with visual examples would be the things to take care of first. Even before performance.


If you're willing to waste literally hours trying out settings until something randomly works, lol. The client job I'm doing now, which brought me here, is a series of still images for advertising, nothing you would consider complex at all. And yet I still haven't been able to get the results I need, just because Mantaflow doesn't give me a way to visualize things quickly at a lower res and then get the same shape and behavior at high res.

Even more dangerous is to believe a tool can be used for actual work, only to find out later (and with the deadline much closer) that it can't be used at all.

Blender is my main go-to tool for pretty much 90% of the work I do; I love it.
But some things shouldn't be added to master if they're not ready. Just as it was decided that the Asset Manager isn't ready yet, it should have been the same with Mantaflow. That's not an opinion; it's just not ready.


No, it's not usable. It's not just about the output quality (which is still abysmal), but also about the workflow, which is completely broken due to many unaddressed bugs. The amount of effort, time, and carefulness to prevent errors that one has to put into Mantaflow to produce even mediocre results just can't be justified in any way.

I am not saying no one should be able to use it. I am saying it should have stayed a separate branch, and the old system, which had worse quality but a much more reliable workflow, should have stayed in place until Mantaflow proved to be usable. Whoever considered Mantaflow to be in a good enough state to be merged into the main branch and replace the old simulator should never be allowed to make any kind of decision again.

EDIT: To expand on what @julperado said: Blender has recently been marketed as software that can suit commercial production; sometimes they go as far as saying it's a viable replacement for established commercial 3D software. With claims at this level (which are mostly justified), it's just unacceptable to ever ship features in a state like this one.

Many professionals can end up in a seriously bad situation where they expected Blender to cover their production needs, including the "fluid simulation" checkbox ticked on Blender's theoretical, on-paper specs, only to find out it's completely unusable when they actually try to get a job done with it.

I am not talking about removing features, but about having at least SOME quality bar for their inclusion in the master branch. And the fact that the bar does not seem to be there is one of the biggest arguments that people who think Blender is one of the worst things to ever hit a desktop can put up against it.


Well, the snail pace of the workflow can get in the way if you have a deadline to meet (so I agree with that), but not everyone in this community is beholden to deadlines or commissions 24/7.

Still though, there are features in Blender that you do not think you will ever need, because of the state they are in, until you need them. I would definitely choose Mantaflow over trying to make 'fluid' through metaballs and particles, or even through hand modeling (at least for anything more than the simplest drip visuals).

Even then, the BF is at least trying to improve the simulator. The debugging tools, for instance, give better insight into what the issues are, and the APIC simulation resolves issues with certain flow effects looking 'lumpy' (not to mention the commit that greatly reduces visual issues in highly viscous flows).

*gulp* I wanna give it a try for real :joy:

This is the exact problem. There are some features people don't need at a given moment, but they know they are there in case they need them. It's then all the more devastating when the time comes that you need them, and you realize they are not in a usable state, while you had relied on them being there, ready for you.

It has happened to me multiple times already: I knew Blender had a certain feature on paper, which I had never used but hoped I could count on. And many times I broke my teeth on the realization that the feature was in an unusable state. I then had to buy some third-party software to handle the task, and had critical delays in the jobs I was doing.

Sebbas was facing a problem: not enough people were really testing his branch and reporting bugs or giving feedback on the UI.
That could be because the only sources of info were the wiki pages of the GSoCs.
So sebbas made a request for integration into master to get more testers.
A dozen patches were reviewed. The merge happened.

I thought it would stay only in master, get tested, and be released in 2.84.
So I approved the integration into master.
And when I heard, one month later, that it would be a target for 2.82, I immediately regretted it.
I warned developers, here on devtalk and in the chat, that they should not release it in 2.82.
And at the same time, I tried to help people who were lost because of the lack of documentation.

When the increase in testers came with the integration into master, I was not expecting so many people to be confused by the UI. I was not confused, because I had been following the branch for years.
I helped sebbas make the UI look like the old simulations' UI.
But I was not annoyed by the lack of documentation, because I was expecting months of testing and feedback that would change and improve it.

I think that although UDIM was released in 2.82, they were lacking spectacular features to justify their stupid policy of a 3-month release cycle.

So I would not disagree with you. The experimental-tab workflow was created, and they proposed making 2.83 the first LTS, after this incident.
At least lots of Mantaflow bugs were identified.

The current situation is not ideal, but it is not dramatic.
Yes, the people in charge of documentation have so many things to do that the Mantaflow part is still incomplete.
But anyway, for this kind of complex and long work, I think video tutorials are a better help.
Yes, Mantaflow has dozens of bugs. But for simple stuff, and even some complex stuff, it can deliver great results.
And it is still possible to use a 2.81 Blender release to create a simulation, cache it as an OpenVDB file, and import it into 2.92 as a Volume Object.
Without the Volume Object, the situation would be really bad.
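
For anyone wanting to try that route, loading the cached VDBs on the 2.92 side is a single operator (a sketch; the path is a made-up example, and as far as I know the importer detects frame sequences on its own):

```python
import bpy

# Point it at one file of the baked OpenVDB sequence; Blender creates a
# Volume object and should pick up the rest of the frames as a sequence.
bpy.ops.object.volume_import(filepath="/tmp/fluid_cache/fluid_data_0001.vdb")
```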

Blender is a successful open-source project that owes part of its success to volunteers.
But sometimes, clearly, Ton expects too much from them and from his employees.
As if they could magically change the laws of physics and compress workload in time.
I don't know why developers have this stupid idea that a 3-month release cycle works well.
It has not worked at all, for decades. They are constantly postponing features (I don't remember if the record holder is the asset browser or particle nodes), releasing unpolished or buggy new features (Extrude Manifold or the Add Object tool), making corrective releases (they will release a 2.91.1), and blaming users for a lack of testing (of undocumented things, without demo files, in a ridiculously short period of time).

They have a Bcon3 bugfixing-only phase in their release cycle. That was not sufficient.
They added 2 bug-sprint weeks to it.
Users don't have time to learn the software before they have to face new feature changes.

What frustrates me the most is that I sincerely think a 4-month release cycle could work smoothly. That would still be an insane rhythm of development.
I know there are 4 seasons per year, and many people have a schedule based on trimesters and semesters.
But what is really important is to make release dates predictable, not the number of months between them. I am sure that releases in March, July, and November would be perfectly handled by most people.
For that, we need a release cycle that works. I can predict that with the current release cycle, there will come a moment when the accumulated technical-debt delay is so large that they will have to skip a release for having too few new features, or make a bugfix-only release.
I am talking about facing reality. Developers should breathe and talk to users:
2 weeks of documentation and their own testing of their work, plus 2 weeks of polishing and replies to user feedback.
I would place 1 week of each type in the middle of Bcon1 and 1 week of each type at the end of Bcon3.
Currently, developers under pressure release their first iteration of a target and wait for the next release to polish it. Sometimes completely ignoring user feedback. Sometimes postponing, because they have other urgent tasks.


Thank you for the broader picture. I hope the issues with the release cycle will be discussed and solved internally. A 4-month cycle fits nicely into the calendar year, so from a user perspective I see no issues. I wouldn't mind even a 6-month cycle, if an alpha is provided. That could even potentially take LTS out of the equation.

This is bad.
I’d really rather see issues like high-poly mesh editing fixed than have half-baked geometry nodes with crippled mesh editing.

That might be the best thing that could happen. If this kind of cold shower is needed to shake the system, so be it. The only important thing is whether that kind of situation would change the strategic, long-term decision-making inside the dev team.

From a user standpoint that solution sounds good. There will always be projects that are too big to fit in that timeframe, but in those cases they should be put in the Experimental tab, so the community can help with feedback. Also, every major experimental feature needs an obligatory thread on devtalk. There could be a separate thread category for it.

OpenVDB is great and the ability to transfer volumes between scenes is more than useful.

Sorry for the off-topic.

Here we already have the documentation of the new features that will come in Blender 2.92:
https://wiki.blender.org/wiki/Reference/Release_Notes/2.92/Physics

Anyone with time could download the Blender 2.92 beta and test it for potential bugs.


Hey gang. I wonder if you could help me figure out a workaround for a problem I'm having with loss of volume / mass conservation.

Breakdown:
I have a glass container (very simple), with a proxy collision mesh/effector that is basically as simple as a cylinder, subdivided a couple of times, and with a fairly large Solidify thickness to avoid particle leakage. The glass container is moving, fairly slowly but consistently, and I'm getting a loss of volume.

I know this appears to be a general issue with FLIP - that FLIP has no conception of mass or volume.

I've tried a bunch of suggested things, like increasing the Particle Radius; the tooltip essentially says it should help with volume leakage, but it doesn't work. I don't think it takes movement into account.

I've tried increasing the resolution divisions, increasing substeps, decreasing the CFL number, and increasing timesteps. I couldn't get any better results. I thought maybe using an inflow might help, and maybe that is the way, but it seems like a whole lot of fiddling around.
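
For completeness, these are the knobs I've been turning, in script form (property names as I understand them from the Python API; the values are just examples of the directions I pushed them in):

```python
import bpy

s = bpy.data.objects["Domain"].modifiers["Fluid"].domain_settings  # hypothetical name

s.particle_radius = 1.8   # tooltip suggests a larger radius fights volume leakage
s.resolution_max = 128    # more divisions = smaller cells
s.timesteps_max = 8       # allow more substeps per frame
s.cfl_condition = 2.0     # lower CFL = smaller adaptive time steps
```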

What's more, Houdini users seem to have similar issues, because their fluid solver is also based on FLIP. So for any moving glass/cup, people face volume loss.

Am I crazy or is a moving glass/cup a totally common scenario for fluid simulation?? How are people dealing with this??

thanks

I have the same problem, but unfortunately I don't think there's a way around it right now. The 2.92 builds have the new APIC method though, which is supposed to help with small-scale simulations; maybe try that instead of FLIP.
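
If you want to test it, the solver choice is exposed on the domain; based on the 2.92 release notes, switching should be as simple as:

```python
import bpy

settings = bpy.data.objects["Domain"].modifiers["Fluid"].domain_settings
# New in 2.92: the velocity-transfer scheme for liquids.
settings.simulation_method = 'APIC'  # default is 'FLIP'
```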