Blender Edit Mode Performance

I use it for modeling all the time. It provides really helpful visual feedback about how close objects are to each other.


Isn't this the place to report, discuss, and share experience with the actual status? :wink:

Yeah, it is.
But I don't walk into the garage right after they've removed the engine from my car and "worry" that it might not be driveable at that moment.


Interesting. I'll give it a try.

Yeah, but some test runs might be very important. Swapping the engine out again after complete assembly wouldn't be that smart just because you forgot to check whether it fits the gear ratios… :wink:

I agree! Test runs are important.
But being scared and worried based on the current performance makes no sense.

Hopefully you're right…

BTW, I really wonder whether viewport performance in Edit Mode will improve once local view is back.

Completely agree. Blender is nowhere near a stable release, and there is still a ton of things missing. There is no need to rush; Blender 2.8 already has some great ideas going for it, and I really love what I am seeing.

The good news is that it's slowly getting there, and it's a huge step forward compared to the extremely limited viewport of 2.79.

Personally, I don't expect a really stable release of Blender 2.8 in 2018; I will be glad if I am proven wrong. Yesterday I took a look at the roadmap, and it is going very well, but the road is long and hard.

We should not forget 2.5, also a huge step forward, with tons of bugs and problems that were eventually ironed out into something far more stable, but it took a couple of years. I don't think Blender 2.8 will be much different.

Of course, we did not get many complaints about 2.5, because users were all too happy to see all those improvements, even with the instability.

But there are always people who want perfection for free.

?? Without testing, discussion, and evaluation there wouldn't be any perfection…

Rather than "perfection", it's:

We don't want regression.


Coding is all about taking a step back to take two steps forward. At times it feels like chasing your own tail, but it tends to work out in the end.

The problem is that users rarely appreciate the effort, because what the user experiences is only the tip of the code iceberg; much of it is buried deep inside.

So be patient, it will get there… eventually :smiley:


It's good to run some tests right now, so we can see the difference against the final version later.

About Edit Mode:
[attached image: CPU usage graph]
Surely there are still CPU optimizations missing that will come after the crashes are fixed.
I have not read anything about projects to optimize the modes or functions where the CPU is heavily used, so for now the expectation is just that 2.8 will not be worse than 2.7. Don't get your hopes up about big improvements in this regard until the developers officially mention something about it.


Does anyone know whether raw viewport performance, especially in Edit Mode (orbiting, selecting, moving), will benefit from multicore CPUs? I always thought it would mainly depend on graphics card power…

I do not know.
I forgot to mention before that the CPU usage graph was taken while moving vertices.

By the way, to run tests, anyone could monitor CPU and GPU usage by keeping the monitoring apps on top while working in Blender.

I don't think that CPU usage reflects the actual viewport performance.

Ignore the RAM usage; Houdini has the whole scene open.


I do not know about other programs, but Blender developers have explained these things many times:

Depending on the mode (Object, Edit, Sculpt Multires, Sculpt Dyntopo), some things run multi-threaded on the CPU, other things are single-threaded on the CPU, and other things are GPU tasks.


In theory it should; there are still a lot of dependencies on the CPU for features that programming languages need just to do the basics.

But a ton of things affect performance: too many threads, lack of hardware-acceleration instructions (see SSE, MMX, etc.), I/O access, memory leaks, and memory management.

In the end, it's the GPU that plays the larger role, but modern CPUs have accelerated quite a lot lately by taking advantage of GPU-like technologies. The question is how much of that Blender can utilise while it still has CUDA issues.

For example, ZBrush has been able to support tons of polygons since its very early days, because it was probably the first 3D app to perform RAM compression. It's a bit like cheating, but it allowed for millions and then billions of polygons. This is only one of countless techniques, and it has nothing to do with how well it takes advantage of the CPU/GPU.
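As a toy illustration of that idea (not ZBrush's actual, proprietary scheme), here is a minimal Python sketch that keeps a packed vertex buffer compressed in RAM using the standard-library `zlib` module; the grid size and vertex layout are made up for the example:

```python
import struct
import zlib

def make_grid_vertices(n):
    # Pack an n x n grid of (x, y, z) float32 triples into one byte buffer.
    buf = bytearray()
    for i in range(n):
        for j in range(n):
            buf += struct.pack("<fff", i * 0.1, j * 0.1, 0.0)
    return bytes(buf)

raw = make_grid_vertices(256)          # 65,536 vertices, 12 bytes each
packed = zlib.compress(raw, level=6)   # keep the compressed copy in RAM
print(f"raw: {len(raw)} B, packed: {len(packed)} B, "
      f"ratio: {len(packed) / len(raw):.2f}")

# Decompress only when the data is actually needed for editing/display.
restored = zlib.decompress(packed)
assert restored == raw
```

Structured mesh data compresses well because coordinates are highly redundant, so an app can hold far more geometry than the uncompressed size suggests, at the cost of some CPU time to decompress chunks on demand.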

The art of optimization is pretty close to necromancy: it's dark, you have to sacrifice code readability, and it's a pain to maintain, like zombies that keep coming back from their graves.

Hence why developers avoid it like the plague.


Thanks for the info. I think Max uses the same kind of "cheating". Especially with the Quadro performance driver for Max, the 2009 version was awesomely fast, but when snapping at extreme zoom levels it doesn't snap precisely, no matter the units setup. Now with the Nitrous viewport it also works fast with consumer graphics cards, but snapping is still not as tight as in Blender. Seems like a trade-off…

Someone with a fast multicore CPU could check whether more cores effectively mean a faster viewport, because as you can see from your graphs, neither version uses all the cores.
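As a rough sketch of such a probe (the workload and chunking are invented stand-ins, not Blender code), here is a small Python script that times a CPU-bound per-vertex transform once serially and once across worker threads. In CPython the GIL keeps pure-Python CPU work effectively single-threaded, which mirrors how a single-threaded viewport code path cannot benefit from extra cores, however many are present:

```python
import os
import time
from concurrent.futures import ThreadPoolExecutor

def transform(chunk):
    # Stand-in for per-vertex work: offset every coordinate by 0.5.
    return [v + 0.5 for v in chunk]

verts = list(range(200_000))
chunks = [verts[i::4] for i in range(4)]  # split the work into 4 chunks

t0 = time.perf_counter()
serial = [transform(c) for c in chunks]
t_serial = time.perf_counter() - t0

t0 = time.perf_counter()
with ThreadPoolExecutor(max_workers=os.cpu_count() or 4) as pool:
    threaded = list(pool.map(transform, chunks))
t_threaded = time.perf_counter() - t0

print(f"serial: {t_serial:.3f}s  threaded: {t_threaded:.3f}s  "
      f"cores: {os.cpu_count()}")
```

Swapping `ThreadPoolExecutor` for `ProcessPoolExecutor` sidesteps the GIL and gets closer to measuring real multi-core scaling for CPU-bound work.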

I think Max has a problem with its single-precision floating-point snapping algorithm (unless they've updated it and it's still buggy).