I got the ideas exactly from that.
Here is the link:
I will paraphrase in these bullet points:
- interaction is key. Blender should continue to have better and better interaction tools.
- this will be accomplished by various optimizations, but not by increasing Blender's ability to arbitrarily handle large amounts of data.
Full stop. We have been asking and asking and asking for better handling of data. Like forever!
To my point: when you are working with high-quality assets like those used in "real productions," "large studios," or by the "ambitious freelancer" — and not cartoons — you need to handle large amounts of arbitrary data. Be it retopologizing a ZBrush sculpt or building a complex landscape or industrial environment.
This is the increase in interaction we are asking for. Not just for our own needs, but because until Blender can handle that, it will not be able to work comfortably in a professional pipeline with high demands.
This is not an opinion. It is a demonstrable fact.
And so a pipeline has to be smooth from one end to the other. This is why professional packages use "the other method" and try to apply it throughout the pipeline.
XSI used Gigacore… Maya is not as good. But instead of refactoring, they worked out a viewport cache. It works OK… I will have to play with it more…
Bottom line: it has to be a solution from end to end… Not…
- A new Rendering editor designed to handle large amounts of arbitrary data.
Edit: Even the LightWave developers knew this. They developed Hydra, which is in ChronoSculpt and also began with CORE. Rumor is that it exists in LightWave 2018 and beyond…
It was to be the first step in completely refactoring LightWave, giving it a modern mesh and data-handling engine. There are some demos of it in action over in those old blogs.