Maybe your 2.79 scene contains a mesh with the dupligroup option ON.
In 2.80, duplication was renamed to instancing, and groups are now collections.
So instancing a collection corresponds to the old dupligroup option.
The problem is that in 2.80, the ability to instance a collection was restricted to Empty objects.
So in your file there should be a mesh (a plane or a circle, some simple schematic mesh) named tree_1936_ that is intended to display a tree but does not, because of 2.80's incompatibility with the 2.79 behavior.
You just have to find this object and add a collection instance of the “tree source” collection at its location to obtain the same scene as in 2.79.
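For a single object, this can also be done from Blender 2.80's Python console or Text Editor instead of by hand. This is only a sketch: the object name `tree_1936_` and collection name `tree source` are taken from the description above, so adjust both to match your file, and it assumes the collection already exists in the blend.

```python
# Run inside Blender 2.80 (Python console or Text Editor).
import bpy

src = bpy.data.objects["tree_1936_"]        # the leftover dupligroup mesh
coll = bpy.data.collections["tree source"]  # the old group, now a collection

# Create an Empty that instances the collection (2.80 only allows this on Empties).
inst = bpy.data.objects.new(src.name + "_inst", None)  # None object data = Empty
inst.instance_type = 'COLLECTION'
inst.instance_collection = coll
inst.matrix_world = src.matrix_world.copy()  # keep location/rotation/scale

# Link the new Empty into the same collections as the original mesh.
for c in src.users_collection:
    c.objects.link(inst)
```

The original placeholder mesh can then be deleted once you have verified the instance displays correctly.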
Ahah. Thanks for the guidance. It’s a bit of a shame that this cannot be fixed automatically, but I will go through and try and fix things up manually. I’m a bit surprised that I couldn’t find something about this when searching around - is there a guide to things like this in the new release that I missed?
Using a mesh as the dupligroup object adds unused mesh data and memory to your scene.
To the developers' minds, doing that was counter-intuitive.
So when they made the restriction, they thought it would not hurt users.
Since then, I have seen only one user complaining. Maybe there were a few more.
But it probably was not a huge number of users, and the developers thought a warning would be sufficient.
I know that because I followed the commits day by day and interacted, here, with other users who were testing 2.8 during the past year.
But between 2.79 and 2.80, there were thousands of commits.
It is difficult to sum all of that up in release notes.
The developers try to keep track of those changes by editing the manual and release notes just after making them.
But they can forget some, or have to push some off to a separate release notes page to keep it readable after a year of changes.
It should have been mentioned on the Collections page, but it looks like it was forgotten or not considered a frequent practice.
Let me understand this correctly: if I do instancing via a mesh, the instances count as verts, faces, etc. as well as using more memory but if I instance using collections, vertex/face/tris count and memory use is reduced, being counted as only one mesh? And all of this because the developers didn’t (for lack of a better term) “optimize” the mesh duplivert method?
Initially, they thought they would replace the parenting method with something else.
In theory, that should be handled by the Everything Nodes project.
Maybe they will try to bring an optimization or something new before the Everything Nodes release.
But it looks like collection issues with the depsgraph were the priority, and the parenting method probably uses another depsgraph path.
Like many other things, they did not find time to solve it before the 2.80 release.
And I’m discovering there are just so many things that haven’t been solved yet. So “the elephant in the room” asks…why release it then? A year of hoopla about 2.80 Beta just to be told after the release of 2.80 “it’ll be fixed in 2.81.”
Not parenting itself, but duplicating via parenting.
If you try Animation Nodes, you will see that you can use nodes to duplicate objects on the vertices, faces, or edges of another object. Instead of making a parent/child relationship, you simply make a connection between nodes, but you end up with the same result and more options.
Not everything that is unsatisfying will be solved in 2.81.
The whole 2.8 project is a complete refactoring of the entire software.
It will take years. During the 2.5 project, releases were alphas and betas until 2.57, which was not really complete either, because bugfixes for the animation system only landed in 2.59.
UI choices were re-evaluated, modified, postponed, or cancelled.
That took more than two years, and the developers were forced to put out several 2.49 releases to provide a Blender usable on recent OSes with up-to-date libraries.
Although there are more developers working on Blender now, 2.8 comes after 10 years of expansion of Blender's abilities. We are discussing and communicating a lot more fluently about 2.8, but we are also discussing many more subjects. There is no reason that obtaining a smooth workflow in every aspect should take less than two years.
Because people have wanted to use EEVEE in production since last year. And there are always people who will never risk trying a beta.
The developers tried to achieve a stable-enough release.
More feedback will come now. There is a chance that 2.81 or 2.82 will not look like 2.80.
It is not a perfect situation. But it is a lot better than discussing without a point of reference; trying to work with people who don't use the same beta; not being able to finish a project because an update that fixes one crash produces another in a different area; or confusing people with alpha releases indexed like stable ones (that are missing buttons or crash easily), etc.
This is a potentially interesting concept…for Blender 3.0 (or even v2.9.x). With v2.80 we’ve all seen what revolutionary instead of evolutionary has done to Blender.
As for your other points, I’m afraid I cannot agree. I myself started using Blender around 2006 (v2.41?) (I even filed a bug report) and even though many of the changes from then to v2.79 were staggering, never did I feel I had lost some capability during the evolution.
You state people wanted to use EEVEE in production and I understand that perfectly, it being far superior to Blender Internal. But EEVEE could have been added to a new Blender release (think Cycles and Motion Tracking in 2.61) without any of the things that are half-baked now, such as the mesh/Collection instance incompatibility that drew my attention to this thread. That alone could have been v2.80 and I’m sure nobody would have complained. Instead of being lauded for adding only a new powerful render engine (and possibly other functional new features/improvements such as the Grease Pencil) we have the never-ending beta test of today’s 2.80.
The Blender Foundation, for whatever reasons, bit off more than they could chew and, instead of doing incremental, evolutionary releases, released a revolutionary yet unfinished version that excuses itself by chanting the mantra “wait until 2.81”…or later. Well, what’s done is done.
@zeauro, PLEASE don’t take any of this personally; I don’t know if you are on the dev team, but I do hope some devs do read this. Regardless I tried to keep it succinct (and failed), but more importantly, intelligent. I apologize for tangentially rerouting this thread and thanks for reading. PEACE!
The situation is a little more complex than that. EEVEE shares some of its targets with the new viewport.
And for the new viewport, an unlimited number of layers was one target.
And the depsgraph had to digest that, too.
So they had to tackle at least those three targets together.
IMO, they should have added dynamic overrides, too. I think postponing that will have consequences.
I agree. I remember commenting on many initial design tasks to try to warn them.
Workspaces + multi-object editing +… I clearly wrote that it was too ambitious.
But when everybody is more interested in a specific task than in the whole picture, it is hard to stop the train.
And when no line of code has been written yet, it is not obvious how much time things will take, what users will complain about, etc.
I think their biggest fear was forgetting to test one aspect of one big task, which would have screwed up everything that followed.
They did not want to tackle one big target, make a release, and then realize that integrating another big target for the next release would require completely rewriting the previous one.
The Blender Foundation is also attached to maintaining a short release cycle, because it lets volunteer devs see their work integrated quickly and keeps them motivated.
Last year, after the Code Quest, they were ready to call 2.80 the first alpha and continue with a series of betas, with no idea when the first stable release would be delivered.
I think we avoided that situation.
2.80 is not as smooth as expected. But 2.81 will probably fix most of the little problems, and 2.82 should be THE release we expected.
So probably a year of waiting, but the 2.8x series will be a lot more reliable than 2.5x, and each release will contain more big improvements than a 2.6x or 2.7x release.
In thirty years, for the next refactoring of the software, the devs who participated in 2.5 and 2.8 will be retired volunteer devs: they will share their experience. Things should go smoothly.
Forgive me, but that’s exactly the mentality that I was referring to in my previous post.
At the risk of being repetitive, with all previous versions of Blender I’ve had experience with, never did I feel like I had to wait for big changes to happen. Each version delivered some new improvement with more-positive-than-not consistency. Fluid Sim, Sculpting, Motion Tracking, Cell Fracture, Cycles, EEVEE, etc. They were added bit by bit (no pun intended) and incrementally improved, with minor fanfare previous to each release. The main thing is, yes, even after the 2.5x interface changes, learned processes and muscle memory were never drastically affected. As far as I can remember, never was there such a build-up for (unmet) expectations such as the one for 2.80.
This is all water under the bridge; 2.80 has been released and will be improved upon as time passes, I’m sure. But as I said before, this “wait for v2.8x” mantra is doing nobody any good. Hopefully the Blender Foundation will learn from this.
If you promise two small things and deliver four, you look like a hero. If you promise the moon and deliver four things…well…
Allow me to clarify one important thing: I love the new Blender. By and large, I feel it is an improvement on the already-great v2.7x series. But…there are just so many papercuts!
Yes, there will be big changes. But that does not mean we should expect development to be faster.
The time some tasks take is sometimes incompressible.
If you dedicate more people to one task, the work gets done faster than with only one person, on the condition that the communication time needed to organize the team does not exceed the expected gain.
For 2.8, the channels of communication multiplied; that means the developers spend more time consulting them, probably read the same criticisms and remarks many more times, and give the same answer several times over.
They are harvesting more feedback. That increases the number of papercuts reported, but also the number of feature requests.
So with more users in the community there is more money, which leads to more developers but also more demands.
With lots of tasks, things become harder to prioritize.
2.5x avoided changing the default right-click-based keymap and filled the empty space available on the keyboard. There was a promise to deliver a new keymap for 2.7x; that was postponed to 2.8x. But it was inevitable: a keymap that uses every key on the keyboard, with no easy customization for a newbie, is a bad default.
The current 2.8x keymap is a tradeoff.
Ctrl+Alt+Shift+letter shortcuts were removed.
The F keys were changed, and only a few standard shortcuts were adopted.
But there was no change to G, R, S.
Probably 90% of the tool shortcuts are the same as in 2.79.
During 2.5x, the muscle memory of your mouse hand was heavily affected. Nothing was in the same place. Icons were changed, too. We lost the free arrangement of panels. We lost reactor particles. We lost the fact that dupliframes took shape keys into account.
During 2.5x, users asked for texture preview in the viewport and an asset browser.
People have been waiting for particle nodes since 2.6x.
We have texture preview in the viewport with EEVEE now. Jacques Lucke is making progress on particle nodes. William opened a 2.8 task for an asset browser.
Unless we dedicate some of our time to learning how to code at the level of the Blender developers, we are completely dependent on how the developers define their priorities.
We are fortunate enough to be regularly consulted and listened to by them.
But of course, if priority always goes to the dominant choice of the majority, you can wait a long time if your main interest is that of a minority. That is where democracy can become barbaric.
They are conscious of the technical debt accumulated towards those minorities over the years, and they have opened a discussion on how to organize themselves to reduce it in the next releases.
So it is logical that a majority of users would not see some of these new features, as opposed to papercuts, as pertinent priorities.
But if we can explain to them that some of those features have been awaited for more than a decade by another part of the community, that may help them tolerate a one-year wait.
I personally don’t remember it being as difficult moving to the 2.5x series, but that may be because I had started my journey with Blender just before and hadn’t yet developed a real working relationship with the software.
Pardon my ignorance but I never even knew about the Particle Nodes, losing reactor particles (I don’t even know what that is) nor the “broken promise” of a new keymap in 2.7x. Know why? Because NaN/Blender Institute/Blender Foundation didn’t put it in my face for a year beforehand! Maybe if I’d have sought out these things, read devs’ blogs and scoured the various sites that report on these inner workings (and that’s exactly what they are…inner workings) I would have at least been aware of them. Or maybe if I’d had a feature request, I would have been aware of them. Me, I’m pretty much happy when a software just does “what it says on the tin.”
But the 2.80 hype machine had been advertising the new version since at least a year ago. I don’t recall any previous version getting so much spotlight time. And that’s fine: an organization has to get the word out, and as far as publicising 2.80 the Blender Foundation in my opinion has done an excellent job. But if you’re going to do that, you’d better have a product that reflects what you’ve been publicizing and not an extended beta, “we’ll fix it in the next version.” And as I said to @Ace_Dragon, there are just so many papercuts!
Ton Roosendaal and the Blender developers for years and years have done an outstanding job. You can tell it’s a labor of love. I honestly can say if it weren’t for Blender (quirky as it was/is/will always be) I probably wouldn’t have even taken up 3D modelling. Me, I can’t code worth two cents (bash scripting/HTML is as far as I get)…but I like to think I can prioritize. Instead of new rendering engines, new layouts, new nodes, whatever, I’m gonna polish up nice and shiny what I’ve got and then add big new features, which I may hint at while creating them but not shout about. And, in closing, to be perfectly clear: that is the only real problem I have with Blender 2.80.
That’s it. I’m exhausted! Again, I apologize for derailing this thread. Thanks, @zeauro (you’ve made some valid points) and everybody else, for bearing with me.
2.4x reactor particles were the ability to emit particles from a particle system according to certain events.
In 2.4x, the user was able to emit particles from the death or collision of another particle.
Workarounds have been added since then; they involve converting particles into real objects and instances.
But a real replacement is only happening now, with what Jacques Lucke is doing in the functions branch.
Coming back to this, it seems like I’m in for an awful lot of manual work (changing tens of thousands of affected scene elements, across multiple scenes). As such, I wondered if there is a way to convert selected meshes to empties; I should then be able to copy over the instancing block.
Anyone have an idea? Manually, I’m selecting each mesh in turn, setting the 3D cursor to its location (via the Shift+S menu), duplicating my empty that has the instance set up, and moving that duplicate to the 3D cursor location. Then I can delete the useless mesh item.
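That manual workflow can be batched with a small bpy script. This is a sketch only, to be run in Blender 2.80's Text Editor with the placeholder meshes selected; the collection name "tree source" is an assumption carried over from earlier in the thread, so change it to whatever your scenes actually use, and try it on a copy of the file first.

```python
# Replace every selected placeholder mesh with an Empty that instances a
# collection at the same transform, then delete the mesh.
# Run inside Blender 2.80; test on a copy of your .blend first.
import bpy

COLLECTION_NAME = "tree source"  # assumed name -- adjust per scene
coll = bpy.data.collections[COLLECTION_NAME]

# Copy the list: we remove objects while iterating.
for obj in list(bpy.context.selected_objects):
    if obj.type != 'MESH':
        continue
    # Empty that instances the collection, at the mesh's exact transform.
    inst = bpy.data.objects.new(obj.name + "_inst", None)  # None data = Empty
    inst.instance_type = 'COLLECTION'
    inst.instance_collection = coll
    inst.matrix_world = obj.matrix_world.copy()
    # Link the Empty into the same collections as the placeholder mesh.
    for c in obj.users_collection:
        c.objects.link(inst)
    # Remove the now-useless placeholder object.
    bpy.data.objects.remove(obj, do_unlink=True)
```

If different placeholders should instance different collections, you would need a way to map each mesh to its collection (for example, matching a prefix of the object name), which this sketch does not attempt.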