A proposal for a new method of linking in Blender

This proposal has already been submitted to RightClickSelect and has received very positive feedback.

The Blender developers reassure me that 2.8 will address all of the problems I had with linking in Blender that inspired me to create this proposal in the first place. However, I’d still like to get the feedback of the BlenderArtists community to see if there’s anything I’m missing, or anything the developers might be missing.

If you can think of a way to improve these ideas, please leave your comments below. Thanks.

You can also view the original proposal (at full size) HERE.

The RightClickSelect proposal page: Click here

what happened to ricky? is he ok? did he ever recover?

He was diagnosed with blenderrhea and was on the toilet for the rest of the week, his links looking very sickly indeed. He was put on a dose of CQ 2.8 and should be fully recovered by the end of the year. Thanks for your concern. :wink:

hello Matt, what a funny illustration you’ve made!
What you’re describing is very much like how I imagine it works in Maya. Blender isn’t designed to work like this, but maybe with the override system it should be doable. For now it’s better to have a master file (char.blend) that contains the model, rig, shading, etc., and obviously two people can’t work on the shading and the rig at the same time. What is needed from a pipeline tool is a way to lock access to the file when someone is already working on it.

I’ve worked on various projects (animation series) with teams of about 20 people, and the way Blender works is fine for those kinds of projects. Maybe on a bigger project things can’t work like this anymore…

What are the bottlenecks in Blender’s way of doing things if you use the linking workflow the way it’s done on the open movies?

cheers !

Hi Sozap, thanks for your feedback. Can you give a brief explanation about how your pipeline of 20 people worked? Once you do I will be able to comment. How were changes dealt with that affected multiple shots? If a rig changed did that mean the old rig needed to be swapped out and its animation copied over? How successful was this to do? Stuff like that would really help explain what you mean by a typical Blender pipeline. :slightly_smiling_face: Cheers.

I love this. I love your scenario and this idea is honestly one that has the most potential of all ideas on this forum to make Blender attractive to the industry. Very well done.

I’d like to know what happens when an adjustment gets rejected.

Thank you, Claus! And thank you for your question regarding rejected adjustments.

Any adjustment made to a Master File is pending until it is approved by the Owner of the Master File (a gatekeeper, if you will). It is up to the Owner to approve or reject adjustments by others. If an adjustment is rejected, the Owner can write a reason why, and a rejection notification along with the Owner’s reason is sent to the person who wanted to make the adjustment.
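To make the flow concrete, here is a minimal sketch of how such a pending/approve/reject queue could be modelled. All names here (`Adjustment`, `MasterFile`, etc.) are hypothetical and purely for illustration, not part of the proposal itself:

```python
from dataclasses import dataclass, field

@dataclass
class Adjustment:
    author: str
    description: str
    status: str = "pending"   # pending / approved / rejected
    reason: str = ""          # Owner's reason, filled in on rejection

@dataclass
class MasterFile:
    owner: str
    pending: list = field(default_factory=list)
    notifications: list = field(default_factory=list)

    def submit(self, adj: Adjustment) -> None:
        # every adjustment starts out pending, awaiting the Owner's review
        self.pending.append(adj)

    def review(self, adj: Adjustment, approve: bool, reason: str = "") -> None:
        adj.status = "approved" if approve else "rejected"
        adj.reason = reason
        if not approve:
            # the author gets a notification carrying the Owner's reason
            self.notifications.append((adj.author, reason))

master = MasterFile(owner="Claus")
adj = Adjustment(author="Matt", description="tweak arm rig")
master.submit(adj)
master.review(adj, approve=False, reason="breaks existing shots")
print(adj.status)               # rejected
print(master.notifications[0])  # ('Matt', 'breaks existing shots')
```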

This is very similar to how code is reviewed, wrangled, and merged in a team setting. It’s always seemed strange to me that more places don’t adopt the same method for asset management and versioning.

Hi,
If there is a big change in the rig then I make a new version; old shots keep the first version.
It’s possible to automate the version switching with a script so animators can upgrade to the new version if needed. You just have to change the library path, and if the controlling bones are the same the animation is kept without issues.
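That version switch boils down to rewriting the linked library path. A rough standalone sketch of the path-bumping part (the `_vNN.blend` naming convention is an assumption; in real use this would be applied through `bpy.data.libraries` followed by a reload):

```python
import re

def bump_library_path(path: str, new_version: int) -> str:
    """Rewrite the version tag in a linked-library path, e.g.
    '//assets/char_rig_v01.blend' -> '//assets/char_rig_v02.blend'."""
    return re.sub(r"_v\d+\.blend$", f"_v{new_version:02d}.blend", path)

old = "//assets/char_rig_v01.blend"
print(bump_library_path(old, 2))  # //assets/char_rig_v02.blend
```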

If it’s a small change, then I keep the same version but I add a switch in the rig that enables the new functionality.

In fact it’s always a case-by-case basis. I generally avoid making new versions. Sometimes you can make an adjustment that breaks some animated shots; you then have to send them back to animation, but it’s really a rare case and you can see it coming.

I can’t remember any big issues on projects when updating rigs. It’s never as simple as that, but it never takes a lot of time or causes big trouble.
One thing that can get annoying is that some properties of bones don’t propagate well when you change them in the master rig.
For example, rotation order: once a rig is linked, if some bones are set to quaternion they stay the same in the shots.
The same goes for custom properties and some constraints on the controlling bones.
You can set some bone layers to be “protected” and then the changes are propagated, but then you can’t change their values in the shots.

Here are two projects that I’ve been working on, in case you’d like to see them. What we are talking about is very theoretical, and I guess what worked for me could fail in other cases:

On this one, characters and props are 3D, sets and some props are drawn in photoshop. It’s then composited in AE.

This one is also all done in Blender:

Cheers !

Another awesome proposal that looks great on paper but incredibly hard to implement in practice.

Essentially what you propose is a Version Control System for Blender data objects. Blender data objects include materials, nodes, user preferences, you name it.

The problem is that Version Control Systems are notoriously hard to implement, mainly because of conflict resolution. You have already addressed some scenarios, but there are countless other scenarios.

To give you an idea, Blender overall is a bit less than 2 million lines of code.

Git, the most popular version control system, is half a million lines of code. The problem is that Git works only from the command line… well… mostly. So you need a GUI for it, but because of its countless commands, TortoiseGit, a GUI client for Git, is another half a million lines of code. So you end up with half the code base of Blender, just to give you an idea of the scale of such a project.

https://www.openhub.net/p?ref=homepage&query=git

https://www.openhub.net/p/blender

Which means a proper implementation in Blender of the “simple scenario” you just demonstrated, taking into account the number of developers available right now, and assuming they work full time on it and drop 2.8 altogether, would take without exaggeration around 5 years.

You could get a level of version control without trying to implement Git and TortoiseGit for Blender by using Git from inside Blender, but that would require a text-based format for all data, which is also not easy to do because version control systems are focused predominantly on text, not binary files, which make up the majority of the file formats we use in 3D graphics.
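For what it’s worth, the way Git itself sidesteps the text-vs-binary question for storage (not for diffing or merging) is content addressing: every file is stored once under the hash of its bytes. A toy sketch of that idea, with made-up data and a throwaway directory:

```python
import hashlib
import tempfile
from pathlib import Path

def store_blob(data: bytes, store: Path) -> str:
    """Store binary data under its SHA-1 digest, the way Git stores blobs:
    identical content is stored only once, and any change yields a new hash."""
    digest = hashlib.sha1(data).hexdigest()
    store.mkdir(parents=True, exist_ok=True)
    (store / digest).write_bytes(data)
    return digest

store = Path(tempfile.mkdtemp())
h1 = store_blob(b"mesh v1 binary data", store)
h2 = store_blob(b"mesh v1 binary data", store)  # same bytes -> same hash, deduplicated
h3 = store_blob(b"mesh v2 binary data", store)  # changed bytes -> new hash
print(h1 == h2, h1 == h3)  # True False
```

Storage and deduplication like this are the easy part; the conflict resolution you describe is where the real effort lives.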

That said, you could get maybe 1-10% of what you are suggesting as an addon using Git or some other VCS. Good luck finding a dev to do it for free; you may need less luck if you are willing to organise a fat Kickstarter of at least 30k dollars, which would guarantee at least a single developer for at least one year of full-time work.

You also need to make sure the coder is very good at coding: the guy who created Git also created the Linux kernel. So yeah, “a bit” of a difficult task :smiley:

I think he’s talking about how Blender handles data so it can be used with a fine-grained VCS, rather than implementing a VCS inside Blender.

Instead of having a character.blend file that contains the geometry, rig and shader, you’d have separate files: char_geo, char_rig, char_shaders.
Actually it’s hard to import/export just one piece of data from a .blend file the way it can be done in other applications. You always end up linking / appending from one file to another. You never export your data to something (e.g. only the mesh + modifiers).

I think Blender handles data a bit differently than Maya, but I don’t see where it’s a real issue in production, at least on small/medium-sized projects. At the moment it’s better to stick to how Blender is designed to work rather than forcing another workflow.

I was going to say that this thread pretty much describes git

The workflow does not change, because version control is unrelated to the 3D creation workflow. Plus it happens in the background anyway, so even in the case of coders they only interact with the VCS in a very limited way; thus it does not force them to code in a specific way.

The issue here is that different file formats will have to be implemented for this; those won’t be meant to be used directly by the user, but by the system that synchronises the data.

From the user’s perspective they will still be saving their blend file, but that won’t be what the system uses.

It may be possible for Git to version-control small binary data; the issue remains with big data like high-poly models.

It may be possible to export to text formats, but text comes with a performance loss because it has to be converted back to binary, which is why we prefer binary formats, especially when the data is very large.

There may also be ways to do this using something like Dropbox or Mega. In that case you drop the version control features, but you still have data synchronisation, and those systems excel at synchronising big binary data.

It’s doable, but I don’t think it’s easy to scale down to a small project; it will still need 100-200k lines of code, mainly because there is the GUI part and the server-handling part.

So it’s still a lot more complicated than it looks.

You cannot simply export text files and let Git do everything else, because Git is made to work with code, and this is 3D graphics. So no, a different file format will hardly reduce the effort to a small fraction.

Yes, I agree!
I think Maya can export data in a human-readable format. Indeed it can be hard to implement this kind of functionality in Blender, but maybe it’s hackable and doable as an in-house addon.

In his proposal, he doesn’t talk about version control, only approval for modifications.

Version control is another kind of beast, because you can link a V01 of a character in one shot, make a V02 of that character and link it into another shot. Depending on how the pipeline is set up, you may want to access different versions of the same asset in different shots.

Yes, it does not talk about version control, but don’t be fooled by the name. Version control systems are far more popular for their mature conflict resolution tools than for version maintenance. Conflicts are to be expected frequently when people work on the same set of data at the same time, which is exactly what is described.

Conflict resolution is a huge pain overall even with a very good version control system.

It can be avoided by locking down the data, making it accessible to one user at a time for writing and to anyone for reading, but then you have what you feared: the obligation to follow a specific workflow, which can be tedious and slow.
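That lock-on-write approach is at least cheap to prototype. A minimal sketch using an atomically created lock file (the path is made up for illustration; a real pipeline tool would also record who holds the lock and handle stale locks):

```python
import os
import tempfile

def try_lock(path: str) -> bool:
    """Atomically create a lock file. O_CREAT | O_EXCL guarantees only one
    writer can succeed; everyone else is read-only until unlock."""
    try:
        fd = os.open(path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
        os.close(fd)
        return True
    except FileExistsError:
        return False

def unlock(path: str) -> None:
    os.remove(path)

# fresh temporary directory so no stale lock file can interfere
lock = os.path.join(tempfile.mkdtemp(), "char.blend.lock")
print(try_lock(lock))  # True  - first artist gets write access
print(try_lock(lock))  # False - second artist must wait
unlock(lock)
```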

@sozap & @kilon - Thank you very much to the both of you for discussing how the Mr.Tapll proposal might be implemented on a technical level. I enjoyed reading the back & forth between the two of you about how difficult such a task might be to accomplish. I found it personally fascinating, so thank you again.

The Mr.Tapll proposal was deliberately vague because it approached a problem from a conceptual point of view first, and the reason it needed to start there was not to test the technical feasibility of the idea, but to test the idea itself. If the core idea was good and true, it would provide a clean workbench on which to start drawing plans and making strategies.

From what I’ve been able to make out, based on the feedback from this proposal, it’s that people really like the idea, to the point where some even ask why such an idea hasn’t been done before. That’s encouraging, and I hope it’s encouraging enough to the Blender developers to incorporate this idea into their thinking.

Here’s why that alone is really important: If you start a technical journey from the very basics of what a user wants, and start to build around that as your centre you will complete your quest with everything supporting that central point. (This is better articulated here: “You’ve got to start with the customer experience and then work backwards to the technology”).

The problem is that Blender has already been started (back in 1998), and now users are asking for lots of things to be the central focus and the devs are doing brilliantly at adapting their software as best as they can to meet these demands. Kudos to them, by the way.

But as things change and evolve, if you’re not thinking about where you want your software to be in 5-10 years’ time, then you might develop your ideas into a place that’s extremely difficult to change, because the decisions you made along the way didn’t allow for that possibility. Making small adjustments now may prevent the need for massive changes later.

kilon, you kindly mentioned that Mr.Tapll was an “…awesome proposal that looks great on paper but incredibly hard to implement in practice.”

At this moment in 2018, yes, it’s an unrealistic minimum of 5 years away if the approach were to implement an integrated Version Control System into Blender with all the bells and whistles included. It could be argued that the reason it’s incredibly hard to do this is that Blender never started with the idea of having an integrated VCS in the first place. Now, 20 years later, its structure doesn’t easily support something like this, thus it’s a massive change. However, if the idea of an integrated & adaptable pipeline had been part of its founding vision, then even if the early editions of Blender didn’t feature it at all, the structure would always have been building towards such a goal, and in 2018 implementing the idea might be easy, if it wasn’t already part of the software.

Hindsight is always 20/20, however, and I don’t fault the devs at all by saying this. It was merely to illustrate a point.

Blender is great, I love Blender, however, the entire point of the Mr.Tapll proposal is to encourage the devs to begin thinking about how something like Mr.Tapll might be implemented in the future. It may take small changes in Code Quest 2.8 to help pave the way for Mr.Tapll to exist in Blender 10 years from now, for instance, but if these changes were to be left out of 2.8 then Mr.Tapll might be 12 years away instead, as an example.

Right now a much simpler approach (than directly modifying Blender’s code) is to use third-party software to get an experience of what using Mr.Tapll might be like (for instance, Blender Cloud, or Shotgun), but even having the Mr.Tapll proposal exist helps those third-party software developers better evolve their ideas towards what users want.

I hope you can both appreciate that. :slight_smile:


@sozap - Thank you so much for your wonderfully informative reply! I’m very impressed by your work, by the way. I particularly like Ella, Oscar & Hoo. What a beautiful style… gorgeous! The way you have accomplished this is well hidden from the viewer. I still don’t know how it’s done, but I suspect it’s mostly 2D rigs.

I have an unrelated question: Were the storyboards in Transformice done in Blender also? If so, what was that like to use as part of your pipeline process? (Perhaps that wasn’t completely unrelated after all).

hello,
Thanks a lot, and it’s great to read such a positive answer.
Mr.Tapll does indeed look like a great tool, similar to Shotgun, Blender Cloud or CG-Wire (open source).
Maybe the only complicated part is the fine-grained part: keeping track of a rig file that depends on a mesh file, etc.
With a less fine-grained tracking system, I guess you’ve got everything you need already inside Blender.

Maya allows such a fine-grained workflow. I’ve asked a friend of mine about it, and he says it’s a bit overkill to work like that. Generally you have the mesh, rig and shader inside the same file, pretty much like we do with Blender.

And about Ella, Oscar & Hoo: characters and props are 3D; it’s similar to how the setups of Blue Sky’s Peanuts are done. The rig includes several heads that are made for a particular point of view. That gives a lot of 2D feel.
And on Transformice the storyboards were done in Toon Boom Storyboard Pro. Now that Grease Pencil is getting better it may be a great replacement. It also depends a lot on storyboarders being willing to change their habits.

Actually a lot of productions back in the day used to work very much like this. Maya can split data into several text files (which works with versioning systems) at the cost of disk space and memory. Both are no longer at a premium like they used to be, but when you are working on large, heavy scenes, dealing with ASCII files becomes a pain in the ass.

Anyway, most large studios use things like Shotgun or Alienbrain. Shotgun has a few elements of what was proposed here, but at a very sophisticated and fine-grained level. There is even an open-source one called openpipeline for version control in production.

I like the proposal though. It seems like a good workflow for smaller decentralized teams. Bigger more in-house teams would have an established pipeline anyway.
