Discoverability of the Blender interface: is it an objective of the developers?

When you said they are technically overlays as well.

Ahhh yes, my bad, I meant the opposite, excuse my English: they were split out for convenience rather than out of technical necessity. Up to the beta I believe they were under a single toggle, which made it quite inconvenient to disable one while keeping the other.

I think you have to get used to the fact that each piece of software has a different philosophy and way of organizing displays, not to mention completely different feature sets. In Blender’s case, they chose to make this a global overlay, which makes more sense than it being a shading option. Wires have nothing to do with shading, because they can be visible in all shading modes. By your way of thinking, we would need a separate option for each shading mode. As it is, the option is still right there in the top corner, right next to the shading options, with wording in the little pop-up hints that is really clear as to what it does.

In many other software packages you are very limited in your choice. It is either wireframe, solid, wireframe-on-solid, then textured and lit, and so on. That is why they are listed together.

Blender’s way of doing it is different. So in this case it makes complete sense for it to be non-obvious, since it is a feature, not a UI decision.

I do recall reading on the Modo forum how impressed people were when their models looked a lot more pleasing to the eye in Blender 2.8 than in Maya, and also than in their native Modo (because of more modern drawing features, even for things like solid mode).

The tricky part is convincing the industry to leave behind the legacy workflows and displays that pros have been accustomed to and comfortably using since the 1990s. Sometimes a new workflow is embraced, like PBR, but not always. Not only that, it is also tricky to encourage them to pick up potentially useful features they have never encountered before (e.g. Blender’s 3D cursor, which has actually seen significant improvements in the last few years).

It is true that some new ways of working just didn’t work out (like right-click select), so I’m not saying that everything ‘new’ will be superior, but many other times it is, or has the potential to become so.

Really? :smile:

Heh, you should give it a try

I did… and the more I do, the more I hate it… :smile:

It has lots of shortcomings, and with a little love it could be much better, I agree with that. Hopefully some fund money goes into it, along with the snapping improvements; that could change the way we model in Blender.

I disagree, Richard. Your text IMO also makes it clear: it is the purpose, the placement in the workflow, that matters.

For example, why is Eevee now among the viewport icons instead of solely in a purpose-built menu?

Answer: its speed makes it possible to use it for creative purposes in many workflows, so it can be used often. In the past, a render engine like that would be a more formal thing, because it would be much slower.

It is still a render engine, but a change in performance changed its placement in the interface.

lol well you can disagree with me. But don’t kill the messenger. Wireframe is not a shading option in Blender, like it is in other apps. That is just how it is. You wanted it to be easy to discover. And if it was a shading option, it would have been right there where you expected it.

This is one of those many cases where choices made to offer more options and flexibility require an interface that makes those options available as cleanly as possible. And in this particular case, Blender actually has it right: one place to add a wireframe overlay over all shading options.

Look, stick around long enough to get to know me: I am not being a Blenderhead fanboy. Anyone here can vouch for that. So please don’t assume I am just trying to defend Blender against all odds.

Moving on to other subjects, I am sure you and I will find other points of agreement.

It is true that on this particular point Blender has moved forward. And this means a new way to address it in the interface, which of course trips people up.

But this is why I always make it a point to read the manual on the interface when I learn a new app. There is always a convention that differs from app to app, and it is good to get over that ASAP.

I don’t dislike the 3D Cursor, but I do think it is wasted potential. To me the 3D Cursor needs two things to become an incredibly useful tool.

  1. Create a free transform gizmo for the 3D Cursor so the user can move it anywhere they please, with more precision. This does multiple things: the user can now put the 3D Cursor anywhere they want, even inside models, and they can add new meshes and more anywhere. It also indirectly improves all the gizmos, since users can now move them freely, thanks to Blender already having a follow-3D-Cursor option.

  2. Add the ability to save multiple 3D Cursor positions in a scene. Also, add transform gizmo support so you can have multiple gizmos for different purposes instead of having to manually move it every time.

Add those and the 3D Cursor will be incredible. I would even want to have it in Sculpt Mode then. :wink:
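Just to illustrate point 2: a minimal, hypothetical sketch of what a “save and recall cursor positions” feature could look like. It uses plain tuples so it runs standalone; a real Blender add-on would read and write `bpy.context.scene.cursor.location` instead (the class and method names here are made up for illustration):

```python
# Sketch of a "multiple saved 3D Cursor positions" feature (hypothetical).
# Plain tuples stand in for the cursor location; in Blender an add-on
# would wrap bpy.context.scene.cursor.location instead.

class CursorHistory:
    """Store named cursor positions ('favourite pivots') and recall them."""

    def __init__(self):
        self._slots = {}                 # name -> (x, y, z)
        self._current = (0.0, 0.0, 0.0)  # stands in for the live cursor

    def move(self, x, y, z):
        """Simulate moving the 3D Cursor."""
        self._current = (x, y, z)

    def save(self, name):
        """Remember the current position under a name."""
        self._slots[name] = self._current

    def recall(self, name):
        """Jump the cursor back to a previously saved position."""
        self._current = self._slots[name]
        return self._current


history = CursorHistory()
history.move(1.0, 2.0, 0.5)
history.save("door_hinge")       # store a favourite pivot
history.move(0.0, 0.0, 0.0)      # wander off somewhere else
print(history.recall("door_hinge"))  # (1.0, 2.0, 0.5)
```

The point is that the data model is tiny; most of the real work in an add-on would be UI (a list in the sidebar, gizmo drawing), not the position bookkeeping itself.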


Entirely agree. The “3D cursor tool” should straight up activate the transform gizmo by default (translation + rotation). I don’t think it would even be that complicated code-wise…
Concerning your second point, there used to be an add-on called Enhanced 3D Cursor which let you store different positions and recall previous ones through a history; it basically turned the cursor into a “favourite pivots” feature. It was great, but it hasn’t been updated.

It would indeed be useful to upgrade it to a full scene object that can still do the things we want it to do (i.e. tool pivot and object placement with/without rotation). As a full-fledged object, it could do things like snap to vertices while a key is held down (in addition to being easier to transform).

Richard Culver, in my world disagreeing with someone is not killing the messenger. Nor do I think you are or aren’t a Blender fanboy; even if you were, a fanboy can bring a valid point. I also wish we could just discuss the points.

That’s already the case: Shift+RMB moves the cursor, and holding Ctrl snaps it. However, I don’t think making it into an actual object makes a lot of sense, or to put it differently, it’s probably overkill: I think all the features we’re fantasizing about can be achieved with the regular tool.

Yes it is. But the starting point was this - Blender 2.49:

Possibly the least discoverable user interface known to man. See that toggle button just called ‘A’ as an extreme example.

Other examples are things like this X button, which of course has nothing to do with the X axis - the examples are many.

Prior to Blender 2.30, there were no menus, so users had to know all the keyboard shortcuts by heart; that was literally the only way to perform actions. There was also no undo whatsoever: if you made a mistake, you had better have saved.

In Blender 2.79, still the current version just a year ago, simply clicking on objects in the viewport would not select anything; instead, it would move around a lifebuoy symbol. To select items, you had to use the right mouse button, if your mouse even had one.

Contextual menus appeared not by pressing the right mouse button, but by pressing the W key on the keyboard.

So, with the above in mind, I think you’ll agree that at least things are becoming more discoverable over time.


M is merge.

edit: but ctrl + alt + spacebar is fullscreen

So the legend is true. I’ve only known Blender from 2.38a on, and it was already quite obscure. I clearly remember thinking, “why does this buoy move around when I click”, and then I tried right-click and everything became clear. Nah, just kidding! I gave up and came back ten years later.

Hmmmm, nitpicking probably, but I last saw a one-button mouse in 1998, on a Mac that was already a few years old, and I remember thinking “wow, those still exist”, so the accessibility argument doesn’t really stand. I mean, Blender moved to left-click select because of common sense, not hardware availability.


Multi-button mice came first; they had them at Xerox PARC in the ’70s. The more refined single-button mouse came later. Designing a UI to work with a single button is of course much more difficult, so the first crude mice had many buttons.