Blender's "Editor" Interface Paradigm, Explained

I created this thread to explain the “Editor” paradigm that Blender’s interface is based upon, and its benefits. Understanding the rules and guiding principles of the current interface system is vital to any productive discussion about the UI, so this thread aims to bring more clarity to the debate.


Blender’s interface is based upon a unique paradigm in which an application window is split into multiple “Frames”. A “Frame” is an abstract division of the space that the Blender application window occupies on your screen. The content displayed inside a frame (all of its buttons, panels, and functions) is called an “Editor”. Editors are parts of the software dedicated to specific tasks; we know them as the “3D View”, the “Node Editor”, and so on.

The concept of the “Editor” is the foundation upon which the interface is designed, and the basic organizational unit for grouping features and functions. Each Editor generally contains a “Header” (a strip of menus and controls, like the one at the bottom of the “3D View”), one or more “Regions” (the side panels that open when you press “T” and “N” in the 3D View), and a space in which to view and manipulate the thing you’re working on (the “Workspace”).
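This frame/editor/region hierarchy is mirrored directly in Blender’s Python API, where frames appear as “areas” and each area reports which Editor it currently hosts. As a rough illustration only (a minimal sketch, assuming it is run from Blender’s built-in Python console or Text Editor, since the bpy module only exists inside Blender):

    import bpy

    # Walk the hierarchy: window -> areas (the "Frames") -> regions.
    # area.type names the Editor filling that frame, e.g. 'VIEW_3D' or 'NODE_EDITOR'.
    for window in bpy.context.window_manager.windows:
        for area in window.screen.areas:
            print("Editor:", area.type)
            for region in area.regions:
                # Region types include 'HEADER', 'TOOLS', 'UI' and 'WINDOW' (the workspace).
                print("  Region:", region.type, region.width, "x", region.height)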

From the wiki:
–“Blender’s interface…allows you to customize your interface to suit your needs…Your screen can be organized exactly to your taste for each specialized task.”
–“The 3 Rules (The interface is based on 3 main principles): Non Overlapping, Non Blocking, Non Modal”


By grouping related features, functionality, shortcuts, and the work area into a single localized space on the screen for any given task, the “Editor” paradigm allows Blender to effectively manage an otherwise unmanageable set of thousands of complex features, options, and functions. Because each Editor has its own menus in its own “Header”, the problem of overloading a single top menu bar is avoided.
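To see what that means in practice (again only a sketch, not official documentation): because the header and its menus belong to the Editor rather than to one global menu bar, changing which Editor occupies a frame swaps the menus along with it. In the Python API this is just a matter of assigning a new area type; 'VIEW_3D' and 'NODE_EDITOR' are standard identifiers, but the little helper function below is purely illustrative.

    import bpy

    # Illustrative helper: swap the Editor shown in a frame. The header (and its
    # drop-down menus) changes along with it, so no single global menu bar is needed.
    def switch_editor(area, editor_type):
        area.type = editor_type

    # Example: turn the first 3D View on the current screen into a Node Editor.
    for area in bpy.context.screen.areas:
        if area.type == 'VIEW_3D':
            switch_editor(area, 'NODE_EDITOR')
            break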

There are other interface paradigms in use today, and many have argued for their adoption in Blender. One of the most familiar is a “Top Drop-Down” or “Ribbon” menu above a single unified workspace (as used by Microsoft Word and others), sometimes supplemented by floating menus or panels containing groups of related options. Many Adobe programs, such as Photoshop, use a “Palette” paradigm with a unified, tabbed workspace area.

While every interface paradigm has pros and cons, I think Blender’s is robust, versatile, and powerful enough to accommodate all of its features now and well into the future. Exactly how the program is structured and organized visually, however, is a separate but related question: how to most effectively apply the underlying design principles. Here there is much room for improvement and debate, and many of the opportunities relate to the need for consistency and standards.

I hope you’ll agree that while Blender’s interface could benefit from many changes and improvements, the underlying “Editor” interface paradigm should stay. We can develop and evolve an effective set of standards that fits within this paradigm, solves many of the existing problems with Blender’s interface today, and leaves enough flexibility to tackle the problems of tomorrow.

Links:
http://wiki.blender.org/index.php/Doc:2.6/Manual/Interface
http://wiki.blender.org/index.php/Doc:2.6/Manual/Interface/Window_system
http://wiki.blender.org/index.php/Doc:2.6/Manual/Interface/Window_system/Arranging_frames
http://wiki.blender.org/index.php/Doc:2.6/Manual/Interface/Window_types
http://wiki.blender.org/index.php/Doc:2.6/Manual/Interface/Window_system/Headers
http://wiki.blender.org/index.php/Doc:2.6/Manual/3D_interaction/Navigating
