FYI: Observable Notebooks Using Blender-Generated WebGL Images for Dynamic Data Visualization

A listing of notebooks published on Observable that use Blender-generated WebGL images for data visualization:

  • All images had data embedded within Blender’s Custom Properties.

  • The data were then exported as extras within glTF files.

  • Finally, the extras were transformed into Three.js userData objects for the data visualization.
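
The pipeline above can be sketched in plain JavaScript. This is a minimal, dependency-free illustration of how each glTF node's `extras` (the exported Custom Properties) end up on an object's `userData`; the glTF structure and property names below are hypothetical examples, not taken from the actual notebooks:

```javascript
// Hypothetical parsed glTF JSON: each node may carry an `extras` object,
// which Blender's glTF exporter fills from the object's Custom Properties.
const gltf = {
  nodes: [
    { name: "Cube", extras: { population: 1200, region: "north" } },
    { name: "Sphere", extras: { population: 800, region: "south" } },
    { name: "Plane" } // no embedded data
  ]
};

// Copy each node's extras onto a userData object, mirroring what
// Three.js's GLTFLoader does when it builds the scene graph.
function extrasToUserData(nodes) {
  return nodes.map(node => ({
    name: node.name,
    userData: node.extras ? { ...node.extras } : {}
  }));
}

const objects = extrasToUserData(gltf.nodes);
console.log(objects[0].userData.population); // 1200
```

In a real notebook this mapping happens automatically inside GLTFLoader; the sketch only makes the data flow explicit.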

The second notebook demonstrates several use cases and integration with Vega-Lite for dynamic, data-driven image filtering. The third provides detailed documentation.

Note: The third notebook also details how data can ‘drive’ an image even if the data is not embedded in the image.

The requirement is that the data and the objects in the image share a common key attribute. In essence, an object’s ‘name’ attribute must exactly match your data’s ‘name’ attribute for the data-to-image join to work.

That requirement means every object must be uniquely named, and any object to be ‘driven’ by the data must have its own unique material.
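
A minimal sketch of that name-based join, in plain JavaScript with no Three.js dependency. The scene objects and data rows here are hypothetical; in a real notebook the objects would come from a loaded glTF scene, and `material.color` would be a Three.js Color on each object’s own material:

```javascript
// Hypothetical scene objects, each uniquely named with its own material.
const sceneObjects = [
  { name: "Cube", material: { color: "#888888" } },
  { name: "Sphere", material: { color: "#888888" } }
];

// Hypothetical external data, keyed on the same 'name' attribute.
const data = [
  { name: "Cube", value: 0.9 },
  { name: "Sphere", value: 0.2 }
];

// Index objects by name, then 'drive' each matched object's material
// from its data row (here: highlight values above a threshold).
function joinDataToObjects(objects, rows) {
  const byName = new Map(objects.map(o => [o.name, o]));
  for (const row of rows) {
    const obj = byName.get(row.name); // exact-match join on 'name'
    if (obj) obj.material.color = row.value > 0.5 ? "#ff0000" : "#0000ff";
  }
  return objects;
}

joinDataToObjects(sceneObjects, data);
console.log(sceneObjects[0].material.color); // "#ff0000"
```

Because the join is an exact string match, a duplicated object name or a shared material would cause one data row to drive several objects at once, which is why both must be unique.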

The following notebook demonstrates this pattern with randomly generated data on several WebGL images: