What I mean is: have you thought about how particle information, if made accessible to other areas of Blender, could bring in new features?
So we’re getting new Jahka particles, but what if:
1) The locations of the fully opaque parts of a procedural texture, sampled in 3D space at a settable resolution, could be made readable by the particle system?
2) The mesh code could use the particle locations to create triangles, with access to the points where the procedural texture would be opaque?
In this case the particle system could arrange thousands of particles into the shape of the opaque parts of the procedural texture in 3D space (not on a mesh). The mesh code would then read these two sources of information and build a mesh framework in a rough shape of the texture (depending on the resolution of the particles), which would appear to be a true volumetric texture if the resolution is high enough. This isn’t voxels, but it could work well enough that voxels wouldn’t need to be added yet, provided the renderer is tuned to render very large poly counts efficiently.
An example would be a cellnoise texture: the particles read the opaque areas and arrange themselves into the shape of the texture in 3D space; the mesh code then reads the particles and the texture information and creates the mesh using both as a guide. To avoid overcomplicating things, partly transparent regions of the texture would be ignored.
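The sampling step above can be sketched in a few lines of Python. This is not Blender code; `density` is a hypothetical stand-in for querying the procedural texture’s opacity at a 3D point (here just a solid sphere), and the grid scan is one simple way a particle system could collect the fully opaque locations while ignoring partly transparent ones:

```python
import math

def density(x, y, z):
    # Stand-in for a procedural texture's opacity at a 3D point.
    # A real implementation would query e.g. Blender's cellnoise;
    # here it is simply a solid sphere of radius 0.8 at the origin.
    return 1.0 if math.sqrt(x * x + y * y + z * z) < 0.8 else 0.0

def sample_opaque_points(resolution, threshold=1.0):
    """Scan a resolution^3 grid over [-1, 1]^3 and keep the points
    where the texture is fully opaque (partly transparent regions,
    i.e. density below the threshold, are ignored)."""
    points = []
    for i in range(resolution):
        for j in range(resolution):
            for k in range(resolution):
                # Map grid indices to coordinates in [-1, 1].
                x = -1.0 + 2.0 * i / (resolution - 1)
                y = -1.0 + 2.0 * j / (resolution - 1)
                z = -1.0 + 2.0 * k / (resolution - 1)
                if density(x, y, z) >= threshold:
                    points.append((x, y, z))
    return points
```

The returned point list is what the mesh code would then triangulate; raising `resolution` trades memory and poly count for a closer match to the texture’s shape.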
And another thing regarding particles.
An idea to get SSS in particles: we could give each particle a sphere of influence, then use something very similar to Brecht’s SSS system, only with the effect most concentrated at the particle’s center.
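The sphere-of-influence idea could use a weight that is strongest at the particle center and fades to nothing at the edge of the sphere. Here is a minimal sketch of such a falloff; the quadratic curve is just an illustrative assumption, since a real system would use a proper diffusion profile like Brecht’s SSS implementation does:

```python
def sss_contribution(distance, radius):
    """Scattering weight for a shading point at `distance` from a
    particle's center, given the particle's sphere-of-influence
    `radius`. The weight is 1.0 at the center, falls off smoothly,
    and reaches 0.0 at the edge of the sphere; beyond the radius
    the particle contributes nothing.
    (Hypothetical quadratic falloff, chosen only to show the
    "most concentrated in the center" behaviour.)"""
    if distance >= radius:
        return 0.0
    t = distance / radius        # 0 at the center, 1 at the edge
    return (1.0 - t) ** 2        # concentrated toward the center
```

Summing this weight over all particles whose spheres cover a shading point would give the scattered-light estimate there.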
And an idea for reflected and refracted particles. I’m not sure how this one could be solved, but perhaps if the raytracer had access to the particle positions, it could render them as if they were meshes with a UV-mapped blend texture.