NPR in Blender and the future of Freestyle

Hi all, been playing around with Freestyle today and I like it a lot. Looking around on the forum and dev site, I could not see if the project was still in active development.

Is it still being developed, or is there a more recent NPR alternative being pushed?

I also use Cinema 4D in my workflow and it has a beast of an NPR toolset in Sketch and Toon. I would love to get the functionality and power of that inside Blender.

I didn’t see any activity on Freestyle on the dev site for a few months.
And I have the impression that just rendering with some tricks (which I forgot) is faster, if you just want lines. Someone has a nice portfolio here and it looks amazing. He is not using Freestyle, because it’s slower.

To me it seems the devs are now concentrating on 2.8, Eevee, the PBR viewport, Alembic support, etc.

That is what I was thinking as well. I suspect with some node wrangling, illustrated looks are in reach. I will go searching for that portfolio, thanks.

You can use render passes to highlight objects:

http://answers.unity3d.com/questions/955897/unity-5-can-you-make-a-toon-shader-using-the-stand.html

I also hope the development resumes; I use Freestyle a lot in my work.
These are a few links where you will find news about Freestyle.


https://blendernpr.org/

If only it could use OpenCL or be multithreaded. That would be great.

This site should be of interest to you:

https://blendernpr.org/

especially the posts about BEER.

Thanks pafurijaz & BrilliantApe for the links.

A very promising commit has appeared that bodes well for the future of NPR in 2.8. It seems that Freestyle may well be a realtime option with the Clay/Eevee engines in the future! To the OP: I also use C4D and Sketch and Toon, but having Freestyle realtime with Eevee would give it some serious competition for NPR work :RocknRoll:

Freestyle is back for 2.8

It works with Cycles and Blender Internal for now.

The Blender Internal support will disappear, but we should be able to
integrate it with the Draw manager and use it with Clay, Eevee, …

https://git.blender.org/gitweb/gitweb.cgi/blender.git/commit/c2191912cabeab3dd0da12863a8af8e57a70df03

OMG, that’s huge news, although I wish the Freestyle options were a bit more interchangeable within Blender. As it is, it’s a faux render pass that you cannot interact with in post.

I would love to see NPR shaders for the viewport.
Wouldn’t GLSL shaders be able to replace Freestyle completely? And if not, what features would be missing?

Wouldn’t GLSL shaders be able to replace Freestyle completely? And if not, what features would be missing?

Freestyle actually draws real lines; GLSL shaders can’t really do that, at least not on their own. They’re just one part of the pipeline. You can create the appearance of lines, e.g. through edge detection in a post-process, or from the normals in a shader, but that’s more limited.
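
As a minimal sketch of that edge-detection idea (the function name and the toy depth buffer are made up for illustration, with NumPy standing in for a post-process shader):

```python
import numpy as np

def depth_edges(depth, threshold=0.1):
    """Mark pixels where the depth buffer changes abruptly.

    This mimics a post-process fragment shader that samples neighboring
    depth values and outputs a line color at discontinuities: the
    *appearance* of lines, not real strokes."""
    # Finite differences against the left and top neighbors.
    dx = np.abs(np.diff(depth, axis=1, prepend=depth[:, :1]))
    dy = np.abs(np.diff(depth, axis=0, prepend=depth[:1, :]))
    return (dx + dy) > threshold

# Toy depth buffer: a "near" square (depth 0.2) on a far background (1.0).
depth = np.full((8, 8), 1.0)
depth[2:6, 2:6] = 0.2

edges = depth_edges(depth)  # True only in the band around the square's silhouette
```

The result is just a screen-space edge mask; there is no notion of strokes or per-line stylization the way Freestyle has.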

You could write a fancy line renderer with OpenGL too. That basically means writing an entire new renderer, though.

Wouldn’t the ‘fancy line renderer’ be the Eevee engine? It renders the outline and wireframe shaders, so it should also be able to render other GLSL shaders (seeing as the wireframe shader and such are GLSL shader files).

Shader-based wireframes aren’t “real” lines; they are the result of rendering the fragments at the edges of the triangles differently. For instance, you couldn’t just make them “scribbly” (at least not without modifying the input geometry), because you are confined to the extents that the rasterized triangle covers. Also, the input is a triangle, so loose edges become a special case. For actual lines, OpenGL has a different primitive, which (as you may guess) rasterizes a line between two points.

Again, a big part of the pipeline is not actually controlled by shaders - the input primitives being one of them. With modern features like geometry or compute shaders, it is possible to also generate geometry within shaders, but this still requires a framework of non-shader code to support it.

AFAIK Freestyle also doesn’t change the input primitives, but it is still able to create its lines based on the available information (normals, depth, material/object IDs, etc.). The same information is available to shaders.

You said that it wouldn’t be possible to create scribbly lines/outlines without changing the input geometry, but please see the following link for a shader that does exactly that (if I understood you correctly).

The following link is a nice example of hidden-line rendering:

Freestyle is a software renderer; it can do whatever it wants. When I’m talking about “input primitives”, I’m talking about the OpenGL pipeline, not the abstract concept. The OpenGL pipeline puts constraints on what can and can’t be done in a shader. There are at least two different shader types, vertex and fragment shaders. Only a vertex shader can displace geometry; only a fragment shader gets to output the final color for a single pixel (but not neighboring ones). Neither can create primitives. When laymen talk about GLSL shaders, they’re most likely talking about fragment shaders.

Depending on the hardware, there are also geometry shaders, which can create a limited amount of geometry (though almost nobody uses them, because they are inefficient). There are tessellation control shaders, which control the subdivision rate, and tessellation evaluation shaders, which control the position of the resulting vertices. There are also compute shaders, which work outside of the pipeline and process arbitrary data.

All of this stuff needs to be controlled by the application code; it’s not just about the shaders. To do something like Freestyle, you would need to render multiple passes. You need to write a whole renderer, not just a shader.

You said that it wouldn’t be possible to create scribbly lines/outlines without changing the input geometry, but please see the following link for a shader that does exactly that (if I understood you correctly).

Nothing on Shadertoy actually works with arbitrary 3D meshes; it’s all just a fragment shader that is run on a full-screen quad. Most of that stuff is distance-field rendering. The scenes are “typed in”. You can do some cool stuff there, but it’s not generally applicable.
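
To make the distance-field point concrete, here is a tiny sketch in Python/NumPy (hypothetical names; a Shadertoy shader would evaluate this per fragment in GLSL instead). The scene is literally a formula, not mesh data:

```python
import numpy as np

def circle_sdf(px, py, cx, cy, r):
    """Signed distance from each point to a circle: negative inside."""
    return np.hypot(px - cx, py - cy) - r

# "Type in" the scene by evaluating the distance field at every pixel.
ys, xs = np.mgrid[0:16, 0:16]
dist = circle_sdf(xs, ys, 8.0, 8.0, 5.0)

inside = dist < 0              # filled silhouette
outline = np.abs(dist) < 0.5   # thin band around the zero level set: the "line"
```

Because the shape exists only as a formula, you get clean outlines for free, but there is no way to feed an arbitrary Blender mesh into it.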

You can technically write an entire raytracer just using fragment shaders (it has been done), but then you’re not using the GPU’s hardware geometry processing at all and we’re talking about entirely different levels of performance (i.e. more like Cycles).

EDIT: looks like that scribbly shader is multiple shader passes with an edge-detecting post-process. This can work on arbitrary geometry, but the renderer framework needs to support it (i.e. passing through all the required data). Again, these are not “real” line strokes (from a specified point A to B), but distorted edge pixels.
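
A sketch of that two-pass idea (hypothetical names, NumPy in place of the shader passes): take an edge mask from a first pass, then sample it at noise-displaced coordinates in a second pass, which smears existing edge pixels rather than drawing strokes from A to B.

```python
import numpy as np

def jitter_edges(edge_mask, amplitude=1, seed=0):
    """Second pass: sample the edge mask at noise-displaced coordinates,
    smearing straight edges into a scribbly band of distorted pixels."""
    rng = np.random.default_rng(seed)
    h, w = edge_mask.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Random per-pixel offsets, clamped to the image bounds.
    jy = np.clip(ys + rng.integers(-amplitude, amplitude + 1, (h, w)), 0, h - 1)
    jx = np.clip(xs + rng.integers(-amplitude, amplitude + 1, (h, w)), 0, w - 1)
    return edge_mask[jy, jx]

# First pass result: a perfectly straight one-pixel vertical edge.
edge_mask = np.zeros((8, 8), dtype=bool)
edge_mask[:, 4] = True

scribbly = jitter_edges(edge_mask)
```

Note that the "scribble" never leaves the neighborhood of the rasterized edge: pixels far from it stay empty, which is exactly the limitation described above.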

Alrighty, that makes things somewhat clearer.

I guess my idea of the matter was too simple. :slight_smile: