Non-photorealistic rendering script/plugin idea

Alright, here’s the basic idea. I’m working on a project to attempt to make my blender renders look sketchy and drawn (a stark contrast to the recent raytrace-induced trend toward realism). Basically, I’m going for a pencil sketch kind of look similar to this image:

http://misaligned.net/gallery/traditional/2003_self_7yo.jpg

Getting the model to look right isn’t so much of an issue (although I’d like a better means of adding edges than my current hackish solution). My issue is trying to get that nice rough pencil sketch border around the model. Granted, I could do it in post, but for animations, that can get extremely tedious. Some attempts I’ve made at mimicking this look have included trying particles, shadow-only lights with a plane behind the model, and repeated loud cursing. Particles don’t give me the nice linear tangential edges and shadow-only lights didn’t help much at all. Cursing at my monitor was the least successful thing ;). I’ve also rendered the character with the background alpha’d out and toyed with the GIMPressionist plugin in the GIMP. The effect there is alright, but it’s a bit too regular and constrained to the model’s outline… and it doesn’t have the nice tangential lines either.

That said, I think I may have a solution, but I’m not sure where I should start as far as implementation goes. Essentially, the algorithm (roughly) goes as such (a rough code sketch follows the list):

  1. Render scene with an alpha channel (should be crisp; no blurry gradient stuff, alpha is either 0 or 255)
  2. Pick points at regular intervals along the edge of the alpha channel and calculate the equation for a line tangent to the alpha mask at each point.
  3. At each point draw a line of predetermined length along the proper tangent for that point.
  4. For each line, extrapolate more lines with randomly varying lengths and intensities to an average (but randomly varying per line) distance away from the model. Use the alpha channel as a mask to prevent these lines from drawing over the model.
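To make that more concrete, here’s a very rough, untested sketch of steps 2 through 4 in Python. I’m assuming PIL and NumPy here purely for convenience, and the filename and all the numbers are made up:

[code]
import random
import numpy as np
from PIL import Image, ImageDraw

# step 1's output: a render saved with a hard 0-or-255 alpha channel
render = Image.open("render.png").convert("RGBA")
alpha = np.asarray(render)[:, :, 3].astype(float)

# step 2: the tangent at an edge point is perpendicular to the alpha gradient
# there; the gradient points *into* the model, so -gradient points away from it
gy, gx = np.gradient(alpha)
ys, xs = np.nonzero((alpha > 0) & (np.abs(gx) + np.abs(gy) > 0))

strokes = Image.new("RGBA", render.size, (255, 255, 255, 255))
draw = ImageDraw.Draw(strokes)
step, base_len = 8, 12          # sampling interval and base stroke length

for x, y in list(zip(xs, ys))[::step]:
    g = np.array([gx[y, x], gy[y, x]])
    n = np.linalg.norm(g)
    if n == 0:
        continue
    tx, ty = -g[1] / n, g[0] / n          # unit tangent (step 3)
    ox, oy = -g[0] / n, -g[1] / n         # unit normal, pointing outward

    # step 4: a few extra strokes per point, with random length, intensity,
    # and distance from the model
    for _ in range(3):
        length = base_len * random.uniform(0.5, 1.5)
        dist = random.uniform(0.0, 10.0)
        cx, cy = x + ox * dist, y + oy * dist
        shade = random.randint(60, 160)
        draw.line([(cx - tx * length / 2, cy - ty * length / 2),
                   (cx + tx * length / 2, cy + ty * length / 2)],
                  fill=(shade, shade, shade, 255))

# the alpha-mask part of step 4: pasting the render back over the strokes
# with its own alpha keeps the strokes from drawing over the model
strokes.paste(render, (0, 0), render)
strokes.save("sketchy.png")
[/code]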

That should basically be it. I’m sure I missed something and there’s definitely room for improvement, but that’s the idea. I’m not afraid to try and code it myself, but I’m a little unsure of where to start. This seems like a strong candidate for a Sequencer plugin, but I’m not sure if this sort of thing is possible with the Sequencer plugin programming interface. I’m assuming that since plugins need to be written in C, the heavy lifting is done there and the Sequencer interface is merely a means of passing frames to and from the C code. If that’s the case, would I be better off trying to script this for the GIMP with Script-Fu (or Python-Fu, or Perl-Fu… you get the idea), since it already has some fairly strong pre-built functionality for drawing on 2D images?

I’d really like to try my hand at cranking this out, but I think I’m in need of a bit of direction from someone more familiar with this sort of thing. All advice, suggestions, and assistance will be greatly appreciated.

actually there are a bunch of ways to get the edges of an image

so, there is the depth buffer, which will do good outlines and some self-outlines (like if from the camera’s point of view the nose would cast a shadow on the face behind it)
some crazy thing based on changes in normals (I think you would have to find a way to do this yourself)
maybe just an edge detect filter based on the rendered image? so you can get outlines on the shadows too
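
just to illustrate the depth buffer one, something along these lines would probably do it (a minimal sketch, assuming the z-buffer has already been saved out as a greyscale image and that PIL/NumPy are around; the threshold is a guess):

[code]
# a minimal sketch of the depth-buffer edge detect, assuming the z-buffer has
# already been dumped to a greyscale image (nothing blender-specific here)
import numpy as np
from PIL import Image

depth = np.asarray(Image.open("depth.png").convert("F"))

# an edge is anywhere the depth jumps by more than some threshold between
# neighbouring pixels; object outlines and self-outlines (nose against face)
# both show up, since both are depth discontinuities
dx = np.abs(np.diff(depth, axis=1, prepend=depth[:, :1]))
dy = np.abs(np.diff(depth, axis=0, prepend=depth[:1, :]))
edges = (dx + dy) > 4.0          # the threshold is a guess, tune per scene

Image.fromarray((edges * 255).astype(np.uint8)).save("depth_edges.png")
[/code]

the normals version would be the same couple of lines run over each channel of a rendered normal pass instead of the depth image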

umm, some images from an ati demo
http://home.earthlink.net/~nwinters99/screenshots/edge_generators.jpg
http://home.earthlink.net/~nwinters99/screenshots/shadow_edge_generators.jpg
http://home.earthlink.net/~nwinters99/screenshots/end_result_hatching.jpg

this was done in real time, but it uses some relatively advanced techniques that will not be trivial to reproduce given current access to blender’s renderer

I think you ought to be able to do pretty well if you render to a sequence of images and their depth buffer, and split that up later.

you could probably script a lot of it using the gimp, all but maybe separating the image from the depth buffer when you render to iriz+zbuffer (I don’t think the gimp can open that)

Hrm… now that I think about it, I wonder if it’s possible to simply extend Blender’s Edge settings to accommodate this level of random sketchiness. As they currently work, they aren’t that great for toon ink, but they might provide a decent starting point for what I’m looking for. Of course, that means I have to tinker with the Blender source rather than do my own plugin… Oh well, I’ve been meaning to dig through the source anyway.

/me goes to check out bf-blender from cvs.

SLiM?
https://blenderartists.org/forum/viewtopic.php?t=14581&highlight=?

I actually forgot about that script! Thanks for the reminder, Oyster. I’ll go check that out as well. I haven’t heard anything about that script (or seen many posts by flippyneck) since Sept, but here’s to hoping that he’s still working on it (and that I can adapt it to my needs).

Fweeb, it’s nice to hear from someone else interested in NPR.

I’m afraid that I’ve done no more work on SLiM since September, as you say. I keep meaning to go back to it, but I’m no natural at maths and programming, so it’s heavy going for me. As you’ll know if you’ve downloaded SLiM, my approach was to locate mesh edges in the scene and create ‘lines’ as geometry. It works fairly well, but my big stumbling block was that I couldn’t find a way to calculate which points on the mesh edges were adjacent, so I could only render unconnected lines instead of combining them into curves. Fine for technical drawings but not much good for nice organic models. Also, to extend SLiM to work as I really wanted would have meant writing a scanline/z-buffer renderer in Python, which would have been unusably slow and a pointless duplication of Blender’s existing renderer.
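
For what it’s worth, the adjacency idea I was fumbling toward was roughly the sketch below. The silhouette edges are assumed to be (vertex index, vertex index) tuples found beforehand; it just walks shared vertices to join edges into polylines, and I never got anything like it working robustly on real meshes:

[code]
# rough sketch of the adjacency step: silhouette edges are assumed to be
# (vertex_index, vertex_index) tuples found beforehand; this just walks
# shared vertices to join them into polylines
def chain_edges(edges):
    touching = {}                     # vertex index -> edges that use it
    for e in edges:
        for v in e:
            touching.setdefault(v, []).append(e)

    unused = set(edges)
    chains = []
    while unused:
        e = unused.pop()
        chain = list(e)
        for _ in range(2):            # grow from the tail, then from the head
            while True:
                ext = [n for n in touching[chain[-1]] if n in unused]
                if not ext:
                    break
                nxt = ext[0]
                unused.discard(nxt)
                chain.append(nxt[1] if nxt[0] == chain[-1] else nxt[0])
            chain.reverse()
        chains.append(chain)
    return chains

# e.g. chain_edges([(0, 1), (1, 2), (2, 3), (7, 8)]) gives [0, 1, 2, 3] and
# [7, 8] as two chains (in whatever order the set happens to give them back)
[/code]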

If you are up to tackling the Blender source, then I think this is the approach used by commercial renderers such as this one: http://www.finalrender.com/products/products.asp?UD=10-7888-35-788&PID=37 . I would absolutely love to see something like this in Blender and I’m sure plenty of others would too.

1 - Locate the edges of the geometry in much the same way as SLiM does.
2 - Project these edges onto the image plane
3 - Render the edges in a 2D vector format which allows the line settings to be tweaked without re-rendering the entire scene. (I think SVG could be used for this? A rough sketch of steps 2 and 3 follows this list.)
4 - Comp the final line image over the rendered image.
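
Here’s roughly how I picture steps 2 and 3, as a rough, untested sketch; it assumes the edges are already found and that you can get a combined 4x4 projection matrix out of Blender somehow (that part is hand-waved), and the SVG is just written out as text:

[code]
# rough sketch of steps 2 and 3: edges already found, and a combined 4x4
# projection matrix assumed available (getting that out of Blender is
# hand-waved here); the SVG is just written out as text
def project(pt, m, width, height):
    # pt is an (x, y, z) tuple; m is a row-major 4x4 matrix as nested lists
    x, y, z = pt
    cx = m[0][0] * x + m[0][1] * y + m[0][2] * z + m[0][3]
    cy = m[1][0] * x + m[1][1] * y + m[1][2] * z + m[1][3]
    cw = m[3][0] * x + m[3][1] * y + m[3][2] * z + m[3][3]
    # perspective divide, then map the -1..1 range to pixel coordinates
    return ((cx / cw * 0.5 + 0.5) * width,
            (1.0 - (cy / cw * 0.5 + 0.5)) * height)

def write_svg(polylines, m, width, height, path="lines.svg"):
    out = open(path, "w")
    out.write('<svg xmlns="http://www.w3.org/2000/svg" '
              'width="%d" height="%d">\n' % (width, height))
    for line in polylines:            # each line is a list of 3D points
        pts = " ".join("%.1f,%.1f" % project(p, m, width, height)
                       for p in line)
        # step 3's big win: stroke width and colour can be tweaked here
        # without touching the render at all
        out.write('  <polyline points="%s" fill="none" '
                  'stroke="black" stroke-width="1.5"/>\n' % pts)
    out.write('</svg>\n')
    out.close()
[/code]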

Alternatively I think that a sequence plugin could also be written that works on the alpha/z buffers as you described. This would be easier but not as versatile.

You might also want to check out Hiroshi Saito’s site:

http://www.dims.or.jp/blender/gallery/index.html

Take a look at the ‘Beyond the border’ images. It took me ages to work out how he did this, but I got it in the end. He uses dupliverts to parent ‘edges’ to the mesh, scales them in the space of a single frame and then renders with motion blur. Not quite the control of a scripted solution, but it still looks pretty good.

I think I’ve rambled enough now. Keep us posted on any progress.

'Til next time,
Flippyneck

this would rock so much…

Well, it’s possible, and I’m working on it. It turns out that the simplest way to do this will likely be to write a GIMP plugin, as it already has drawing routines and path-tracing routines built into it. The bonus, though, is that I can write the script in Python-fu… so it may be eventually portable to Blender at some point… maybe.

There is one snag, though. The API call I need in the GIMP’s scripting interface isn’t implemented in GIMP 2.0 yet… so I have to look into the GIMP source to try to get that to work first. I’ll try to keep you guys posted with this as progress gets made. In the meantime, if anyone has any decent references to code and math for tracing along a bezier curve, please let me know :).
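
For context, here’s about as far as I’ve gotten on my own: just the standard cubic Bézier formula sampled at evenly spaced t values, which (as far as I understand it) does not give evenly spaced points along the curve itself, and that’s the bit I’m stuck on:

[code]
# as far as I've gotten: the standard cubic bezier in Bernstein form, sampled
# at evenly spaced t values (which is NOT evenly spaced along the curve)
def bezier_point(p0, p1, p2, p3, t):
    # point at parameter t (0..1) on the cubic bezier through control
    # points p0..p3, each a 2D (x, y) tuple
    u = 1.0 - t
    b0, b1, b2, b3 = u * u * u, 3 * u * u * t, 3 * u * t * t, t * t * t
    return (b0 * p0[0] + b1 * p1[0] + b2 * p2[0] + b3 * p3[0],
            b0 * p0[1] + b1 * p1[1] + b2 * p2[1] + b3 * p3[1])

def trace_bezier(p0, p1, p2, p3, steps=32):
    return [bezier_point(p0, p1, p2, p3, i / float(steps))
            for i in range(steps + 1)]

# e.g. trace_bezier((0, 0), (0, 10), (10, 10), (10, 0)) gives 33 points
# running from (0, 0) to (10, 0)
[/code]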

A bit different fx, but here’s my pen shader… :wink:
https://blenderartists.org/forum/viewtopic.php?t=22072&highlight=

Wow! Really nice. I hadn’t seen that topic. Thanks for pointing me to it. That’s some great stuff.

Alright, I put some work into this. Implementing the necessary scripting API call in the GIMP is a little bit beyond my time and capabilities right now, so I have an interim solution. First, here’s the model I made to replicate the above drawing:

http://misaligned.net/gallery/3d/2004_self_orig.jpg

And this is what it looks like after the script is run:

http://misaligned.net/gallery/3d/2004_self.jpg

And then, as a bonus, I’ve created a 360-degree spin animation of the figure to see how it looks with animation. I still have to play with the GIMP Animation Package so that I don’t have to run the script on each frame by hand (that or play with GIMP-Perl)… but here you go:

http://misaligned.net/gallery/anim/self_spin.avi - 892 kB (let it loop, it’s fun!)
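
What I’m hoping the GIMP Animation Package (or plain batch mode) will save me from is hand-running the script on every frame; the loop would be something like the sketch below, where python_fu_sketchify and the frame path are just stand-in names for my script, not anything that’s actually registered yet:

[code]
# what I'm hoping to end up with instead of clicking through every frame:
# a loop over the rendered frames, run through GIMP's Python-Fu.
# "python_fu_sketchify" and the frame path are stand-ins, not real names yet.
import glob
from gimpfu import *

for name in sorted(glob.glob("/tmp/spin/frame_*.png")):
    image = pdb.gimp_file_load(name, name)
    drawable = pdb.gimp_image_get_active_drawable(image)
    pdb.python_fu_sketchify(image, drawable)      # hypothetical call
    pdb.gimp_image_flatten(image)
    pdb.gimp_file_save(image, pdb.gimp_image_get_active_drawable(image),
                       name, name)
    pdb.gimp_image_delete(image)
[/code]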

Fweeb,
This looks very promising, two crits, if I may:

  • The body should look like it was drawn with small pencil strokes; currently it is dotty
  • The animation presents another problem: the image is too random, dots show up and then disappear without a reason (when a human creates the animation it doesn’t look so distorted)

but all in all, VERY promising

some off-topic
21 Apr 2004 Liquid+ Released NOW!
NPR (Non Photo Realistic) renderer Liquid+ for 3ds max is released now. Liquid+ achieves hand-painted watercolor looks, with bleed and jitter, for 3DCG animation.
A free demo is also available to try.

http://www.psoft.co.jp/visual/en/
http://61.187.55.84/images/upload/2004/04/23/16580597.jpg

http://61.187.55.84/images/upload/2004/04/23/98012030.jpg

http://61.187.55.84/images/upload/2004/04/23/21920962.jpg


Thanks! I’d like to have the pencil-stroke look, but without a lot more work (or a scripting interface to GIMPressionist ;)), it’s gonna take a while. As for the random black dots… I’m kind of partial to them… but you’re right. It’s not a hand-drawn look… more like bad photocopying. I’ve changed the edge-detection in the script and the results are a lot cleaner… maybe too clean. I’ll post some images (and maybe another animation) later today.

Oyster… nice, um… advertisement.

I think NPR in Blender is a very important subject, important enough to have a website around it.

Indeed. Maybe I’ll add a section to my website on that. flippyneck and Enriq776 have done good work on toon-based npr. In fact, I think they have sites already.

– edited to note that only the modeling and initial rendering were done in Blender; the sketchifying was done in post with a GIMP script that I wrote –

zygom wrote:
I think NPR in Blender is a very important subject, important enough to have a website around it.

Fweeb wrote:
Indeed. Maybe I’ll add a section to my website on that. flippyneck and Enriq776 have done good work on toon-based npr. In fact, I think they have sites already.

NPR is an important area. It’s not just about toons either; really good options for NPR rendering would be a perfect complement to things like the BlenderCAD project. I am in the process of gathering together all of the available info on NPR in Blender and writing a tute/article. When it’s done I’ll post it on my site.

Fweeb, how easy is it to write Gimp plugins like yours? I’ve written a few halftoning scripts using PIL for Python but they are exceptionally slow so a speedier platform would be interesting…

GIMP scripts are almost stupid-easy :). They can be written in a variety of languages (mine is in python), and the api is available for browsing through the gimp interface. If you have a copy of GIMP, just look for the *.py files in the plugins directory. They’re pretty easy to dissect, and they give you a good idea of the necessary structure. I believe that the GIMP website has some information on scripting as well (although not so much with python, mostly with perl and scheme).
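
If it helps, the basic skeleton of one of these is only this much; it’s a do-nothing example with made-up names and menu path (the *.py files that ship with GIMP show the real thing):

[code]
#!/usr/bin/env python
# minimal skeleton of a GIMP Python-Fu plug-in, just to show the structure;
# the names and the menu path are made up, check the *.py files that ship
# with GIMP for real examples
from gimpfu import *

def sketch_border(image, drawable, stroke_len):
    # the actual work on the image goes here, via pdb.* calls
    pdb.gimp_message("stroke length: %d" % stroke_len)

register(
    "python_fu_sketch_border",              # name in the procedure database
    "Rough pencil-sketch border",           # blurb
    "Draws sketchy strokes around the alpha edge of the active layer",
    "Your Name", "Your Name", "2004",
    "<Image>/Filters/Artistic/Sketch Border...",
    "RGBA",                                 # image types the plug-in accepts
    [(PF_INT, "stroke_len", "Stroke length", 12)],
    [],                                     # no return values
    sketch_border)

main()
[/code]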