Can Freestyle do this type of cross hatching?

This is the best cross hatching thing I’ve ever seen. This is what I’ve been wanting out of a CG program forever. My mind flips out when I see this. Can this be done in Blender? With Freestyle?
holy cow this is exactly what I want to see!

Basically it can, but you’d need to mark all the edges you want to be drawn. Maybe this will be possible in the future, but not right now.

Not this exact technique because it appears that this bakes the cross-hatching right to the models’ image textures. Freestyle doesn’t work that way.

What I’m wondering is… Does the cross hatching respond to light movement the way this does? None of the examples on the Freestyle site are this complicated.

Freestyle is not a shading system in any way, and it isn’t trying to be. Because of that, it can do a lot of things a shading system can’t.

Freestyle creates lines for different edge types (borders, edges, creases, silhouettes, edge intersections in the future, etc.)

What this video shows looks like some sort of non-photorealistic shading system.

I personally don’t like the way it looks. To me it doesn’t look like crosshatching so much as crosshatch-style texture/material generation, which looks very much like textures on 3D objects. Regardless, it would be cool if one of the renderers could do this kind of thing.

I think Blender could benefit from a cool/robust non-photorealistic shading system that works with Cycles. I think 3ds Max has its Nitrous rendering thing that looks pretty cool. You might want to check that out to see if there are crosshatching-style possibilities. http://www.youtube.com/watch?v=0wYU4kaZ8UM

So no, Freestyle can’t be used for this, but that’s not because it’s any less complicated or capable. It’s like asking if a dentist can fix your anus sores.

I’m not really asking whether Freestyle SHOULD do this, just wondering if Freestyle is my answer or if I need to keep looking.
I found this paper: http://gfx.cs.princeton.edu/proj/hatching/hatching.pdf which seems to describe what I see in this video. Man, I want to see this implemented. My cousin just teased me, saying he could write a plugin to do it. We’ll see if anything happens with that. I know he contributed something big to GIMP, so maybe there is something there.

You definitely need to keep looking for a crosshatch shading solution. I also wish Blender had good NPR rendering/shading options. Luckily, Freestyle is the best stylized edge-line generator I’ve ever used.

Yes, Freestyle makes great edges, no doubt.
I hope my cousin tinkers with implementing this idea. I know he could do it, but I don’t know if he has the time/interest/incentive.

I have been working on NPR shaders for a while now, and this hatch one is doable via material nodes, probably even tweakable live in GLSL mode. I will look into it too, but here is one I have cooked up recently:

I saw a similar presentation two years ago, I think, where they used a hatching texture as a matcap and applied it to their models via some projection mode (might have been normal projection, but I’m not sure). This one, though, looks like they are using light to swap between two or more different textures. In that case it should be possible to recreate in BI, where you can use a spotlamp to blend two textures (tutorial here), or perhaps even by using a light path node in Cycles.

Actually, I became interested in the subject now that you got me started on the idea, so I decided to make a little test animation in Blender myself trying out this crosshatching technique. Pretty decent for a first try, I think.
- BLEND FILE -

It looks like you could definitely do that with Blender’s shader nodes. It’s just a matter of using the lighting as a blending factor for the textures.
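Just to make that idea concrete, here is a minimal sketch of one way to wire it up with Python, assuming a recent Blender (2.80+) where the EEVEE Shader to RGB node is available. The node setup, texture names, and paths are my own placeholders, not anything from the posts above:

```python
import bpy

# Rough sketch: blend a dense hatch texture with a sparse one, using the
# diffuse lighting result as the mix factor. Assumes EEVEE's Shader to RGB
# node (Blender 2.80+); the image paths are placeholders.
mat = bpy.data.materials.new("CrosshatchSketch")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links
nodes.clear()

output     = nodes.new("ShaderNodeOutputMaterial")
emission   = nodes.new("ShaderNodeEmission")
diffuse    = nodes.new("ShaderNodeBsdfDiffuse")
to_rgb     = nodes.new("ShaderNodeShaderToRGB")  # lit diffuse result -> grayscale value
mix        = nodes.new("ShaderNodeMixRGB")
tex_dense  = nodes.new("ShaderNodeTexImage")     # dense hatching for shadowed areas
tex_sparse = nodes.new("ShaderNodeTexImage")     # sparse hatching for lit areas
# tex_dense.image  = bpy.data.images.load("//hatch_dense.png")   # placeholder paths
# tex_sparse.image = bpy.data.images.load("//hatch_sparse.png")

links.new(diffuse.outputs["BSDF"], to_rgb.inputs["Shader"])
links.new(to_rgb.outputs["Color"], mix.inputs["Fac"])         # dark -> Color1, bright -> Color2
links.new(tex_dense.outputs["Color"], mix.inputs["Color1"])
links.new(tex_sparse.outputs["Color"], mix.inputs["Color2"])
links.new(mix.outputs["Color"], emission.inputs["Color"])
links.new(emission.outputs["Emission"], output.inputs["Surface"])
```

The emission output keeps the hatch textures from being shaded a second time; the lighting only drives which texture shows through.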

Another method involves UV unwrapping as a guide to the contouring line direction. It can get a bit tricky, I think; this was pretty quick and dirty:

http://www.jikz.net/ImageDump/Suz_Hatched.png

I don’t like that it’s too repetitive, but I think it shows the theory well enough.

And the .blend, with one packed image for the hatching.
http://www.pasteall.org/blend/11089

Question: how do you make this 100% procedural with internal textures while maintaining GLSL capability? It would save someone a LOT of time when composing large scenes. For some reason I cannot get internal textures to work in GLSL; I’m confused. Any tips? Thanks for re-firing me up about this style!

One other thing worth noting, as +ikeahloe pointed out: this has the potential to look simply like pasted textures on objects. The real trick might be not only having similar controls over contour and direction, but also having the hatching interact between objects as it would in a normal drawing. Another suggestion is using a similar node setup in the compositor nodes rather than material nodes, as in my little test above (see the sketch below). I wish I could code! LOL It would be nice to have better NPR integration; there seems to be a great use for it!
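For what it’s worth, here is a small sketch of that compositor-based variant in Python. It is my own illustration under assumptions (the hatch plates are placeholder image datablocks, and the combined render is used as the blend factor; a dedicated diffuse/lighting pass would be cleaner if enabled on the view layer):

```python
import bpy

# Rough sketch of the compositor variant: blend two pre-made hatch plates in
# image space, driven by the brightness of the render. Image names below are
# placeholders.
scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree
nodes, links = tree.nodes, tree.links
nodes.clear()

rlayers = nodes.new("CompositorNodeRLayers")
dense   = nodes.new("CompositorNodeImage")     # dense hatch plate
sparse  = nodes.new("CompositorNodeImage")     # sparse hatch plate
# dense.image  = bpy.data.images["hatch_dense.png"]    # placeholder datablock names
# sparse.image = bpy.data.images["hatch_sparse.png"]
mix  = nodes.new("CompositorNodeMixRGB")
comp = nodes.new("CompositorNodeComposite")

# Dark pixels pick the dense plate, bright pixels the sparse one.
links.new(rlayers.outputs["Image"], mix.inputs["Fac"])
links.new(dense.outputs["Image"],  mix.inputs[1])   # first color input
links.new(sparse.outputs["Image"], mix.inputs[2])   # second color input
links.new(mix.outputs["Image"], comp.inputs["Image"])
```

Being purely image-space, this version would not by itself make the hatching follow surface contours, which is exactly the limitation raised above.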

Oh wow, I should have subscribed to my own thread. Duh.
Glad you guys are interested. I’m excited to check out the blends. I like Jikz’s image, but as usual, it looks like polygons.

As far as I can see, the curvature of surfaces is the problem. That paper says it’s easy for well-parameterized surfaces.

My cousin is still very interested, says he gets the math and should be able to implement it in Blender. He is intellectually and aesthetically interested in the project and likes building on Open Source projects philosophically, so I have really high hopes right now that we will get this working in Blender. Of course he is busy and in demand. But he has my scene to start testing with.

We’re both pretty sure the paper I posted describes the technique seen in that video. It was an interesting read. As far as I understood, the strokes are procedurally generated, so they aren’t some pre-hatched UV mapping. They don’t use the UVs (though I think they could, as a directional indicator, if desired); instead they apply the strokes in overlapping patches. Then there are something like six densities of procedural hatching that layer over each other and adjust according to lighting. The beauty of it is that as the hatch density thins, some lines disappear and others stay, which allows continuity of the hatching from frame to frame. It really is as if something is always hatching strokes in and out.
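To make that last point concrete, here is a tiny Python sketch (my own illustration, not code from the paper) of how blend weights for a handful of hatch-density levels could be derived from a lighting value, so adjacent densities cross-fade instead of strokes popping in and out:

```python
# Hypothetical sketch: given a lighting value in [0, 1], compute blend weights
# for N hatch textures of increasing stroke density, so only two neighbouring
# density levels are ever mixed and strokes fade in/out smoothly.
def tam_weights(light, levels=6):
    # index 0 = sparsest (fully lit), index levels-1 = densest (deep shadow)
    tone = (1.0 - light) * (levels - 1)   # continuous tone index
    lo = int(tone)                        # lighter of the two bracketing levels
    hi = min(lo + 1, levels - 1)          # darker neighbour
    t = tone - lo                         # cross-fade amount between them
    weights = [0.0] * levels
    weights[lo] = 1.0 - t
    weights[hi] += t
    return weights

# Example: a mid-lit surface blends two neighbouring density levels.
print(tam_weights(0.55))   # -> [0.0, 0.0, 0.75, 0.25, 0.0, 0.0]
```

The key property, as described above, is that lowering the density only fades some strokes out rather than replacing the whole pattern, which is what keeps the hatching continuous from frame to frame.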

I’m mostly interested in using these as a bridge between my 2D and 3D art.

My concern about that paper is that it only discusses real-time rendering. I can’t make use of GLSL on my cheap-ass Mac mini, and I’m after print resolution.
I’m really hoping it can be implemented for non-real-time uses.

Just looked at the Nitrous stuff in 3ds Max that ikeahloe linked to.
That stuff just looks like a Photoshop 2D filter post-process. It’s easy and nice, but nothing like my cross hatching. That video actually hatches like I do by hand.

To get something resembling what was in that video (better, actually), you’d need to calculate a cross field from the geometry, which is pretty complex, and I’m not sure it’s viable to do in a shader. You can read about it in “Illustrating Smooth Surfaces” by Hertzmann and Zorin. On a side note, if someone could implement that, we’d also be much closer to a good retopology modifier.

Thanks for that. Reading ASAP!

Man, you math guys who can think this stuff up are awesome!

http://obaa.fr/en/blender-pour-les-architectes/tutoriels/tutoriel-rendu-type-crayon-avec-blender-partie-2/

Here you can find a technique used in Blender 2.43 that I’m sure you can enhance with Freestyle and the new capabilities of Blender, but it gives good information on what can be done with Blender’s node system for NPR rendering.

Hope it helps.