Discuss LightWave workflows and LightWave/Blender comparisons here

I am not sure how long LightWave's edge rendering system has supported raycasting. It may be related to the new renderer, but I don't think so. Freestyle has its own advantages, of course, such as Eevee support. And to be fair, I haven't put Freestyle through its paces, but the impression I get is that Freestyle provides more distinct edge types and some built-in functions that are quite nice, while LightWave provides deeper control through its nodal system. That is generally the impression I get between the two programs overall, actually.


I haven’t had the need for vector edge export yet, but I can imagine that for many uses it is basically a requirement.

Yeah, I was referring to some of the demos Lino did with the new renderer showcasing the anime stuff.

“Edge rendering is another fundamental feature important for getting the proper NPR Cel Shaded look.
We have also vastly improved edge detection and tracing in the next LightWave version to give you more precision and control over edges than ever before.”


Aside from that, I really have not played with Freestyle at all.

But I am pretty sure Grease Pencil takes this to a whole other level in Blender. It is kind of 2.5D, and you can animate it.

Grease Pencil is really intriguing, and completely unique. One of these days I plan to really dive into it. One of these days…


Just a note as a reminder until I have time to record it, perhaps.
I was playing around with drawing hair in Houdini and exporting the guides as polychains.
Now… to apply hair onto mesh polys directly as particle hair, you need the HairNet plugin (not sure of its status for Blender 2.8).

I have used that for converting LightWave-made polychains to hair in Blender, but you do need to set it up, correct position and scale, etc., and then select a mesh cap and the polystrands to make it work, which is a bit tedious. In LightWave I just have to add FiberFX directly on the strands and it's done; no mess with an additional plugin there.

That said, here on the other hand are the issues with LightWave (apart from the slow render):
Importing the hair strands directly from a Houdini OBJ causes LightWave to close the polystrand ends (not good). Blender, on the other hand, imports them just fine with open strands, so I simply save them out as OBJ again from Blender, and then I can load that OBJ into LightWave for FiberFX… if I feel like it.

So for LightWave: a plus for applying FiberFX directly onto the strands, and a minus for not importing hair-strand OBJs from Houdini properly.
For Blender: a plus for importing hair-strand OBJs correctly, and a minus for not being able to render particle hair directly without converting via HairNet.
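If you want to check whether a given OBJ actually contains open strands before routing it anywhere, here is a minimal sketch. It assumes the exporter writes each strand as an `l` (polyline) element, and treats a strand whose first and last vertex indices match as closed; the function names are just illustrative.

```python
def strand_is_closed(indices):
    """A polyline strand is closed if it starts and ends on the same vertex."""
    return len(indices) > 2 and indices[0] == indices[-1]

def scan_obj_strands(obj_text):
    """Count open and closed 'l' polyline elements in OBJ text.

    Returns (open_count, closed_count).
    """
    open_n = closed_n = 0
    for line in obj_text.splitlines():
        if line.startswith("l "):
            # 'l' entries may carry texture indices like "3/1"; keep the vertex index.
            idx = [int(tok.split("/")[0]) for tok in line.split()[1:]]
            if strand_is_closed(idx):
                closed_n += 1
            else:
                open_n += 1
    return open_n, closed_n
```

For example, an OBJ containing `l 1 2 3` (open) and `l 4 5 6 4` (closed) would report one of each, so a file full of closed strands would stand out before you even open it in LightWave.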

You also need to invert any FiberFX fiberV strand thickness in LightWave, and you need to switch that thickness the other way around in Blender as well.

Now, this was tested with Houdini Apprentice and LightWave 2019, with no crashes; it has been a bit more stable than previously in that regard.
And Blender 2.81 experimental.

For LightWave you could probably avoid the OBJ round-trip, though, by buying OD Tools to get Houdini hair in properly. I haven't tried it, however; it works well enough to route it the way I do, passing it through Blender to save the OBJ properly.


An interesting development at Unreal

I know it can work with XGen and others, but I wonder about using Alembic from other apps.

"Our schema allows the transfer of attributes such as “width” and “color” into Unreal Engine, along with “guide” attributes that are identified for the simulation of interpolated hairs. Multiple hair groups within a single alembic are supported via “group_id.” We partnered with the talented teams behind beloved hair grooming applications Ornatrix and Yeti to provide built-in support and each offers the ability to natively export to our Alembic protocol into Unreal Engine 4.24. Guidelines for using XGEN with our alembic schema can be found in the documentation."