Blender & ZBrush

I’ve been testing importing/exporting OBJ files back and forth between Blender and ZBrush, mainly to fix UV mapping problems, redos, and tweaks. What I’ve found is that it sometimes works and sometimes doesn’t. It doesn’t work when you create any new geometry in ZBrush, such as adding edge loops to your mesh. If you don’t add edge loops and don’t create the mesh from ZSpheres, it works fine.
I’d really like input and help on exactly why it fails under some circumstances, if anyone knows or has any ideas at all.

Here are my notes:

I’m using ZBrush 2 and Blender 2.42 RC1.

What works:
Scenario #1

- import an existing OBJ file into ZBrush (mine were created in Blender)
- work on the mesh: add many subD levels, lots of detail using Projection Master, move vertices around, etc., but do not add edge loops
- export the OBJ at subD 1 (with the Obj, Txr, Qud, and Mrg options checked)
- import the OBJ into Blender and make UV changes
- export the OBJ from Blender
- import the OBJ into ZBrush at subD 1
The model works just fine: all subD levels are intact, and the new UVs work as well!

What works:
Scenario #2

- work on a ZBrush model, doing ANYTHING to it (including edge loops)
- export the model from ZBrush as .dxf
- import it into Blender and fix the UVs
- export it from Blender as an OBJ
- import it into ZBrush as an OBJ
The model works! All subD levels are intact, and the UVs have been fixed!

What works:
Scenario #3

- create a mesh in ZBrush from a Sphere3D or one of the other primitives, making it a polymesh
- add detail, subD levels, anything but edge loops
- export the OBJ from ZBrush (same settings as above)
- import it into Blender and fix the UVs
- export it from Blender as an OBJ
- import it into ZBrush as an OBJ
Again, it all works!

What doesn’t work:

- work on an imported mesh in ZBrush, creating edge loops, OR create a mesh from ZSpheres (then use Adaptive Skin to make it a real mesh)
- I tried making the subD 1 mesh a polymesh; that doesn’t work either
- export it as an OBJ (same settings as above)
- import the OBJ into Blender and redo the UV mapping
- export the OBJ from Blender
- import it into ZBrush… the mesh is all messed up when I change subD levels. It looks like the vertices have just been scrambled.

So it seems like something goes wrong when exporting an OBJ from ZBrush and importing it into Blender, but only if edge loops were added to the mesh or the model was created from ZSpheres. Maybe there’s some ZBrush data that the OBJ format or Blender is misinterpreting?
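
By the way, if anyone wants to check whether the round trip is actually reordering vertices (which would explain the scrambled subD levels), here’s a quick Python sketch I’d try; the file names are just placeholders:

```python
# Quick-and-dirty check: do two OBJ exports of the same mesh list their
# vertices in the same order? ZBrush seems to key its stored subD levels
# to vertex order, so any reordering would scramble them.
def read_obj_verts(path):
    verts = []
    with open(path) as f:
        for line in f:
            if line.startswith("v "):
                # round to tolerate different float formatting between apps
                verts.append(tuple(round(float(x), 4) for x in line.split()[1:4]))
    return verts

a = read_obj_verts("from_zbrush.obj")    # placeholder file names
b = read_obj_verts("from_blender.obj")
print("vertex counts:", len(a), len(b))
if len(a) == len(b):
    moved = sum(1 for va, vb in zip(a, b) if va != vb)
    print(moved, "vertices differ in position or order")
```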

Attached is an image showing my Blender/ZBrush import/export settings.

Of course, there are a ton of variables in this issue, and I don’t have time to test them all. But if anyone wants to add their own testing notes, I’m sure it would be helpful! :slight_smile: I think this would be a good thread for sharing experiences and results with Blender & ZBrush.

Attachments


I made a model in ZBrush and wanted to test out normal maps created in ZBrush, rendered in Blender.

This is the procedure I followed (Scenario #2 from above):
- work on the ZBrush model, doing ANYTHING to it (including edge loops)
- export the model from ZBrush as .dxf
- import it into Blender and fix the UVs
- export it from Blender as an OBJ
- import it into ZBrush as an OBJ
THEN: generate normal maps in ZBrush and apply them to the model in Blender.

And here is my result, attached in the image! The low-poly mesh has 1,152 triangles. The high-poly ZBrush model has ~600k.

Note 1: My UV mapping isn’t finished here; I just did it quickly because I wanted to run this test. There are actually more details in his beard, and his ears should look better.

Note 2: I am a newbie when it comes to materials, lighting, and rendering in Blender, or in any app for that matter :stuck_out_tongue: I’m just starting to learn those things, so this render isn’t so hot.

Attachments



You bought ZBrush?

It’s pretty cool what can be done with normal maps. Your effort really added a lot of detail to that head.

It’s that kind of attitude that poisons many people’s view of open source software. Seriously. Just because someone uses Blender doesn’t mean that person is totally bankrupt. Maybe some people think Blender is better than other software…?

And why is that your business?

annoyed

Yes, I bought ZBrush. I also bought Adobe Photoshop :eek:

Testing Part 2:

Anyway… here’s my next test, after a few hours of playing around with the material/lighting/render settings, re-UV-mapping the mesh, and creating new normal maps.

As you can see, I didn’t do the UV mapping for the normal maps correctly, as there are seams on the rendered model (you can see one on the top of the head, and they’re in other places on the model too). Well, at least I’m learning! Now I’ve got to dig up some normal mapping tutorials (what I should have done in the first place :p). I think the normal maps depend on the mesh’s UVs being connected with each other as much as possible, and I had many areas here that were not connected.

Below are examples of the Blender low-poly normal-mapped render, the high-poly model in ZBrush, and the normal map I made in ZBrush (tweaked to increase the contrast and get more depth when rendered).

Attachments




Looks good. Any idea where decent tutorials for ZBrush are? The community is not as good as Blender’s, and there seem to be more fee-based tutorials than free ones.

Thanks mandoragon :slight_smile:
Yeah, their forum community isn’t that great… however, they do have a ton of documentation/tutorials. Download the Practical Guide, and:
http://www.pixologic.com/zbrush/home/home.html
Just click on the wiki link on their main site. Also, in the forums you’ll find posts with some really excellent video tutorials if you search around. There are a ton of free ones; I never had a need, or desire, to buy a tutorial.

Here’s a great article on normal maps too:
http://www.bencloward.com/tutorials_normal_maps9.shtml

Test #3

Fixed the UV mapping problems! I used ZMapper in ZBrush this time; it has a ton more options, and also a… UV seam fixer! That got rid of those bad seams :smiley: I also discovered that the reason I was getting some bad results was that I had separated and rotated parts of the UVs; I think they all need to face their correct direction so that the “light” from the normal maps is generated properly. If you look at the braids in his beard in my post above, you’ll see that one looks like the lighting is inverted. I think that’s why.
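
For what it’s worth, here’s my rough understanding of why the rotated UV islands caused trouble, as a numpy sketch of the standard tangent-space math (illustrative names only, not Blender’s actual code):

```python
import numpy as np

def tangent_basis(p0, p1, p2, uv0, uv1, uv2):
    # Standard per-triangle tangent basis derived from the UV layout.
    # Because t and b follow the U and V directions, rotating a UV island
    # rotates the basis the normal map is decoded in, which can flip the
    # apparent light direction (like on that one braid).
    e1, e2 = p1 - p0, p2 - p0          # triangle edges in 3D
    du1, dv1 = uv1 - uv0               # the same edges in UV space
    du2, dv2 = uv2 - uv0
    r = 1.0 / (du1 * dv2 - du2 * dv1)
    t = (e1 * dv2 - e2 * dv1) * r      # tangent follows U
    b = (e2 * du1 - e1 * du2) * r      # bitangent follows V
    n = np.cross(e1, e2)               # geometric face normal
    return [v / np.linalg.norm(v) for v in (t, b, n)]

def decode_normal(rgb, tbn):
    # Unpack a map pixel from 0..1 colour to a -1..1 vector, then express
    # it in the triangle's tangent frame.
    x, y, z = np.asarray(rgb) * 2.0 - 1.0
    t, b, n = tbn
    return x * t + y * b + z * n
```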

Anyway… my new render here has both the new normal map and a cavity map applied at a low opacity, just to give some darker shadows in the low points of the mesh and make it look prettier :).

A few issues with the normal map remain, like that dark spot at the base of his beard, but I’m not too worried about it. My final goal isn’t to make renders like these, but to have a fully textured color map on a low-polygon mesh.

On the left-hand side I used the lowest-level poly mesh; that’s my base model, which I did in Blender. Next to it is the subD level 2 mesh, which was made in ZBrush (I had 6 subD levels in total).

Attachments


A solution for making normal maps is to use xNormal: http://www.santyesprogramadorynografista.net/projects.aspx
It works for SharpConstruct as well.

Is tangent normal mapping supported in Blender?

Great, great stuff.

Have you tried playing with displacement maps generated from ZBrush yet? That is what I want it for, at least until Blender gets the tangent-space normal mapping mode; then normal maps will likely be the better option.

Also, do you know if the UV tiles option in ZBrush is supported by Blender? If it is not, then I guess you have to unwrap in Blender and then export to ZBrush. But that might not be a bad thing anyway.

Thanks for sharing.

BgDM

GUV tiles are supported by Blender; that is how the house is UV-mapped in my Journey Begins entry. Object-space normals seem to work really well. I haven’t tested tangent-based ones. Object normals do seem to blow out your lighting, though; the lighting acts kind of strange, and highlights and shadows don’t fall where they should.

Not only tangent-space, but even object-space normal mapping is wrong in Blender.
Your object doesn’t look right (it looks a bit strange); it seems a little flat. That’s not your fault, it’s a Blender-related problem: a normal map on a curved object only looks good when the light position and the view position are very similar.
Btw, good work. If the Blender Foundation understands what can be done with normal mapping, maybe they’ll fix the problem.

Hi thanks guys :slight_smile:

Testing part 3:

ZBrush displacements & bump map
Test render in Blender with displacement

Have you tried playing with displacement maps generated from Zbrush yet?

Yes, and a rendered test shot of the displacements in ZBrush is below!

I tried to apply a displacement map to a mesh in Blender but got poor results… I had Disp checked and used a higher subsurf level on the mesh for render time, but I think I’m not doing something right. Can someone tell me how, or point me in the direction of a tutorial for doing displacement maps in Blender?
I also posted a pic of the result here; as you can see, the detail is softened a lot.

Also, do you know if the UV tiles option in ZBrush is supported by Blender? If it is not, then I guess you have to unwrap in Blender and then export to ZBrush. But that might not be a bad thing anyway.

I’m not sure exactly what you mean by your question. The UVs are stored in the mesh file itself: they travel in the OBJ you export from Blender into ZBrush, or from ZBrush into Blender. They don’t change unless you change them, so they’re always supported.
ZBrush doesn’t have real UV editing tools, however, so you have to do that in an external app.
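
To illustrate what I mean, the UVs live right in the OBJ text as `vt` lines, with the faces indexing them; here’s a minimal made-up example:

```
# v = vertex positions, vt = UV coordinates,
# f = faces written as vertex/uv index pairs
v 0.0 0.0 0.0
v 1.0 0.0 0.0
v 0.0 1.0 0.0
vt 0.0 0.0
vt 1.0 0.0
vt 0.0 1.0
f 1/1 2/2 3/3
```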

Test comparison render of a normal-mapped and a plain object, and correct viewing of normal maps in ZBrush

Not only tangent-space, but even object-space normal mapping is wrong in Blender.

You know, at first I saw that normal maps worked in Blender, and I was wondering why people kept saying that :slight_smile: Then I did another little test: two meshes, exactly the same, rendered with the same materials and lighting, the only difference being that one has a normal map. Look at the difference!
Yeah… Blender does still need to get full normal map support.

Attachments





Sorry about posting in an old thread, but I saw the finished product in your CGPortfolio, and it’s great!
Here’s the picture:
http://roja.cgsociety.org/gallery/373922/

Ah, hehe, thanks :slight_smile: That’s the ZBrush model/render, of course. I’m also working on a low-poly, full-body, fully rigged version done in Blender. Almost done with it.

Anyway, I do plan on finishing my testing on this guy once Blender gets real normal mapping support!

Wow, nice work! It’s good to see another fellow ZBrush user here. Welcome to Elysiun/BlenderArtists, btw (if you haven’t been welcomed yet).

Can’t wait till you do that. :wink:

Well, there really is no tutorial for it. But to achieve decent results, you can try the following:

a) Set your render Sub-D level to a minimum of 4. Anything higher takes a long time to render, though I have set it to 5 before.
b) Assign your displacement texture and set the Disp slider in the material options to about 0.75 as a starting point. Then just keep playing with the slider until you get the results you want.

Blender currently does not support 32-bit displacement maps, AFAIK, so you don’t get the level of detail that you normally would in another renderer. To my knowledge this is coming, though, with the render recode work, along with sub-pixel displacement.
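
If it helps to picture it, here’s roughly what the displacement is doing conceptually; a numpy sketch assuming mid-grey (0.5) means “no displacement”, which may not match Blender’s exact formula:

```python
import numpy as np

def displace(verts, normals, heights, disp=0.75):
    # Move each vertex along its normal by the displacement-map value.
    # disp plays the role of the material Disp slider (0.75 here, to
    # match the starting point suggested above).
    verts = np.asarray(verts, dtype=float)      # (N, 3) vertex positions
    normals = np.asarray(normals, dtype=float)  # (N, 3) unit normals
    heights = np.asarray(heights, dtype=float)  # (N,) map values in 0..1
    return verts + normals * ((heights - 0.5) * disp)[:, None]
```

That’s also why the render Sub-D level matters so much: the map can only push vertices that actually exist, so a low subdivision level softens the detail.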

I have been playing with the ZBrush learning edition, and I have to say that it is indeed worth every penny to get the application. Just wish I had a few of them right now.

BgDM

It does now; all Blender textures can use float colour. You’ll have to use OpenEXR files as textures; CinePaint and, I think, Photoshop (with a plugin) can convert to them. Sub-pixel displacement would definitely be nice, but as far as I know it’s not on the horizon, or in any kind of plan at present.

Yes,I can confirm,Blender support 32 bit displacement,with openexr and hdr file format,some time ago I made some test.The problem with this kind of textures is that they are huge,really huge(try a 8191(the maximun dimension for a texture in Blender) 32 bit textures and see how much memory eats).If we can use normal map for high frequency detail,sub-pixel displacement is not so important.