So I’ve posted here before and was super happy with the helpful feedback, so I thought, why not try again! Basically, I’m going through the process of making a game asset for Unity (a water cooler), and I’m currently making my normal map. Mind taking a look and telling me whether I’m going horribly wrong or in the right direction? I’ve tried to avoid and fix all the obvious errors, but I’m still new, so I might be missing the less obvious ones… I’ve attached an image of it and my .blend file (Blender File) below!
Edit: Here is what it looks like when applied; I’m getting weird shadows around the model (visible on the right side of the image, at the cup dispenser).
Sorry if I’m coming off as ignorant, but what exactly are you worried about?
From here your Normal Map does look rather normal (no pun intended).
In my experience, when exporting to Unity you will have to adjust the normal map intensity (for Substance Painter normal map textures, for example, I always have to halve the output), depending on which shader you are using along with your PBR textures.
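If it helps, the “halving” I mean is just pulling the normals toward flat. Unity’s Standard shader exposes this as the normal map scale slider, but the underlying math is simple enough to sketch in plain Python (the function name and pixel values here are just for illustration, not any engine’s API):

```python
def scale_normal_pixel(rgb, strength):
    """Blend an 8-bit tangent-space normal pixel toward flat (128, 128, 255).

    strength=1.0 keeps the original bump, 0.5 halves it, 0.0 flattens it.
    """
    # Decode each channel from [0, 255] to [-1, 1]
    x, y, z = ((c / 255.0) * 2.0 - 1.0 for c in rgb)
    # Scale only the tangent-plane components X and Y
    x *= strength
    y *= strength
    # Recompute Z so the vector stays unit length
    z = max(0.0, 1.0 - x * x - y * y) ** 0.5
    # Re-encode from [-1, 1] back to [0, 255]
    return tuple(round((c + 1.0) / 2.0 * 255) for c in (x, y, z))
```

Scaling only X and Y and recomputing Z keeps the normals unit-length, which is effectively what the shader-side intensity sliders do.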
Maybe you want to share a photo of the model being rendered in Unity and what it looks like in Blender, then we might be able to tell you whether something seriously needs attention or not.
No, that’s fair enough haha. I’m just worried that I’m majorly messing something up! When applying it in Blender to test it, I get these weird shadows (as seen in the screenshot below, on the cup dispenser) and I can’t seem to fix them… And they show up in a few places…
…oh, got’cha… I think what happens here is that you are baking your normals onto the subdivided mesh’s UV map and not onto your low poly mesh’s UV map.
Edit: …basically, your low poly mesh ends up using the high poly mesh’s UV map… if that makes more sense.
Okay… So my high poly doesn’t have any UVs, so that’s strange.
Just to check: my low poly has UVs, and to bake I select the high poly first, then the low poly (which has UVs and a material with a newly created image) and bake, adjusting the distances as and when needed. Does that sound about right?
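For reference, those steps map onto Blender’s selected-to-active bake settings roughly like this bpy sketch (the object names `HighPoly`/`LowPoly` and the extrusion value are placeholders, and this only runs inside Blender’s Python environment):

```python
import bpy

# Select the high poly first, then make the low poly the active object
high = bpy.data.objects["HighPoly"]  # placeholder name
low = bpy.data.objects["LowPoly"]    # placeholder name; must have UVs and a
                                     # material whose Image Texture node (the
                                     # newly created image) is selected as the
                                     # bake target
high.select_set(True)
low.select_set(True)
bpy.context.view_layer.objects.active = low

scene = bpy.context.scene
scene.render.engine = "CYCLES"                   # baking requires Cycles
scene.render.bake.use_selected_to_active = True  # high poly -> low poly
scene.render.bake.cage_extrusion = 0.05          # the "distance" to adjust

bpy.ops.object.bake(type="NORMAL")
```

The key part is that the low poly is the *active* object when you bake, so its UV map and image receive the result.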
Edit: You’ve already been such an amazing help, but maybe you could take a look at the .blend file if you have time? (Blender File) If not, that’s more than alright and I’d love to keep talking through the problem!
…so I gave it a shot, and of course, you are doing it all the right way… The only issue is how the high poly geometry gets cast onto the low poly mesh. Certain areas can’t transfer the way you would expect… for example, you can’t project a sphere onto a cube without getting island distortion. The way I was able to fix the distortion was by creating a sharp edge on the low poly mesh, which gives a clean transition on the UV island… see image (please ignore the low-raycast artifact on the brackets; this was just a test):
It’s just an idea, but this has solved it for me:
Mark sharp edges on any geometry that is rather boxy or sharp-angled on your low poly mesh, then bake everything again.
In the image you can see the comparison between the version without sharp edges and the one with them. You can remove the sharp edges after baking is done, if you wish… but usually you would want to keep them, for cleaner shading.
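If you ever want to mark those sharp edges in bulk rather than by hand, something like this bmesh sketch would do it (the 60° threshold is an arbitrary choice on my part, and again this only runs inside Blender):

```python
import math

import bmesh
import bpy

obj = bpy.context.active_object  # the low poly mesh
bpy.ops.object.mode_set(mode="EDIT")
bm = bmesh.from_edit_mesh(obj.data)

threshold = math.radians(60)  # arbitrary cutoff for "boxy or sharp-angled"
for edge in bm.edges:
    # calc_face_angle() needs exactly two linked faces
    if len(edge.link_faces) == 2 and edge.calc_face_angle() > threshold:
        edge.smooth = False  # False = marked sharp

bmesh.update_edit_mesh(obj.data)
bpy.ops.object.mode_set(mode="OBJECT")
```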
I hope this gets you going.
This definitely helped! I couldn’t use it everywhere, unfortunately… But it was definitely the start of things working, so thank you!