Advanced shading / texturing / lighting (Blender - Unity)

Hello everyone!
Does anyone know in which cases Cycles baking should be preferred over
Unity’s Beast lightmapping and vice versa?

Many thanks in advance,
Greetings,
Shu

Assuming you’re attempting to build a scene in Unity (I don’t know why you’d use Unity to build something for Blender)…

If I were to use any sort of Blender baking for lightmapping a scene, I would have to rebuild the scene in Blender and replicate the lighting, which is a ton of work and pretty much precludes the use of Blender for lightmapping a Unity scene. However, the Unity/Beast combo has limited options, if I recall correctly, and it’s difficult to get shadows to work properly. If the rebuilding weren’t an issue, I would stay away from Unity/Beast, because it’s a pain.

For baking an individual object, I use Blender. If I’m baking an entire scene, I use Beast.

Thank you for your answer, KnightsFan!
Yes, I do want to build scenes in Blender and export them to Unity.

I actually thought one could export the lights that were used for lightmapping in Blender
to Unity and use them there for lighting moving objects.

According to code.blender.org, “exporting whole levels” as .fbx to Unity now works:
http://code.blender.org/index.php/2014/06/supporting-game-developers-with-blender-2-71/

It could be pretty handy if that works - especially if Beast doesn’t work well.
I don’t know how well it works, though. I haven’t tried it yet.

Also, I don’t know how long the baking process takes in Blender, but I guess it could take some time overall
if you always have to bake objects separately. Or is there a way to bake an entire scene in Blender as well?

I would recommend against building an entire scene in one Blender file. It’s much easier (in my opinion) to build individual parts, and then position them within Unity. I usually have a sort of main .blend file with the big geometry, like buildings, and then have individual prefabs for models that are instanced many times, like streetlamps. If you’ve worked with UDK, it’s a little like making your BSP cuts in one main .blend, creating props and stuff in other .blend files, then putting it together in Unity. I find this easier because it allows me to easily change one small thing without re-importing the entire map and either re-adding the proper components or writing an Editor script to do it for you.

As far as I know, you can’t get Blender lights into Unity, unfortunately.

When I say Beast doesn’t work well, I mean that it doesn’t work the way I’d expect, and back when I learned it the documentation was terrible, but even then it was usable. It just takes some trial and error. Also keep in mind that if you bake with Blender, you will have to find a way to get those lightmaps into Unity. You could either mix the baked textures onto the diffuse map (e.g. with Photoshop), write your own shaders that mix the maps, or figure out some hack to manually add lightmaps to Unity, which I hear is difficult.
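For what it’s worth, “mixing the baked textures onto the diffuse map” usually means a multiply blend, which is what Photoshop’s Multiply layer mode does. A minimal sketch in plain Python, with bare tuples standing in for pixel data (the function name is just for illustration):

```python
def apply_lightmap(diffuse_px, lightmap_px):
    """Multiply one RGB diffuse pixel (0-255 per channel) by a
    grayscale lightmap value (0-255). 255 leaves the color unchanged;
    lower values darken it toward shadow."""
    return tuple(c * lightmap_px // 255 for c in diffuse_px)

apply_lightmap((200, 100, 50), 255)  # fully lit: (200, 100, 50)
apply_lightmap((200, 100, 50), 128)  # half shadow: (100, 50, 25)
```

An image editor or a custom shader performs the same per-pixel operation across the whole texture.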

Really, there’s no situation I can think of in which I would recommend using something other than Beast for Unity’s lightmaps. I still use Blender for baking AO on individual objects, but that’s just part of the texturing process.

Thank you for your advice, KnightsFan!

When I started using Blender assets in Unity, I was using a .blend for each part of a scene, too.
But after trying this several times, I find it way easier to position objects in Blender.
Compared to Blender, Unity has rather awkward controls. But of course, that’s just my opinion.
For now, I don’t really know how I can embed prefabs into that workflow, though.
Using prefabs would probably lead to a much better performance, right?

I haven’t actually tried UDK. In fact, I’m new to creating video games of any sort.
But when you just want to import one object you made from Blender to Unity,
you can always just export that one mesh as an .fbx, right?
Then you wouldn’t have to import the whole scene again. Or am I missing something?

I haven’t tried that yet, but I’m pretty sure one could just add standard cubes where there are light sources in Blender,
export them to Unity and then add lights in Unity where the cubes are placed.
Afterwards, the cubes could be deleted from the scene.

I thought the lightmaps would be added to the diffuse texture when exporting from Blender.
So, when you bake a lightmap, you will get a… greyscale RGBA .png which defines the shading?
So that one could just add it as a second layer to the colored diffuse map and it will brighten the colors where needed?

Well, I do use Photoshop quite a lot. That’s probably going to be the best method.
So far, I don’t know Cg. So it’s probably better to avoid the shader programming for a while.

By “AO” you mean “ambient occlusion”, right? Does baking ambient occlusion make a huge difference regarding performance?
Or is there no way to use ambient occlusion in Unity internal?

As you see, I’m not that experienced using Unity. But I try to make the most of it.

No problem :)

Using prefabs would probably lead to a much better performance, right?

I’m not sure about performance, but if you make two objects with the same mesh data in Blender, Unity will still treat them as different meshes, which will increase file size. If you have multiple identical boxes (for example), you can decrease file size by importing the box once and instancing it. If these boxes show up in other levels, it will become even more of a pain.

you can always just export that one mesh as an .fbx, right?

If each mesh is a separate .fbx, yes you can do this, but I thought you were making your entire map in one file, in which case you couldn’t re-import just a piece of it. Are you building your scene in one .blend and then importing each object as a separate .fbx?

export them to Unity and then add lights in Unity where the cubes are placed.

Yes, this is possible, but you would have to recreate the lights’ settings, too. If you wanted, you could use empties instead of cubes and add a light component directly to each empty; then you won’t have to delete anything.

So that one could just add it as a second layer to the colored diffuse map and it will brighten the colors where needed?

Yes, this is entirely possible. However, you won’t be able to use seamless textures, because you will be changing the color of the diffuse. You can also say goodbye to using the same texture on multiple objects, since the light will be different. This will eliminate the possibility of dynamic batching (hurting performance) and increase file size, since you can’t re-use textures.

By “AO” you mean “ambient occlusion”, right? Does baking ambient occlusion make a huge difference regarding performance? Or is there no way to use ambient occlusion in Unity internal?

Yeah, ambient occlusion. I have Unity free, so I don’t even have the option to use Unity’s ambient occlusion. The docs say it’s “quite expensive in terms of processing time” (http://docs.unity3d.com/Manual/script-SSAOEffect.html).

Just to make another point:

You do not want to build a whole map in Blender and import it into Unity, due to the massive waste of memory it will cause in your game.

If you build a map in Blender with 100 chairs in the scene and import that into Unity, you get 100 meshes that Unity thinks of as totally different. Each one takes up memory to display. On the other hand, if you import one chair and then place it in 100 places inside Unity, it can save a lot of storage for the level. This adds up when your level has a lot of props, possibly making a level that could be stored in 50 megs take up over a gig of storage. It will also affect your frames per second, though not as much as it affects your storage.
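The arithmetic behind this is straightforward. A rough sketch (the 500 kB per-mesh figure is a made-up example; real memory use also depends on vertex formats and compression):

```python
def level_mesh_storage(mesh_kb, copies, instanced):
    """Rough mesh storage for a prop that appears `copies` times in a
    level. Instanced: the mesh is stored once and reused; otherwise
    every copy carries its own duplicate mesh data."""
    return mesh_kb if instanced else mesh_kb * copies

level_mesh_storage(500, 100, instanced=False)  # 50000 kB: ~50 MB of chairs
level_mesh_storage(500, 100, instanced=True)   # 500 kB: one shared mesh
```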

One trick you can use is to make scripts that allow you to put null objects in your map when building in Blender, and then have Unity automatically place a light or chair on the properly named null object after importing. It takes some coding, but it can speed up your pipeline noticeably.
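The Blender half of that trick is just a naming convention on the empties; the Unity half would be an Editor script that walks the imported hierarchy and replaces each marker with the real asset. The convention itself can be sketched in plain Python (the `MARKER_` prefix and the dot separator are invented for illustration; any consistent scheme works):

```python
def parse_marker(name, prefix="MARKER_"):
    """Split a marker empty's name into (kind, label); return None for
    ordinary objects so the importer leaves them alone."""
    if not name.startswith(prefix):
        return None
    kind, _, label = name[len(prefix):].partition(".")
    return kind, label

parse_marker("MARKER_light.Kitchen")  # ("light", "Kitchen")
parse_marker("Chair.001")             # None: a normal prop, leave it alone
```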

You also won’t have to wait half an hour for your level to import every time you change a small section.

Another thing about Beast lightmapping is that, once you get used to it, you have the option to set what resolution you want individual objects to be mapped at, meaning you can fine-tune some objects to get low-detail shadows, such as areas that the player will only see from far away.

Enjoy your weekend!

Thank you very much for your help, KnightsFan and FoolishFrost!!

This is very good to know! But I’m not actually planning on building scenes like that. If different objects are separate, then I make two actual objects out of them.
Most of the time, I make one object only out of connected faces.
Only sometimes I add maybe a wall to the floor even though they’re not connected.

Well, I just read about Blender being able to export complete scenes to Unity.
So I thought this may become a handy feature.
In fact, I do realize having separate .fbx files for the objects will make things easier
when trying to jump back and forth between Blender and Unity.
But that doesn’t mean I can’t create a complete scene in Blender, does it?
Because I don’t have to export the whole scene as an .fbx, but I can always choose
to export only the selection, right?

Also, on the other hand, one could export the whole scene and then, if you have to adjust something in Blender,
just delete the object from Unity, edit it in Blender and export the modified object separately to Unity.

Right, this is better! I just wasn’t too sure if empties could be exported to Unity or not.

Alright, I really didn’t think of that problem. And that problem is actually huge.
I will definitely need texture atlases for using textures all over the place.
Well, maybe I should take a look at Cg anyway. Or maybe I can use ShaderForge for that.
Recently, I ran into something really nice

This actually leads me to a further question: Do prefabs share the material?
For example, if I am using a prefab which has a material assigned, it will only load
the textures used in the material once, right?
But this will also be the case if I am using two separate objects, won’t it?

“quite expensive” doesn’t sound good. I don’t know, maybe I will give it a try at some point,
but I always try to stick to solutions that are optimized and not too expensive regarding performance.
I know, it’s not the same, but maybe Unity’s “directional light” could be a solution to this?
As far as I know, it illuminates the whole scene and it’s probably using less resources.

I’m not sure if KnightsFan meant pretty much the same.
But now I guess I understand what you are talking about!
I have to ask, though: Does it really matter that much?
Because most of the time, prefabs will be low-poly objects anyway which don’t use much space.
I could see a big problem if you don’t use the same texture on the duplicated objects.
If you duplicate the textures, too, it will surely lead to an insane amount of data.

If you’re using lightmapping all the time, this seems to be a good solution.
But after thinking about not being able to use textures multiple times,
I’m probably not going to use lightmapping too often.

This does sound nice if you’re going to use individual textures.
Are you talking about the feature that lets you dynamically control a texture’s resolution depending on the distance to the player?
I have read about that, and I imagine it could boost performance, too.

Oh and one more thing:
Maybe someone knows how to do this:

Is it possible to “layer” textures?
Let’s say I want to use only one quad and a texture for a carpet in a scene.
How do I proceed? I guess snapping it to the ground and raising its Z-value by just a little bit
so that it doesn’t interfere with the floor texture is not the best solution.

Also, I changed the title of this topic.
It turned out that there are many questions following that aren’t necessarily related to Cycles baking and Beast lightmapping.

If different objects are separate, then I make two actual objects out of them.

This isn’t quite what I meant. I mean if you have two identical objects, like two cars that are exactly the same, ideally you could instance the same car twice, exactly what FoolishFrost explained with 100 chairs.

Also, on the other hand, one could export the whole scene and then, if you have to adjust something in Blender,
just delete the object from Unity, edit it in Blender and export the modified object separately to Unity.

Yes, this is possible. In my opinion, this would make a messy asset pipeline. If it works for you, though, go for it.

but maybe Unity’s “directional light” could be a solution to this?

Ambient occlusion is different from lights. AO simulates the bouncing of light on a small scale, which real time lights simply won’t be able to do in Unity.

This actually leads me to a further question: Do prefabs share the material?

Prefabs do not necessarily have materials. Prefabs are basically collections of objects and components that can easily be instanced multiple times, like groups in Blender. Mesh Renderer components are what technically have materials attached to them. A GameObject within a prefab might have a Mesh Renderer. (For example, you can have a prefab which is just a GameObject with a script attached, or a prefab which has several children with different materials).
So that’s a little pedantic, but might clear things up. Anyway, under certain circumstances, yes, you can batch objects with the same material. http://docs.unity3d.com/Manual/DrawCallBatching.html
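A toy illustration of why shared materials matter here: the renderer can potentially draw all objects that share a material together, so the batch count scales with the number of distinct materials rather than the number of objects. (This is a simplified model only; Unity’s actual batching has further conditions, listed on the page linked above.)

```python
from collections import defaultdict

def count_batches(renderers):
    """Group (object, material) pairs by material; each group of
    renderers sharing a material can be drawn as one batch."""
    groups = defaultdict(list)
    for obj, material in renderers:
        groups[material].append(obj)
    return len(groups)

scene = [("Chair.001", "wood"), ("Chair.002", "wood"), ("Lamp", "metal")]
count_batches(scene)  # 2: both chairs share "wood" and batch together
```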

I’m not sure if KnightsFan meant pretty much the same.

Yes, but FoolishFrost explained it better than me.

I have to ask, though: Does it really matter that much?

It depends on the project and platform. If you’re going for a webplayer game, yes, you want to minimize file size as much as possible: no one wants to double their download time because 100 identical chairs are 100x larger than a single chair instanced 100 times. If you are developing for mobile, you need to optimize performance as much as possible. However, making 10 identical chairs in a PC standalone game isn’t terrible, although it’s a bit silly not to optimize something like that.

I could see a big problem if you don’t use the same texture on the duplicated objects.

Textures are often ~70% of file size in my games, so yes, it’s a big problem if you don’t optimize texture usage.

Is it possible to “layer” textures?

Can’t you just put the carpet texture on the floor quad?

I recommend building a game with the knowledge you have, and intuitively learn the workflow that suits you best. Worry about performance when it becomes an issue. Doubtlessly you’ll have to start over a few times, but each time you’ll be better equipped to tackle the next issue.

Wow, thank you for all that advice, KnightsFan!

Alright, I get it now. I do understand this is unnecessary data being created.
On the other hand, I just built a chair for testing purposes and exported it as an .fbx file.
It was made of 108 tris, and the file size of the mesh itself was 12 kB.
If I use 50 of those chairs in a scene, that’s 588 kB of unnecessary data.
While I understand that this is a problem for web player or mobile games, I guess having a few MB more than necessary isn’t that crucial on local machines.
I thought about avoiding this, though. Because I do want to optimize games, especially if there are more or less easy solutions.
If I build up a complete scene in Blender, maybe I could export all the objects, create a prefab in Unity for meshes that are used multiple times and copy the location/rotation/scale values of the imported meshes that are placed in the scene. Afterwards, I could just delete the original meshes.

I know. It’s not ideal. I guess I would export all the separate objects as individual .fbx files for a project I’m working on for a longer time.
Also, when I select multiple objects in Blender, can I export each of them as its own .fbx in one go?
You know, let’s say I have a scene built of 200 objects and I select everything: can I quickly create 200 .fbx files for them?
Of course, by asking this, I mean objects that share their export settings.
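Blender can script this. A sketch of a bpy snippet, run from Blender’s Text Editor, that exports every selected object to its own .fbx (written against the 2.7x API; the output directory and the name-sanitising rule are my own choices):

```python
import os
import re

def fbx_path(export_dir, object_name):
    """Build a filesystem-safe .fbx path from a Blender object name."""
    safe = re.sub(r"[^\w\-]", "_", object_name)
    return os.path.join(export_dir, safe + ".fbx")

def export_selected(export_dir):
    """Export each selected object as its own .fbx (run inside Blender)."""
    import bpy  # only available inside Blender
    for obj in list(bpy.context.selected_objects):
        # Isolate one object at a time, then export only the selection.
        bpy.ops.object.select_all(action='DESELECT')
        obj.select = True  # Blender 2.7x; 2.8+ uses obj.select_set(True)
        bpy.ops.export_scene.fbx(filepath=fbx_path(export_dir, obj.name),
                                 use_selection=True)
```

Per-object export settings still come from the operator’s defaults, so this fits the “objects that share their export settings” case.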

Okay, I wasn’t aware of that. In Blender, I mostly use ambient occlusion for illuminating the complete scene to some extent.

I see. So I can use the same mesh using different materials and it will be stored in only one prefab?
Or do you mean, when I use the prefabs in the scene, I can set different materials to those?
Because, when I do that, will the materials and textures be loaded for each prefab separately or will Unity still be able to only load necessary textures that are used on multiple prefabs?

Actually, I am planning to make a game for Windows PC, Mac and maybe Linux.
But as you already said, one should probably optimize it even so.

Alright, this does seem a lot! Maybe I should read some more about optimizing textures.

Yes, but then I would have to create a carpet texture that already has the floor texture below the carpet.
It would be handy if I could place the carpet anywhere, no matter what is below the carpet while the floor can still be seen.
I figured a quad using a texture with an alpha channel may do the job pretty well, because the alpha channel could make sure the floor can still be seen.
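That is exactly what an alpha channel gives you: the carpet pixel is composited “over” the floor pixel. A sketch with bare tuples standing in for pixels (the GPU performs the same blend when the carpet quad uses a transparent shader):

```python
def alpha_over(carpet_rgba, floor_rgb):
    """Standard 'over' blend: where the carpet's alpha is 255 it fully
    covers the floor; where it is 0 the floor shows through."""
    r, g, b, a = carpet_rgba
    return tuple(c * a // 255 + f * (255 - a) // 255
                 for c, f in zip((r, g, b), floor_rgb))

alpha_over((180, 40, 40, 255), (90, 90, 90))  # opaque carpet: (180, 40, 40)
alpha_over((180, 40, 40, 0), (90, 90, 90))    # transparent: floor shows, (90, 90, 90)
```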

Maybe some images can help explaining what I mean.
There is a carpet in Zelda: Majora’s Mask:
http://www.wiitower.de/images/spielebilder/majorasmask1.jpg
Also, I guess the method I was thinking about could easily help create a floor like this:

The texture on the floor below the log could be just a quad set into the scene and rotated 45°.
Or is it important to not create faces that are partially hidden?
I guess Unity will still render the whole face, even though most of it is hidden.
I don’t know if this will have a negative impact on the performance, though.

Of course I keep working on my project, but it’s always nice to hear about other people’s approaches.
I can probably make use of it at some point.

If I build up a complete scene in Blender, maybe I could export all the objects, create a prefab in Unity for meshes that are used multiple times and copy the location/rotation/scale values of the imported meshes that are placed in the scene. Afterwards, I could just delete the original meshes

Yes this is possible. It sounds like a lot of extra work to me, but if it works for you, go for it.

I see. So I can use the same mesh using different materials and it will be stored in only one prefab?
Or do you mean, when I use the prefabs in the scene, I can set different materials to those?
Because, when I do that, will the materials and textures be loaded for each prefab separately or will Unity still be able to only load necessary textures that are used on multiple prefabs?

  1. No, a mesh has material data inside it. Every object using the same mesh will have the same materials.
  2. All instances of a prefab will have the same components, including mesh and therefore material. You can always change one instance of the prefab to use a different mesh.
  3. The only way to save Unity from loading the same texture twice is to employ batching, which will happen regardless of whether an object is part of a prefab or not. http://docs.unity3d.com/Manual/DrawCallBatching.html

Or is it important to not create faces that are partially hidden?

I see what you mean about the carpets now. Yes, you could just add another quad and raise it a few mm. Some faces will always be hidden.

Alright, thank you very much for your help, KnightsFan!

I see what you mean about the carpets now. Yes, you could just add another quad and raise it a few mm. Some faces will always be hidden.

The only thing is that I don’t want the floor to be uneven. But I guess there is no other way to layer something like that easily.