Hello. Do you happen to know a game called VRChat? As its name suggests, it is an open-world, multi-platform game for VR and PC users, and its main appeal is the very high degree of freedom to talk with other people or enjoy the mini-games each world supports. VRChat users can upload their own worlds or avatars built with the Unity engine.
I am a Blender user, and I am also interested in creating worlds for VRChat.
The videos above are examples of VRChat worlds.
Most of them use cartoon-style graphics, but the world I want to create is a very realistic interior scene like '09:48 Fast Internet Airbnb' in the third link.
So I have been thinking for a long time about how to bring a Blender interior scene into Unity looking as close as possible to the Cycles render, even though it is probably impossible to match exactly what I see in Cycles.
Then it occurred to me that if there were a way to "3D scan" the objects and textures inside Blender, in the Cycles environment, the way an iPhone or a 3D scanner captures the real world, the exported result would look very realistic, almost as if it had been rendered in Cycles.
Actually, before that I bought and used an add-on from Blender Market that helps with Cycles baking, but I was disappointed that it did not give me the result I wanted.
Maybe there is already a way to do what I described?
Just to get that right: do you want to "scan" a 3D scene inside Blender, or do you want to scan a real-life object and then import it into Blender in order to bring it into Unity?
If the former, then that is not possible in the way you probably want. You mentioned that you tried baking but the results were not as good as in Cycles. The reason is that several render passes (things like reflection and refraction) are viewpoint dependent, so if you move or turn your head they have to be recalculated.
I mean, it is possible to bake reflections and the like into your maps, but it will look wrong as soon as you move to a viewpoint that is not exactly the same as the one the reflection was baked from.
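For reference, here is a minimal Python (bpy) sketch of a bake that captures only the view-independent contributions (diffuse direct and indirect light) and leaves the view-dependent passes out. It assumes the active object already has a UV map and a material with an active Image Texture node to bake into; adjust those to your scene.

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'

# Bake only the view-independent lighting: diffuse direct + indirect.
# Glossy/reflection passes are skipped because they depend on the camera position.
scene.cycles.bake_type = 'DIFFUSE'
scene.render.bake.use_pass_direct = True
scene.render.bake.use_pass_indirect = True
scene.render.bake.use_pass_color = True   # set False to bake the lighting only

# Assumes the active object has a UV map and its material's active node
# is an Image Texture node pointing at the image to bake into.
bpy.ops.object.bake(type='DIFFUSE')
```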
Oh, thank you for your reply.
Since I do not speak English, I wrote this post using a translator, so some parts may read awkwardly, but the exact word I intended is 'export'.
A typical workflow is "scan real-world objects with a 3D scanner → import them into Blender → use them to construct a scene."
The workflow I am looking for is "finish the interior scene in Blender → export it looking as close as possible to what I see in Cycles → bring it into Unity and use it there."
Of course, it is impossible to bring 100% of a Blender Cycles interior scene into Unity, because Unity and Blender are different engines with different rendering environments, but if there were a way to "scan" the interior scene inside Blender, I think it would be possible to carry over as much as possible.
The term "3D scan" I mentioned is just an analogy; I do not mean that an actual 3D scan has to be performed.
What I need is a way to export the Blender interior scene so that it looks as close as possible to what I see in Cycles. I felt this would require the textures to have the light, shadow, reflection, and indirect light of the realistic Cycles render baked into them, so I tried diffuse texture baking in Cycles. I did not get the result I wanted, so I am now trying a second time.
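In case it helps, here is a minimal bpy sketch of the kind of full lightmap-style bake I mean: it creates a bake image, wires it into the object's materials, and bakes the COMBINED pass so that light, shadow, and (frozen) reflections end up in the texture. The image name "BakedScene", the 2048 resolution, and the output path are just placeholders, and it assumes the object already has a non-overlapping UV map.

```python
import bpy

obj = bpy.context.active_object          # the object to bake (needs a UV map)
scene = bpy.context.scene
scene.render.engine = 'CYCLES'

# Image that will receive the baked lighting (placeholder name/size).
img = bpy.data.images.new("BakedScene", width=2048, height=2048)

# Add an Image Texture node to every material on the object and make it
# the active node, so Cycles knows where to write the bake result.
for slot in obj.material_slots:
    mat = slot.material
    mat.use_nodes = True
    node = mat.node_tree.nodes.new('ShaderNodeTexImage')
    node.image = img
    mat.node_tree.nodes.active = node

# COMBINED bakes direct and indirect light, shadows, and reflections into the
# texture; the reflections are "frozen" from one viewpoint, as discussed above.
bpy.ops.object.bake(type='COMBINED')

# Save the result next to the .blend file (placeholder relative path).
img.filepath_raw = "//baked_scene.png"
img.file_format = 'PNG'
img.save()
```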
Photogrammetry is the word you are searching for…
Just search for 3D scan, photoscan, or photogrammetry, maybe combined with "easy" or "blender", here in the forum (top right) or with any search engine you like…
And of course you need a lot of high-resolution images, so you need a lot of RAM, a lot of computing power, a lot of time, and last but not least a lot of experience to make something nice.
Oh, I'm sorry for the late reply. I didn't see your post when I was here earlier…
I want to do the 'scan' inside Blender!
That's right. In reality and in Cycles, as you said, render passes such as reflection and refraction vary with the viewpoint, and a 3D scan or a diffuse texture bake cannot reproduce that. But that's fine; in fact, that is exactly the result I want.
And I already knew that when you 3D scan a real object, as in the latter case, and bring it into Blender, you get that kind of result. Like this garage 3D model. It's a shame that the reflection and refraction do not change with the viewpoint, but it is very realistic, and since the base color texture already contains the light, shadow, reflection, and indirect light, doesn't it look perfectly usable in a real-time rendering environment?
There are other examples. This link is from a VRChat user who introduces impressive worlds he has visited. Among them, the world called "4. The Hallwyl Museum" is a 3D scan of a real museum, and it is very realistic compared to the other worlds. I want a similar result.
All of the examples above are scans of the real world, but I was wondering if there was a way to export a Blender scene I made myself in a similar way.
Yes, that's right. I think you summed it up exactly.
I'm Asian and don't speak any English, and I used a translator, so I think I confused you with some ungrammatical sentences.
It wasn't the information I was originally looking for, but I had always used only the .fbx format when moving data from Blender to Unity, and I found out for the first time that you can put a .blend file directly into Unity and use it. That is surprising information to me.
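For anyone following the same route, here is a minimal bpy sketch of the scripted FBX export I had been doing by hand in the exporter dialog. The filepath, the choice to export only selected objects, and the experimental "apply transform" option are just assumptions, so adapt them to your project.

```python
import bpy

# Export the selected objects to FBX for Unity (filepath is a placeholder,
# "//" means relative to the .blend file).
bpy.ops.export_scene.fbx(
    filepath="//vrchat_world.fbx",
    use_selection=True,              # only export what is currently selected
    object_types={'MESH', 'EMPTY'},  # skip lights/cameras; Unity handles those
    apply_unit_scale=True,           # keep Blender's 1 unit = 1 m consistent
    bake_space_transform=True,       # experimental "Apply Transform" option
    path_mode='COPY',                # copy textures next to the FBX
    embed_textures=False,
)
```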
I already knew most of the other things, such as Apply and Origin, but I also found out for the first time that you can easily clean up the normals with Mesh > Normals > Recalculate Outside.
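The same Recalculate Outside step can also be scripted; here is a tiny bpy sketch for the active object, just as an illustration of the menu operation mentioned above.

```python
import bpy

# Recalculate normals to point outside for the active mesh object,
# the same as Mesh > Normals > Recalculate Outside in Edit Mode.
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.mesh.normals_make_consistent(inside=False)
bpy.ops.object.mode_set(mode='OBJECT')
```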
It was a great help. Thank you!!!