Hello Blender Heads!
I’ve been thinking for a long time that it should be possible to create images from vertex colors, or the other way around.
First of all I need to know:
How do I place a color at a specific spot in UV space? Is it the same coordinate system as vertices? Do you use XYZ, or does UVW work the same way as XYZ?
Is UV completely independent of the faces’ positions?
Can I use the same texture on different objects, each with a different image? Actually, can I use a base texture as a canvas (without saving the result) to paint the vertex-based (or list-based) image onto?
This question was made possible thanks to a “realtime painting” thread.
The idea here is to create detailed images from a low-detail vertex color map!
Vertex colour to texture: http://screencast.com/t/x4UTSnpqtTB
You can bake a texture to vertex colour using an older version of Blender: http://4museman.wordpress.com/2009/07/09/texture_to_vcol/
To bake a texture to vertex paint with a recent Blender, you can use Dynamic Paint in canvas mode: load a texture as the initial color, generate a paintmap layer in the output settings, and apply the modifier…
OK, guys, thank you for your replies! I have to correct my title, because this is meant to happen in real time, during gameplay! The vertex colors will be generated automatically and, with a few tweaks, they should generate a dynamic texture.
You can read the UV coordinates from a vertex. XYZ are coordinates in 3D Space while UV are coordinates in Texture space. This is like two parallel universes ;).
I think you have two issues to solve:
- transform 3D Space coordinates into Texture coordinates (XYZ->UV)
This is very easy for vertices as they already provide you with the UV.
With the UV you can access the pixels within the texture space.
I never did, but I guess this is somehow possible with Videotexture.
It is more difficult to do that for any other point, e.g. a hit point. The point MUST be on a face.
I think you can linearly interpolate the UV coordinates between the two (three) vertices of that face, but I’m not completely sure.
Points not on a face do not belong to any texture space.
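The linear interpolation suggested above is exactly barycentric interpolation, and it does work for any point on a triangle. A minimal pure-Python sketch (the function names are my own; in practice the triangle positions and UVs would come from the mesh data):

```python
# Convert a point on a triangle to UV space via barycentric weights.
# a, b, c: the triangle's 3D vertex positions; uv_a, uv_b, uv_c: their UVs.

def sub(p, q):
    return tuple(pi - qi for pi, qi in zip(p, q))

def dot(p, q):
    return sum(pi * qi for pi, qi in zip(p, q))

def point_to_uv(p, a, b, c, uv_a, uv_b, uv_c):
    """Interpolate the UV of point p, which must lie on triangle (a, b, c)."""
    v0, v1, v2 = sub(b, a), sub(c, a), sub(p, a)
    d00, d01, d11 = dot(v0, v0), dot(v0, v1), dot(v1, v1)
    d20, d21 = dot(v2, v0), dot(v2, v1)
    denom = d00 * d11 - d01 * d01
    v = (d11 * d20 - d01 * d21) / denom   # weight of b
    w = (d00 * d21 - d01 * d20) / denom   # weight of c
    u = 1.0 - v - w                       # weight of a
    return (u * uv_a[0] + v * uv_b[0] + w * uv_c[0],
            u * uv_a[1] + v * uv_b[1] + w * uv_c[1])

# A point at a vertex returns that vertex's UV; a point inside the
# triangle returns a blend of all three UVs.
print(point_to_uv((0.25, 0.25, 0.0),
                  (0, 0, 0), (1, 0, 0), (0, 1, 0),
                  (0.0, 0.0), (1.0, 0.0), (0.0, 1.0)))  # (0.25, 0.25)
```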
- texture space pixels to 3D position (UV->XYZ)
Here you can get multiple points in 3D space, because each face defines its own UV space. Therefore you have to check each single face.
The method is the same:
- read the XYZ of the vertices
- for a UV that does not match a vertex exactly, interpolate between the XYZ of the vertices.
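The reverse lookup can be sketched the same way: compute barycentric weights in 2D UV space for each face, and whenever all three weights fall within [0, 1], that face maps the UV to a 3D point. A pure-Python sketch with an illustrative data layout (not a Blender API):

```python
# Map a UV coordinate back to 3D space by checking every face:
# each face spans its own patch of UV space, so several faces (and
# therefore several 3D points) can match the same UV.

def uv_to_points(uv, faces):
    """faces: list of ((a, b, c), (uv_a, uv_b, uv_c)) triangles.
    Returns every 3D point whose interpolated UV equals `uv`."""
    hits = []
    for (a, b, c), (uv_a, uv_b, uv_c) in faces:
        # Barycentric weights, this time computed in 2D UV space.
        det = ((uv_b[0] - uv_a[0]) * (uv_c[1] - uv_a[1])
               - (uv_c[0] - uv_a[0]) * (uv_b[1] - uv_a[1]))
        if det == 0:              # degenerate UV triangle, skip it
            continue
        v = ((uv[0] - uv_a[0]) * (uv_c[1] - uv_a[1])
             - (uv_c[0] - uv_a[0]) * (uv[1] - uv_a[1])) / det
        w = ((uv_b[0] - uv_a[0]) * (uv[1] - uv_a[1])
             - (uv[0] - uv_a[0]) * (uv_b[1] - uv_a[1])) / det
        u = 1.0 - v - w
        if min(u, v, w) >= 0.0:   # uv lies inside this face's UV triangle
            hits.append(tuple(u * a[i] + v * b[i] + w * c[i]
                              for i in range(3)))
    return hits
```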
I hope it helps a bit.
Thanks, it does help!
I’ll do more research, because what I need is to write an array of pixels into a face, or simply convert a list of vertex colors to pixels.
The use case will obviously be planetary texturing. Planets seen from a certain distance won’t have as much detail as close ones, but the altitude data can be used to generate mid-quality images on the fly. This allows me to avoid keeping multiple textures to apply to each randomly generated planet!
If that is a bad idea, I’ll have to use one texture per digital altitude map!
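To illustrate the "mid-quality images on the fly" idea: a flat RGBA pixel list can be built from an altitude grid and then assigned to a Blender image through `image.pixels`. This is only a sketch; the sea level and color ramp are made-up assumptions:

```python
# Build a flat RGBA pixel buffer (the layout Blender's image.pixels
# expects: row-major, 4 floats per pixel) from a grid of altitude samples.
# The sea-level threshold and the two colors are illustrative assumptions.

def altitude_to_pixels(heights, sea_level=0.0):
    """heights: list of rows of floats in [-1, 1]. Returns a flat RGBA list."""
    pixels = []
    for row in heights:
        for h in row:
            if h <= sea_level:
                r, g, b = 0.0, 0.2, 0.7      # water: flat blue
            else:
                r = g = b = 0.3 + 0.7 * h    # land: brighter with altitude
            pixels.extend((r, g, b, 1.0))
    return pixels

heights = [[-0.5, 0.0], [0.5, 1.0]]          # tiny 2x2 altitude map
pix = altitude_to_pixels(heights)

# Inside Blender this buffer could then be pushed into an image
# (not run here, requires bpy):
#   img = bpy.data.images.new("planet", width=2, height=2)
#   img.pixels = pix
```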
Baking works well for small models.
I have a huge model: 3,028,495 faces with vertex colors.
I want to convert it to a 64,000-face model, with the vertex colors transferred to a UV texture.
The problem is size:
I always get memory errors just copying the model.
How do I handle that in Blender 2.8?