I am working with lidar (point cloud) terrain that I have imported into Blender from CloudCompare.
In Blender I imported the raw PLY file and used it as a reference to model a plane, which I then shrinkwrapped on top of the point cloud.
Now I would like to texture the piece of land.
My problem is that the photo I get out of Google Maps or Bing doesn't align with the piece of land below, because Google Earth uses WGS 84 / Web Mercator while the point cloud is in some other coordinate reference system.
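To make the mismatch concrete: Web Mercator stretches distances north-south by roughly 1/cos(latitude), so a screenshot in EPSG:3857 metres will not overlay a point cloud that is in a true metric CRS (UTM, a national grid, etc.). A minimal sketch of the standard forward Web Mercator projection (the helper function name is mine, not from any library):

```python
import math

R = 6378137.0  # WGS 84 / Web Mercator sphere radius in metres

def lonlat_to_webmercator(lon_deg, lat_deg):
    """Forward EPSG:3857 projection: longitude/latitude in degrees -> metres."""
    x = R * math.radians(lon_deg)
    y = R * math.log(math.tan(math.pi / 4 + math.radians(lat_deg) / 2))
    return x, y

# 0.001 deg of latitude is roughly 111 m on the ground, but at 60 N
# Web Mercator maps it to roughly 223 m (scale factor ~ 1/cos(60) = 2),
# which is why the imagery will not line up with metric lidar coordinates.
x0, y0 = lonlat_to_webmercator(0.0, 60.0)
x1, y1 = lonlat_to_webmercator(0.0, 60.001)
print(y1 - y0)
```

This is only an illustration of the distortion; in practice a GIS tool (e.g. QGIS or GDAL's `gdalwarp`) can reproject the imagery into the point cloud's CRS instead of doing the math by hand.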
So my question is: is there a way I can make these two things match?
At the moment the only idea I have is exporting the UV unwrap and then manipulating the Google photo in Photoshop to match the lidar-baked texture. The problem is that this process is hit and miss and extremely time-consuming.
Basically, how can I georeference aerial imagery?
Does anyone know any other method?
Thanks.