Texturing Lidar with satellite photo

I am working with lidar (point cloud) terrain that I have imported into Blender from CloudCompare.

In Blender I imported the raw PLY file and used it as a reference to model a plane, which I then shrink-wrapped onto the point cloud.

Now I would like to texture the piece of land.

My problem is that the photos I get out of Google Maps or Bing don’t align with the terrain below, because Google Earth uses WGS 84 / Web Mercator while the point cloud uses some other coordinate system.

So the question is: is there a way I can make these two things match?

At the moment the only idea I have is exporting the UV unwrap and then manipulating the Google photo in Photoshop to match the lidar-baked texture. The problem is that this process is hit-and-miss and extremely time-consuming.

Basically, how can I georeference aerial imagery?

Does anyone know any other method?

Thanks.

You don’t want to hear this: it has to be prepared beforehand… :frowning:
Because there are different projections (Mercator and others), the geo-data and the image data have to match in the first place, or you have to know both projections and reproject one into the other… that’s a task Blender isn’t meant for (it could be done… with work… a lot of work), but that’s exactly what GIS (geographic information systems) are made for…
(Is there really no projection info in the original lidar data?)
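To illustrate why the two datasets won’t line up by simple scaling: Web Mercator maps longitude linearly but latitude through a logarithmic formula, so the north–south scale stretches with latitude, while lidar point clouds are typically delivered in a metric projected system (e.g. a UTM zone). Below is a minimal sketch of the standard Web Mercator (EPSG:3857) forward projection in plain Python; the coordinates used are made-up examples, and in a real workflow you would let a GIS tool such as GDAL (`gdalwarp`) or the `pyproj` library do the reprojection between the image’s CRS and the point cloud’s CRS.

```python
import math

# WGS 84 / Web Mercator uses a sphere of this radius (metres), per EPSG:3857.
R = 6378137.0

def lonlat_to_webmercator(lon_deg, lat_deg):
    """Project geographic lon/lat (degrees) to Web Mercator x/y (metres)."""
    x = R * math.radians(lon_deg)
    y = R * math.log(math.tan(math.pi / 4 + math.radians(lat_deg) / 2))
    return x, y

# The y value grows nonlinearly with latitude, which is why a flat
# screenshot from Google Maps cannot be linearly stretched to match
# terrain stored in a metric system like UTM.
x, y = lonlat_to_webmercator(13.4, 52.5)  # arbitrary example point
```

If the lidar data does carry its CRS (often stated in the delivery metadata or the LAS header), knowing both EPSG codes is enough for a GIS to warp the aerial image into the point cloud’s system, after which the image should drape onto the Blender plane without manual Photoshop work.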