As you can see, that is what it looks like for me all of a sudden; it's like I'm seeing the inside faces rather than the outside ones. Is this something I did, or is it a bug?
My second issue is the camera: I can't see anything through it.
In Edit Mode, select all your verts and hit W > 'Flip Normals'.
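If it's easier to do from the Python console, here's a rough sketch of the same thing (assuming the mesh you're fixing is the active object):

```python
import bpy

# Assumes the problem mesh is the active object
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.mesh.flip_normals()  # same as W > Flip Normals
bpy.ops.object.mode_set(mode='OBJECT')
```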
Select the camera, go to the 'Data' tab, and change the end clipping distance to 3000 or so.
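Or from the Python console, something along these lines (assuming your camera object is named 'Camera'):

```python
import bpy

cam = bpy.data.objects['Camera'].data  # assumes the camera object is named 'Camera'
cam.clip_end = 3000.0  # push the far clipping plane out so distant geometry shows up
```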
Thanks, changing the camera clipping worked, but flipping the normals didn't. If I look through the camera everything looks fine, but outside the camera everything is flipped. Strange.
Your camera is enormously far away from the object (relative to the size of all the objects), and you compensated for that by using an extreme sensor size (1 mm). Instead you should have changed the camera's focal length.
So, either get the camera much closer and use some realistic camera settings or “zoom in” with the camera and all should work.
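For example, from the Python console you could put the sensor back to a realistic size and do the "zooming" with the focal length instead (the 'Camera' name and the exact values here are just placeholders):

```python
import bpy

cam = bpy.data.objects['Camera'].data  # assumes the camera object is named 'Camera'
cam.sensor_width = 36.0  # back to a realistic full-frame sensor instead of 1 mm
cam.lens = 200.0         # long focal length to "zoom in" on the distant object
```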
Other than that: it seems your object is non-manifold; that's why recalculating the normals fails, as Blender can't determine what the inside and the outside of your object are.
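You can let Blender highlight the problem geometry, either with Select > Non Manifold in Edit Mode or with a quick sketch like this (assuming the mesh is the active object):

```python
import bpy

# Assumes the problem mesh is the active object
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_mode(type='EDGE')
bpy.ops.mesh.select_all(action='DESELECT')
bpy.ops.mesh.select_non_manifold()  # selects open edges, interior faces, etc.
```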
You’re approaching things backwards. You need to scale the objects rather than setting a scaling factor on the scene. The camera had a focal length of 35 mm but a sensor size of only 1 mm… that just doesn’t make sense. It all adds up to confuse how Blender calculates the scene.