I don’t see why the devs would have changed the generated UVs on the road from BI to Cycles; it would break a lot of old scenes. Well, if they did change it, they had a reason. So… that’s another question to ask in the forum: is there a difference between the 2? I honestly don’t know. I’ve only really started doing renderings with Cycles on GPU. The nodes and the procedural materials are my thing. <3
For the rest, check the Image Texture node: there’s a dropdown labelled “Flat”. That’s where you can switch to Box mapping. Flat means projected along the normals of the faces; that’s the usual way. Box mapping projects along the 3 local axes of the object. It was added for architectural work, so that you can easily apply textures to the walls, floor and ceiling… which are like the faces of a box.
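The idea behind Box mapping can be sketched in a few lines of plain Python. This is just my own toy illustration of the principle (not Blender’s actual shader code, and ignoring the blend between projections): for each point, pick the projection plane from whichever axis the face normal points at most strongly, and use the other two local coordinates as UV.

```python
def box_mapping_uv(point, normal):
    """Toy sketch of Box mapping: project along the dominant axis of the
    face normal, returning the other two local coordinates as (u, v)."""
    x, y, z = point
    nx, ny, nz = (abs(c) for c in normal)
    if nx >= ny and nx >= nz:
        # Face points mostly along X -> project onto the YZ plane (a "wall")
        return (y, z)
    elif ny >= nx and ny >= nz:
        # Mostly along Y -> project onto the XZ plane (the other walls)
        return (x, z)
    else:
        # Mostly along Z -> project onto the XY plane (floor / ceiling)
        return (x, y)

# A point on an upward-facing floor gets its XY coordinates as UV:
print(box_mapping_uv((1.0, 0.25, 0.75), (0.0, 0.0, 1.0)))  # -> (1.0, 0.25)
```

That’s why it just works on box-like rooms: each wall automatically gets the projection axis that faces it head-on, with no UV unwrapping needed.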
Camera and Window coordinates are really strange beasts… They work mostly in 2D: the texture is projected onto the object from the camera. The difference between them is a matter of scale. In Camera coordinates, the texture keeps its dimensions and is tiled to fill the bounding box of the object.
In Window coordinates, the texture is stretched or squeezed to fit the screen, independently of the size and distance of the objects.
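To make the scale difference concrete, here’s a toy sketch under my own assumptions (the function names and the `tile` parameter are made up; this is not Blender’s shader code): Window coordinates normalise by the render size, so the texture always spans the frame exactly once, while Camera-style coordinates use camera-space X/Y directly, so the texture keeps its own size and repeats over bigger or closer objects.

```python
def window_uv(px, py, width, height):
    # Window-style: normalise by the render size, so the texture
    # stretches or squeezes to fit the frame exactly once.
    return (px / width, py / height)

def camera_uv(x_cam, y_cam, tile=1.0):
    # Camera-style (simplified): use camera-space X/Y directly, wrapped,
    # so the texture keeps its dimensions and tiles across the object.
    return ((x_cam / tile) % 1.0, (y_cam / tile) % 1.0)

# Centre of a 1920x1080 render always maps to the middle of the texture:
print(window_uv(960, 540, 1920, 1080))  # -> (0.5, 0.5)
# A point 2.5 units across in camera space lands 2.5 tiles in:
print(camera_uv(2.5, 0.0))             # -> (0.5, 0.0)
```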
I don’t see much practical use for Camera coordinates except to create some cheap and easy mist in a static scene. And Window coordinates can be used to keep an image in the background of whatever the camera sees. Anyway, these 2 beasts have only very specific and limited uses on objects.
Note: The snapshots show 2 planes emitting cubes as particles in a grid pattern. The diamond-shaped emitter is farther behind.