I am trying to create a shader that follows the camera and uses a texture that follows the pixel coordinates on screen. The best and only way I know how to do this is by plugging the Window output of the Texture Coordinate node into the Vector socket of a Mapping node, then plugging that into a texture. However, the Window socket gives values from 0 to 1 rather than pixel coordinates, so the texture ends up warped based on the window size. Is there a way to get the window size in the shader node graph, or another way to work around this problem?
Use Map Range nodes: separate the Window coordinates with a Separate XYZ node, map the X to your width and the Y to your height with Map Range nodes, then recombine them with a Combine XYZ node.
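For reference, here is a minimal bpy sketch of that node setup, not a definitive implementation: it assumes a node-based material named "ScreenSpace" (hypothetical, rename to match yours) and uses the scene's render resolution as a stand-in for the window size. The same wiring can of course be done by hand in the shader editor.

```python
import bpy

# Hypothetical material name; replace with your own node-based material.
mat = bpy.data.materials["ScreenSpace"]
nodes = mat.node_tree.nodes
links = mat.node_tree.links

# Render resolution used as a stand-in for the window/output size (assumption).
res_x = bpy.context.scene.render.resolution_x
res_y = bpy.context.scene.render.resolution_y

tex_coord = nodes.new("ShaderNodeTexCoord")
separate  = nodes.new("ShaderNodeSeparateXYZ")
map_x     = nodes.new("ShaderNodeMapRange")
map_y     = nodes.new("ShaderNodeMapRange")
combine   = nodes.new("ShaderNodeCombineXYZ")
image_tex = nodes.new("ShaderNodeTexImage")  # assign your image to this node

# Remap the 0-1 Window coordinates to pixel coordinates
# (From Min/Max stay at the default 0-1 range).
map_x.inputs["To Max"].default_value = res_x
map_y.inputs["To Max"].default_value = res_y

links.new(tex_coord.outputs["Window"], separate.inputs["Vector"])
links.new(separate.outputs["X"], map_x.inputs["Value"])
links.new(separate.outputs["Y"], map_y.inputs["Value"])
links.new(map_x.outputs["Result"], combine.inputs["X"])
links.new(map_y.outputs["Result"], combine.inputs["Y"])
links.new(combine.outputs["Vector"], image_tex.inputs["Vector"])
```

Note that because the Window coordinates only cover the rendered/viewport area, changing the output resolution means updating the two To Max values (or driving them with drivers) to keep the texture unwarped.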