In Cycles nodes it is possible to add vectors and do a few other math operations on them, but strangely I have not found a way to simply define a vector from its X, Y, Z coordinates: a node that would take X, Y and Z as three inputs and output a vector. Seems simple. Any idea?
Actually, the reason I need this is that I want to introduce some randomness into a generated texture. Say I have a plane with a procedural texture, a cloud texture for instance. I have several instances of this plane and I do not want the clouds on these planes to look exactly the same, but they do. My first idea was to use a Mapping node and add a random X and Y displacement, but the Mapping node does not have actual inputs for values such as the X location. My second idea was to take the vector output of a Texture Coordinate node, either Generated or UV, and do a vector math operation on it, adding a random vector built from the Random output of the Object Info node. It could work, if only I could build a vector output from three coordinates. Then again, maybe the whole approach is not the best. What do you think?
As regards the simple question of defining a vector from its X, Y, Z coordinates as inputs, there does not seem to be such a "define vector" node in the node editor, but I found I could use the Combine RGB node for this purpose: connect X, Y and Z into the R, G and B inputs respectively, then use the Image output as a vector, which can be connected to any vector (purple) input.
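For what it's worth, here is a small Python sketch of the same trick done through Blender's scripting API instead of the node editor. The material name and the Noise Texture target are just examples I picked, and this assumes a Blender version that still has the Combine RGB shader node:

    import bpy

    # Sketch only: the "Combine RGB used as a vector" trick, built in Python.
    mat = bpy.data.materials.new("CombineRGBDemo")   # example material name
    mat.use_nodes = True
    nodes = mat.node_tree.nodes
    links = mat.node_tree.links

    combine = nodes.new('ShaderNodeCombineRGB')      # R, G, B stand in for X, Y, Z
    combine.inputs['R'].default_value = 0.5
    combine.inputs['G'].default_value = 0.25
    combine.inputs['B'].default_value = 0.0

    noise = nodes.new('ShaderNodeTexNoise')          # any node with a vector (purple) input
    links.new(combine.outputs['Image'], noise.inputs['Vector'])  # colour output into a vector input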
Now for the more interesting question of introducing some randomness into the texture mapping, here is how I have done it. I add a Mapping node between the Texture Coordinate output (be it Generated, UV or any other) and the texture node. Then I add a Vector Math node set to Add, which combines the texture coordinate with some other vector before connecting into the Mapping node. Now I have to find something to add to that vector, and I use an Object Info node. If I take the Location output, which is a vector, and add it to the Texture Coordinate vector, I get a texture mapping that depends on the object's location, which could be fine in some cases, but not if the object is moving. What works best is to take the Random output of Object Info and plug it in as a vector, adding it to the Texture Coordinate vector. Plugging a value into a vector input works: the value x is treated as the vector (x, x, x).
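In case it is useful to anyone scripting this, a rough Python equivalent of that node chain could look like the following. The material name is arbitrary and I am using a Noise Texture as a stand-in for the cloud-like texture:

    import bpy

    # Sketch of the chain: Texture Coordinate + Object Info "Random"
    # -> Vector Math (Add) -> Mapping -> texture.
    mat = bpy.data.materials.new("RandomOffsetClouds")   # arbitrary name
    mat.use_nodes = True
    nodes = mat.node_tree.nodes
    links = mat.node_tree.links

    tex_coord = nodes.new('ShaderNodeTexCoord')
    obj_info  = nodes.new('ShaderNodeObjectInfo')
    add       = nodes.new('ShaderNodeVectorMath')
    add.operation = 'ADD'
    mapping   = nodes.new('ShaderNodeMapping')
    noise     = nodes.new('ShaderNodeTexNoise')          # stand-in for the cloud texture

    links.new(tex_coord.outputs['Generated'], add.inputs[0])
    links.new(obj_info.outputs['Random'], add.inputs[1])  # a value, treated as (x, x, x)
    links.new(add.outputs['Vector'], mapping.inputs['Vector'])
    links.new(mapping.outputs['Vector'], noise.inputs['Vector'])

Since the Random output differs for each object, every plane sharing this material ends up with its own offset.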
And for the wider question of working with randomness within the node editor, I found this very interesting tutorial that uses drivers and the Python random function.
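In case it helps, that approach usually boils down to registering a helper in the driver namespace and then calling it from a driver expression on a node value. The function name below is my own choice, and a helper like this has to be re-registered each time the file is loaded:

    import bpy
    import random

    # Helper for driver expressions: type e.g.  rand01(7)  as the expression
    # of a driver added to a node value (such as a Mapping node's location).
    def rand01(seed):
        return random.Random(seed).random()

    bpy.app.driver_namespace['rand01'] = rand01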
There are a few ways to do what you want. The simplest is creating an empty node group with just a vector input and output; this will show the vector input widget on the group node. But you can also construct your own vector using the Combine RGB node: colours and vectors are both just three floats, so this works, though it is not intuitive at first.
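If someone wants to set the group idea up from Python, a minimal sketch could be the following, assuming a pre-4.0 Blender where group sockets are created through the tree's inputs/outputs collections:

    import bpy

    # Sketch: an "empty" node group exposing just a vector input and output.
    group = bpy.data.node_groups.new("VectorInput", 'ShaderNodeTree')
    group.inputs.new('NodeSocketVector', 'Vector')
    group.outputs.new('NodeSocketVector', 'Vector')

    group_in  = group.nodes.new('NodeGroupInput')
    group_out = group.nodes.new('NodeGroupOutput')
    group.links.new(group_in.outputs['Vector'], group_out.inputs['Vector'])

    # It can then be dropped into any material as a Group node:
    # node = mat.node_tree.nodes.new('ShaderNodeGroup'); node.node_tree = group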
Thanx Gexwing, indeed the 3rd solution is the one I need, since the vector coordinates must be connected as inputs.
And the idea of packaging it into a group is great, it hides the strange usage of a color for a vector.