Why have the Blender devs or the community never thought of this?

I have posted about this on the Right-Click Select website, but I want to make clear that I have no intent of promoting my article. For those who have come upon it, please note that I'm not promoting it; I just want a little discussion around this idea.

So, on to the topic: why has no one thought of implementing transform orientations in the UV editor?


It seems so simple (coming from me, who has no coding experience or practice).

I got this idea from using the useful rmkit addon to edit my UVs.

Same thing with proportional editing in Pose Mode.

I want your thoughts on such simple yet useful changes (more like tiny tweaks) in Blender.

Yeah, why not? It could come in handy for manipulating UVs.

Do you mean you have a working prototype?


Well, I don't have one :sweat_smile:

Well… a face in 3D does have a normal, and to get the other two directions one could use the Z axis to derive some “direction”. How do you think you would define the two “directions” of any face in 2D?

For a rectangle this might look easy, but for example for this face…

… what do you think the face-dependent U-V axes should be now?

So probably no one thought of it, because it is not geometrically defined and therefore not usable/implementable by any algorithm.


Right angle to a given edge?

Well… of course one could use the vertex indices and take the first two vertices as the face-dependent U direction… yes. But then an almost identical polygon with different vertex indices may end up with some other “direction”…
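To make that concrete, here is a minimal Python/bmesh sketch (my own illustration, not an existing Blender feature) of such an index-based face “U direction” in UV space; it assumes Edit Mode and an active UV layer, and it shows why the result depends entirely on loop order:

```python
# Sketch only: derive a per-face "U direction" in UV space from the first
# two corners (loops) of each selected face. Assumes Edit Mode and an
# active UV layer; the function name is hypothetical.
import bpy
import bmesh
from mathutils import Vector

def face_uv_direction(face, uv_layer):
    """Unit vector from the face's first UV corner to its second one."""
    uv_a = face.loops[0][uv_layer].uv
    uv_b = face.loops[1][uv_layer].uv
    direction = uv_b - uv_a
    return direction.normalized() if direction.length > 1e-8 else Vector((1.0, 0.0))

bm = bmesh.from_edit_mesh(bpy.context.edit_object.data)
uv_layer = bm.loops.layers.uv.active

for face in bm.faces:
    if face.select:
        # Two faces with identical shape but different loop order can
        # report completely different "directions" here.
        print(face.index, face_uv_direction(face, uv_layer))
```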

But there is also another question:
In 3D this face orientation is used to build other geometry oriented the same way as the reference face, and especially in the same XY plane. In 2D we do not need this. The only use case that came to my mind is something like a brick texture that is slanted, where one wants to push the UVs along that slant… but it would be better to de-slant the texture…

So… implementing this in 2D as well would be extra work… and so, simply:

Someone thought of it but there was no request for it and there were other things to do.

 


P.S.: In fact, when you have different polygons on the XY plane and define a view transformation for each one, they all have the same (or opposite) normal but different orientations… so the decision of picking some (different) direction is already somehow implemented, but may lead to different results depending on vertex indices… it's simply not used in the 2D implementation… :person_shrugging:

So if someone wants to maintain this code…


Sorry for the late reply :sweat_smile: I was busy with life, as we all are these days, but here is my reply.


You see how Blender calculates a perfect median point using the normals; hypothetically, the same could be applied to UVs, but without normals.

I'm not a programmer, so I can't get into the technicalities of things; if I were, I could give a more detailed answer.

As I said: in 3D space you have the normal, but in 2D (UV) space all normals are the same; they would all point straight up out of the Euclidean plane, making it 3D.

So you are giving an example in 3D and think it is applicable in 2D, but this is a fallacy… There is no proof by example in math… (especially when the example uses something more powerful, so the example can't work in the lesser structure).

[ You can't prove that "every human speaks English" by presenting one single human who speaks that language as an example. But you can disprove the statement "no one speaks English" with a counterexample: presenting one single human who does. ]


Try to turn this into an implementable idea that can then be submitted to the development team. And maybe that is a very precise (refined…) description of your situation… “not how to do it.”


There is a misunderstanding: I'm not actually saying how it should be done or how the idea should be executed. I'm just showing people the idea; I'm using my screenshots to express it.

I don't know the technicalities of it, so I agree it's better to leave this to the development team.

But an idea alone will lead to nothing…

Ideas by themselves are useless if there isn't some thought included on how to put them into practice.

So… when the questions

  1. For what exactly is this usable/valuable/needed (use case)?
  2. How should this behave, repeatably, in 2D?

…aren't answered properly, or are even unanswerable, then this is (you might not want to hear this) simply a phantasm… :person_shrugging:

(For 1: I have never heard of any UV editor in any major 3D app where you can do this… but someone might know?)

I think it would be easy enough to derive a vector from two selected UV elements and use that as a custom transform orientation. But indeed you can't do that from a single face; you need two separate things.
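A rough sketch of that, assuming Edit Mode, UV sync selection off, and exactly two selected UV corners (the helper names are mine, and this is not how Blender's transform system is actually wired up): take the two selected UVs, normalize their difference, and project any move onto that axis.

```python
# Sketch: build a custom 2D "orientation" axis from two selected UV corners
# and constrain a translation to it. Hypothetical helpers, not a Blender API.
import bpy
import bmesh
from mathutils import Vector

def selected_uv_axis():
    """Unit vector between the first two selected UV corners found."""
    bm = bmesh.from_edit_mesh(bpy.context.edit_object.data)
    uv_layer = bm.loops.layers.uv.active
    picked = []
    for face in bm.faces:
        for loop in face.loops:
            loop_uv = loop[uv_layer]
            if loop_uv.select:
                picked.append(loop_uv.uv.copy())
                if len(picked) == 2:
                    return (picked[1] - picked[0]).normalized()
    return None  # fewer than two selected UV corners

def constrain_delta(delta, axis):
    """Project a 2D move onto the custom axis (like an axis-locked grab)."""
    return axis * delta.dot(axis)

axis = selected_uv_axis()
if axis is not None:
    move = Vector((0.10, 0.03))  # some hypothetical mouse delta in UV space
    print("constrained move:", constrain_delta(move, axis))
```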


You just need a perpendicular vector to a line. It's easier to calculate in 2D than in 3D. The hard part is the user interface, which might already be there in Blender? The biggest problem is that you would be generating new geometry, and how that would work I'm not sure.

UV Space is U 0-1 and V 0-1. You could possibly calculate any vector from that. You might not need to use the geometry.

Select any one point followed by a second one and simply infer the angle from the ratio, for example:

Point 1:

U= 0
V= 0

Point 2:

U = .5
V = .5

This would give you a 45 degree angle.

But the formula would have to work with any two U and V inputs for the first point against the two U and V inputs for the second point.

I am not good at higher math. But I am pretty sure some genius at math could do this on a napkin. It’s about calculating angles from line distances.

The only data you would then need for manipulation is the resultant angle.

Blender has Selected and Active as data points that I am sure could be referenced in code to then calculate this angle.

I just don't have any idea how this could be implemented as code that would constrain transforms within the UV editor, or if it is even possible.

Nor do I understand the need for it.

But I am pretty sure this math would work.

Or you might even be able to simply use a ratio: the ratio of the distance along U against the distance along V from the first point.

Edit: this is the math I am referring to. You could draw a right triangle in UV space from any one point on the grid to a second one. Get the lengths of the sides and you have your angle.

If you need a perpendicular to that, just add or subtract 90 degrees.
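For what it's worth, that math really does fit on a napkin. Here is a tiny Python sketch (standard library only, names are mine) that reproduces the 45-degree example above and the plus-or-minus-90-degree perpendicular:

```python
# Angle of the line between two UV points, measured against the U axis.
# Matches the worked example above: (0, 0) -> (0.5, 0.5) is 45 degrees.
import math

def uv_angle_deg(p1, p2):
    """Angle in degrees of the line from p1 to p2, relative to the U axis."""
    du = p2[0] - p1[0]
    dv = p2[1] - p1[1]
    return math.degrees(math.atan2(dv, du))

angle = uv_angle_deg((0.0, 0.0), (0.5, 0.5))
print(angle)         # 45.0
print(angle + 90.0)  # a perpendicular direction: 135.0
```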

Of course someone could pick any vertex of any polygon and get some “direction” by taking “the next one”.

But this “any” makes it quite arbitrary, and as I mentioned above:

In 3D space the orientation of a polygon face makes sense, because you may want to add another face in the same plane, or at a specific angle to one of its edges, to construct some geometry.
But what does someone need a specific direction along one edge of a face in UV space for, if not along the U or V axis and maybe the diagonal (doable with the grid)?
Usually one wants consistent texel density and almost no texture distortion, so UV unwrapping mostly needs sophisticated unwrapping algorithms and users with some experience.

Don't you think that if something as simple as “moving along any UV edge” would solve so many problems, someone would have thought of it already?

So every developer in any software project will ask not how easy some “feature” is to do, but how much it is needed by the majority of users, so that they can spend their time effectively on the things that are needed.

As I also mentioned: “professional” tools (for example Maya's UV editor, Modo) or some additional addons (for example TexTools, RizomUV) do not have this, but I might have overlooked it.

So if developers spent time making those UV editors, and no “customer” requested such a “simple” method, nor did any manager want such a thing to “increase” the feature list, then the reason might simply be: it's not needed, nor that useful?

And yet… there was no “presentation” of a use case to show where this would be handy…

So:

  • Moving a UV face along one of its own UV edges would break the continuity of the texture.
  • Moving a UV vertex of a UV face along a UV edge:
    – if this happens between two UV faces it would again break the texture continuity… or otherwise
    – one can set the 2D Cursor to the other vertex of that edge, use it as the pivot point, and scale…
    – or scale the whole edge using the median point as the pivot point.
  • Moving a whole UV island along some UV edge only makes sense if the island has some “edge direction” in common, but as in the first point it would break the texture continuity.

I agree. Not much of a use case, in my opinion. I just found the math problem an interesting exercise, given that you want it related to some aspect of the UV island. Otherwise all you'd need is a dial.

By “dial”… do you mean something like “move along a certain angle”? Or (simply) translation?

Well… the more universal mathematical functions are the transformations, especially the most used ones: move, rotate, scale, and sometimes shear…

And this might also have a reason: most artists simply do not use any of the others and just move things “by eye”? This may be different if you talk to a CAD user.

(That's also the reason why I used the words artist and user :wink:.)

Pretty sure it did get discussed back when transform orientations were first introduced. In conclusion, nobody was against the idea; there just was nobody around to implement it.

Note that UV editing in Blender is implemented in an entirely separate context of its own, and it shares very little with the 3D modeling editor. So there is no sharing of tool code or resources here.

It would basically need an implementation from scratch.


I don't know if this was discussed before, or why other apps also don't do that (maybe people and devs are simply used to doing it this way?):

For 2D editing of the UVs, the 3D Viewport could be used by simply “pinning” the view as in the quad view (where Lock Rotation is enabled by default)… so one could use every 3D transformation feature, restricted to 2D.

(There is already the possibility to transform the 3D topology into its UV-map representation via Geometry Nodes.)
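For those who prefer a script over nodes, here is a rough bmesh sketch of the same idea (my own, not the Geometry Nodes setup mentioned above); it simply moves every vertex to one of its UV coordinates, so UV seams and split islands are ignored:

```python
# Sketch: flatten the mesh being edited into its UV layout by moving every
# vertex to (u, v, 0) taken from one of its face corners. UV seams are
# ignored (split islands stay welded), so this is only an approximation.
import bpy
import bmesh

obj = bpy.context.edit_object
bm = bmesh.from_edit_mesh(obj.data)
uv_layer = bm.loops.layers.uv.active

for face in bm.faces:
    for loop in face.loops:
        u, v = loop[uv_layer].uv
        loop.vert.co = (u, v, 0.0)  # last-written corner wins per vertex

bmesh.update_edit_mesh(obj.data)
```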

Yeah I’ve brought up Silo as an example of that in the past myself. But I presume that would also require a very extensive rework of the 3D editor to enable it to be used and integrated that way.