I've been looking into this a bit, since it does seem to have some very useful possibilities, but I also see a few limitations the OP may not have anticipated.
First, since it’s not a “native” Blender function, it would have to be scripted. That’s not a huge problem, but probably not one that will involve nodes, since AFAIK PyNodes are still a ways off(?)
Looking at the API for 2.48a (the most current), there are two instance variables on the Image object that might help here – animOffset and animStart, which I assume correspond to Offs: and StartFr: in the Image>Movie UI panel (if not, I’m sure there are variables that do – needs testing). Using Offs: & StartFr: in the UI, a movie can be set to start playing at any point in the Timeline, from any frame in the movie, which seems to be the OP’s main goal.
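To make this concrete, here’s a rough sketch of what the scripted side might look like. It assumes animOffset and animStart really are the attributes behind Offs: and StartFr: (untested – the names may differ in 2.48a), and the Blender import is guarded so the helper can be exercised outside Blender too:

```python
# Sketch only: assumes the Blender 2.48a BPy API, and that Image.animOffset /
# Image.animStart correspond to the Offs: / StartFr: UI fields (needs testing).
try:
    import Blender  # only available when run inside Blender 2.4x
    HAVE_BLENDER = True
except ImportError:
    HAVE_BLENDER = False

def set_movie_params(image_name, offset, start_frame):
    """Set a movie texture's Offs: and StartFr: values on the named Image.

    Returns the (offset, start_frame) integers actually applied."""
    if not HAVE_BLENDER:
        # Outside Blender: just report what would have been set.
        return (int(offset), int(start_frame))
    img = Blender.Image.Get(image_name)
    img.animOffset = int(offset)
    img.animStart = int(start_frame)
    Blender.Window.RedrawAll()  # so the change shows up in the UI
    return (img.animOffset, img.animStart)
```

A script link on the Scene’s FrameChanged event would presumably call something like this once per frame.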
However, these variables aren’t animatable via the UI, so some other means will be needed, such as a proxy object in the Scene that can be animated by “normal” means and return values a script can use to set the movie Texture’s parameters. Example: an Empty (call it “MovieProxy”) is animated in two dimensions, X & Z, where X supplies the value for Offs: and Z supplies the value for StartFr:. The values will need numeric type conversion to match the variables (e.g., float to integer). The user can basically ignore the Empty in the Scene and use only its associated IPO curves (probably in Constant mode) to do the dirty work.
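The float-to-integer conversion step is trivial but worth pinning down, since truncation vs. rounding matters when IPO curves land between whole frames. Something like this (pure Python; the X/Z arguments stand in for whatever location values the script actually samples off the MovieProxy Empty):

```python
def proxy_to_params(loc_x, loc_z):
    """Convert the MovieProxy Empty's float X/Z locations into the integer
    Offs: and StartFr: values the movie Texture expects.

    round() rather than int() so a curve value of 29.9 maps to frame 30,
    not 29, which would be off by one for a whole frame."""
    offs = int(round(loc_x))
    start_fr = int(round(loc_z))
    # StartFr: in the UI appears to be 1-based, so clamp to at least 1
    # (an assumption on my part - needs checking against the panel).
    return (offs, max(1, start_fr))
```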
However, there’s a cloud attached to this silver lining: at any one moment in a sequence there may be a need for two different sets of parameters simultaneously. If only one MovieProxy is used (and only one can be used per Texture/Material, as I currently see it) this can’t be done, since an IPO curve cannot return two values at once. The need for simultaneous values becomes clear if you visualize what’s happening: from a movie that is a continuous series of images, say the first of two segments uses frames 10–30 of the movie Texture; immediately after that, the same MovieProxy jumps to frames 60–90. The abrupt transition from frame 30 to frame 60 of the image texture will be quite visible unless the movie is produced very carefully to prevent such “jump cuts” between the portions being used – a nearly impossible task, I’d think.
So it seems likely that at least two essentially identical Materials using the same movie Texture would be required. In addition to the MovieProxy modulations of Offs: and StartFr:, there would have to be transitions in the Materials’ Alpha channels – very fast cross-fades that dissolve the time-disjointed sections of the movie Texture over one another, disguising the jumps from one section to the next. Sort of an “A-B roll” setup, if you’re familiar with that editing terminology.
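The cross-fade itself is just complementary Alpha ramps on the two Materials; the arithmetic would be something like this (pure Python sketch with hypothetical frame numbers – a linear ramp, though a smoothed IPO curve would likely look better):

```python
def crossfade_alphas(frame, fade_start, fade_len):
    """Return (alpha_A, alpha_B) for an A-B roll style cross-fade:
    Material A fades out while Material B fades in over fade_len frames,
    starting at fade_start."""
    if frame <= fade_start:
        t = 0.0
    elif frame >= fade_start + fade_len:
        t = 1.0
    else:
        t = (frame - fade_start) / float(fade_len)
    return (1.0 - t, t)
```

With a short fade_len (a handful of frames) this should read on screen as a quick dissolve rather than a jump cut.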
Does this sound like too much work for the resultant effect? Good question.
So far this is all just a “thought experiment” based on a couple of hours playing around with movie textures and digging in the API. No doubt there are problems I haven’t anticipated, but then again it may turn out to be easier than I’ve described if some genius scripter can shoot some holes in my reasoning.
It also sounds like an ideal task for the VSE – small segments of a movie placed in any time order with user-controlled transitions between them – if only one could pipe the VSE output into a Material so the results could be previewed before rendering out the VSE content.
The Materials Nodes Editor’s lack of Time and Image input nodes such as those in the Compositor prevents using it for a similar approach.
A nodal approach, even if an Input>Time node existed for Materials, is also impractical because Time nodes are much less flexible than a true IPO curve: curves can be created in a Time node, but making them work with any precision is a mind-numbing task (I’ve tried a few times) due to the lack of accurate scales and the cramped node workspace. The IPO Editor is made for this kind of thing. As long as I’m pipe-dreaming, how about a node for the Node Editor (Material and Compositor) that takes its values from an IPO – like a super-accurate Time node?