It depends entirely on the developers. The thing is, because the commercial apps are developed by big companies, they will always get the bleeding-edge code that pushes the boundaries of visual effects. The lag is already visible when comparing Blender to Instinctive Blender; the latter is developed within a company that uses Blender for its 3D work.
I believe Blender could make it eventually, but only if some 3D gurus in animation work on the code, or at least if their code filters into the open-source world.
In its present state, I think Blender falls a little short of what is required of a production 3D package. Until the missing features are in place, big companies will be hesitant to adapt their workflow to include it.
Tight integration with a first-class renderer is a very big issue. It might be fine for a small animation of good quality, but try to do a cinema-quality movie with no artifacts and you could easily run into problems.
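To make the integration point concrete, here is a rough sketch, in Python with made-up file names and a hypothetical render command (not any real Blender exporter), of the loose, export-based route you are left with when integration is missing: every frame becomes a scene file handed off to a separate renderer process.

```python
# Sketch of the loose, export-based bridge to an external renderer:
# write a scene description to disk, then call the renderer on it.
# File names and the renderer command are placeholders, not a real exporter.
rib_scene = """Display "frame0001.tiff" "file" "rgba"
Projection "perspective" "fov" [40]
WorldBegin
  Translate 0 0 5
  Surface "plastic"
  Sphere 1 -1 1 360
WorldEnd
"""

with open("frame0001.rib", "w") as f:
    f.write(rib_scene)

# A RenderMan-style renderer would then be run on the file, e.g.
#   render frame0001.rib        (hypothetical command name)
# Doing this round trip for every frame of a film-length project is
# where the friction of loose integration really starts to hurt.
```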
I think the main problem would be materials and shaders. Both YafRay and Blender have fixed shader models, which makes them fairly limited in what is achievable. For the required flexibility, some sort of unlimited hierarchical (Maya) or programmable (RenderMan) shader model is essential. You need both that and the renderer behind it so that you can render anything you have ever seen.
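To illustrate the difference, here is a toy sketch (not real Blender or YafRay code) contrasting a fixed shader model, where the renderer exposes a closed set of parameters, with a programmable one, where arbitrary code is evaluated at every shading point; the noise and blending helpers are simplified stand-ins for real shading primitives.

```python
import math

# --- Fixed shader model: a closed set of knobs the renderer understands. ---
# Anything these parameters cannot express is simply out of reach.
fixed_material = {
    "diffuse_color": (0.8, 0.2, 0.2),
    "specular": 0.5,
    "hardness": 50,
}

# --- Programmable shader model: arbitrary code runs at every shading point. ---
def lerp(a, b, t):
    """Blend two RGB tuples by factor t."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def fake_noise(x, y, z):
    """Toy stand-in for a real procedural noise primitive."""
    return (math.sin(x * 12.9898 + y * 78.233 + z * 37.719) + 1.0) / 2.0

def rusty_metal(point, normal, light_dir):
    """Hypothetical surface shader: procedural rust blended over bare metal."""
    rust = fake_noise(*(c * 4.0 for c in point))
    base = lerp((0.70, 0.70, 0.75), (0.45, 0.20, 0.10), rust)
    diffuse = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return tuple(c * diffuse for c in base)

# Evaluate one shading point by hand, just to show that the "shader"
# is ordinary code the artist is free to rewrite however the shot demands.
print(rusty_metal((0.3, 1.2, -0.5), (0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))
```

The point of the contrast is that the first model caps what a look can be at whatever the developers anticipated, while the second only caps it at what the artist can write.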
Blender certainly has the potential, but it needs some work, even if that is just some customization; NURBS in particular need quite a bit of work. In the end, only the developers can really answer this question, because they know where they can reasonably take Blender given their 3D and development knowledge.