I have been playing with camera tracking for the first time today, and it's pretty impressive.
I did, however, encounter problems with the tracking points, and was wondering whether any improvements to this function are in the pipeline?
Aside from dabbling in 3D modelling, I am also an astronomer and do some astrophotography. Feature tracking within image sequences/movie files has been a staple of the amateur astronomy community for a number of years now, and is implemented in a number of freeware programs like Registax (http://www.astronomie.be/registax/). The feature tracking in such programs is quite advanced (I know some programs apply a Fast Fourier Transform to the search/feature region) and can even handle low-contrast features such as details on the Moon's surface or markings on the planet Jupiter.
I believe the developers of Blender's camera tracking functions may find it useful to look at how these programs implement feature tracking, as there is a lot of commonality between what Blender is trying to achieve and what these programs can already do. The tracking features in these astro-imaging programs are quite mature, and there may be an opportunity to learn from them.