Photogrammetry in Blender

How can we do photogrammetry-based modeling in Blender, and where can we find documentation about it?

Blender can’t really do it.

You can look at Insight3d and Photosculpt, maybe one of those is for you:
http://insight3d.sourceforge.net/
http://www.photosculpt.net/

123D Catch (beta) is free at the moment and creates a textured mesh from a series of photographs.

VisualSFM is another option worth a look: http://www.cs.washington.edu/homes/ccwu/vsfm/

Let me preface my comments by saying I haven't tinkered with the new camera tracker in Blender at all (yet), but I was wondering whether it could be used to create geometry from a “flyaround” clip of a subject, just by tracking a bunch of points. Doesn't it create geometry points at the track positions, or did I misread those intro clips?

Regardless, it's my opinion that this functionality could be coded on top of what's there, if the coder had the skills. It would be really handy!
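For what it's worth, the tracker does store a solved 3D position (a “bundle”) for every track, and there's a “3D Markers to Mesh” operator in the Movie Clip Editor that converts them to vertices. A rough, untested bpy sketch of the same idea (the clip name is just a placeholder):

```python
import bpy

def bundles_to_mesh(clip_name="flyaround.mov"):
    """Build a point-cloud mesh from the solved 3D track points of a clip."""
    clip = bpy.data.movieclips[clip_name]

    # Every solved track carries a reconstructed 3D position ("bundle").
    verts = [track.bundle[:] for track in clip.tracking.tracks
             if track.has_bundle]

    mesh = bpy.data.meshes.new("TrackPoints")
    mesh.from_pydata(verts, [], [])  # vertices only, no edges or faces
    obj = bpy.data.objects.new("TrackPoints", mesh)
    bpy.context.scene.objects.link(obj)  # 2.6x API; 2.8+ links via collections
    return obj
```

Skinning that point cloud into an actual surface is the part somebody would still have to code.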

FYI: I recently added photogrammetry support to CADtools:

  1. start with two (or more) photographs of your object
  2. CAMERA_MATCH will help you synchronize cameras/viewpoints to them as background images
  3. then you can trace 3D points with the help of SNAP mode “RAY”: simply pick point after point on both photographs (see the ray-intersection sketch below)
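Geometrically, step 3 comes down to intersecting the two view rays through the picked pixels. Solved rays almost never cross exactly, so one standard trick (a sketch of the idea, not CADtools' actual code) is to take the midpoint of the shortest segment between them:

```python
import numpy as np

def triangulate(o1, d1, o2, d2):
    """Midpoint of the shortest segment between two camera rays.

    o1, o2 -- camera origins; d1, d2 -- normalized view directions
    through the picked pixel on each photograph (numpy arrays).
    """
    w = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b            # ~0 when the rays are parallel
    t1 = (b * e - c * d) / denom     # parameter along ray 1
    t2 = (a * e - b * d) / denom     # parameter along ray 2
    p1, p2 = o1 + t1 * d1, o2 + t2 * d2
    return (p1 + p2) / 2             # best-guess 3D point
```

The gap between p1 and p2 is also a handy sanity check: if it's large, the camera match is probably off.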

I tried the new BLAM script, which solves a camera from a static image (using x/y parallel lines), and it worked OK. I modeled a cityscape from the camera view, but found the buildings diverged when checked in plan view against a Google Earth photo.

EDIT:
I might try 123D Catch (formerly Photofly) from Autodesk in conjunction with the BLAM solve. I wonder how far apart I can shoot projections and still get a reasonable reconstruction?
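For reference, the relation a BLAM-style solve rests on: with square pixels and the principal point at the image centre, the vanishing points of two orthogonal line directions satisfy v1·v2 + f² = 0, so the focal length drops straight out. A tiny sketch of that relation (my own, not BLAM's code):

```python
import math

def focal_from_vanishing_points(v1, v2):
    """Focal length in pixels from two orthogonal vanishing points.

    v1, v2 -- (x, y) vanishing points measured relative to the
    principal point (assumed to sit at the image centre).
    """
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    if dot >= 0:
        raise ValueError("points not consistent with orthogonal directions")
    return math.sqrt(-dot)  # v1 . v2 + f^2 = 0  =>  f = sqrt(-v1 . v2)
```

Small errors here (an off-centre principal point, sloppy parallel-line picks) tilt the whole solve, which would explain buildings splaying apart in plan view.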