More updates to 2016-10-04 draft

This commit is contained in:
Chris Hodapp 2016-10-04 23:23:41 -04:00
parent a70665b03f
commit 7ae40aa751

is much more annoying to account for.
[No, it's not the nodal point. No, it's not the principal point.][npp])
That is, the position information we have is subject to inaccuracies,
and is not sufficient on its own. However, these tools still do a big
numerical optimization, and a starting position that is "close" can
help them along, so we may as well use the information.

Also, these optimizations depend on having enough good data to average
out to a good answer. Said data comes from matches between features
in overlapping images (say, using something like [SIFT][] and
[RANSAC][]). Even if we've left plenty of overlap in the images we've
shot, some parts of scenes can simply lack features (like corners)
that work well for this. We may end up with images for which
optimization can't really improve the estimated position, and here a
guess based on where we think the stepper motors were is much better
than nothing.

(TODO: Stick a photo here to explain features? Link to my CV text?)

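To illustrate (very loosely) what RANSAC contributes here, below is a
toy sketch that fits a line to 2D points despite gross outliers. The
real pipeline matches SIFT descriptors between images and estimates
image transforms rather than lines, and nothing here is taken from
Hugin's actual code; it only shows the sample-and-count-inliers idea:

```python
import random

def ransac_line(points, iters=200, tol=0.5, seed=0):
    """Toy RANSAC: fit y = m*x + b to 2D points containing outliers.

    Repeatedly pick two random points, propose the line through them,
    and keep the proposal with the most inliers (points within `tol`
    vertical distance of the line).
    """
    rng = random.Random(seed)
    best = (0.0, 0.0)
    best_inliers = 0
    for _ in range(iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:
            continue  # vertical pair; skip rather than divide by zero
        m = (y2 - y1) / (x2 - x1)
        b = y1 - m * x1
        inliers = sum(1 for x, y in points if abs(m * x + b - y) <= tol)
        if inliers > best_inliers:
            best_inliers = inliers
            best = (m, b)
    return best, best_inliers

# Mostly points on y = 2x + 1, plus a few gross outliers:
pts = [(x, 2 * x + 1) for x in range(10)] + [(3, 40), (7, -20), (5, 99)]
(m, b), n = ransac_line(pts)
```

A least-squares fit over all 13 points would be dragged badly off by
the three outliers; the consensus step ignores them instead.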
If we look at the [PTO file format][pto] (which Hugin & PanoTools
use), it has pitch, yaw, and roll for each image. Pitch and yaw are
precisely the axes in which the steppers move the camera (recall the
pictures of the rig from the last post); the roll axis is how the
camera has been rotated. We need to know the lens's angle of view
too, but as with other parameters it's okay to just guess and let the
optimization fine-tune it. The nominal focal length probably won't be
exact anyhow.

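For reference, a .pto file carries one `i` line per image; a hand-made
example (the sizes, angles, and filename here are invented) might look
like this, where `f0` marks a rectilinear lens, `v` is the horizontal
angle of view in degrees, and `y`/`p`/`r` are yaw, pitch, and roll:

```
p f2 w8000 h4000 v360 n"TIFF_m"
i w2592 h1944 f0 v53.5 y12.37 p-4.42 r0.00 n"img_0003.jpg"
```

The `p` line describes the output panorama (here, a full 360-degree
equirectangular projection).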
Helpfully, PanoTools provides tools like `pto_gen` and `pto_var`, and
I use these in my script to generate a basic .pto file from the 2D
grid in which I shot images. The only real conversion needed is from
steps to degrees, which for these steppers works out to 360 / 64 /
63.63895 ≈ 0.0884 degrees per step, according to [this][steps].

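As a sketch of that conversion (the grid layout and names below are
hypothetical, not my actual script), turning per-image step counts
into a `pto_var --set` argument looks roughly like:

```python
# Degrees per step for these geared steppers: 360 / 64 / 63.63895
DEG_PER_STEP = 360 / 64 / 63.63895  # ~0.0884

def grid_to_set_string(yaw_steps, pitch_steps):
    """Build a pto_var --set argument assigning yaw/pitch in degrees
    to images numbered in scan order.

    yaw_steps, pitch_steps: per-image stepper positions, in steps from
    the scan's origin (a made-up layout for illustration).
    """
    parts = []
    for i, (ys, ps) in enumerate(zip(yaw_steps, pitch_steps)):
        parts.append("y%d=%.4f" % (i, ys * DEG_PER_STEP))
        parts.append("p%d=%.4f" % (i, ps * DEG_PER_STEP))
    return ",".join(parts)

# A tiny 2x2 grid, 400 steps (roughly 35 degrees) apart on each axis:
yaws = [0, 400, 0, 400]
pitches = [0, 0, 400, 400]
print(grid_to_set_string(yaws, pitches))
```

The resulting string would then go to something like
`pto_var --set "<that string>" -o out.pto in.pto` after `pto_gen` has
produced the initial project file.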
With no refining, tweaking, or optimization, here is how this looks in
Hugin:

(supply screenshot here)

TODO:
- We're using PanoTools and our apparatus together; they can
  cross-check each other.
- Our position info also turns readily into a .pto file which Hugin
  can visualize.
- Conversion for steppers; axes that Hugin/PanoTools use and what I
  use
- dcraw conversion?
[ArduCam]: http://www.arducam.com/camera-modules/raspberrypi-camera/
[forum-raw-images]: https://www.raspberrypi.org/forums/viewtopic.php?p=357138
[PanoTools]: http://wiki.panotools.org/Main_Page
[entrance pupil]: https://en.wikipedia.org/wiki/Entrance_pupil
[npp]: http://www.janrik.net/PanoPostings/NoParallaxPoint/TheoryOfTheNoParallaxPoint.pdf
[steps]: https://arduino-info.wikispaces.com/SmallSteppers
[SIFT]: https://en.wikipedia.org/wiki/Scale-invariant_feature_transform
[RANSAC]: https://en.wikipedia.org/wiki/RANSAC
[pto]: ???