Added Ion cross-post and pi pan-tilt article & images

This commit is contained in:
Chris Hodapp 2016-09-25 16:29:09 -04:00
parent 407ab43896
commit 2e7c7dc8e3
5 changed files with 116 additions and 0 deletions

(Two binary image files added, 1.3 MiB and 3.7 MiB; not shown.)

@ -34,6 +34,10 @@ continuing there full-time when I graduated in 2010. I did some
interesting work there combining software development and research
around signal processing, image processing, and computer vision.
While at UC, I happened into some [cinci2600](https://cinci2600.org/)
meetings and eventually became a founding member of [Hive13][]. I'm
regularly found there, and on occasion give talks.
Around 2014 I moved to a startup, Urbanalta, and here I designed PCBs
and wrote embedded software. This is where I started using Haskell
more intensively on the embedded side, eventually co-creating
@ -61,3 +65,4 @@ computational photography, machine learning.
[HaskellEmbedded]: https://haskellembedded.github.io/
[Ion]: https://haskellembedded.github.io/
[Hive13]: http://hive13.org/


@ -0,0 +1,10 @@
---
title: Post at HaskellEmbedded - Introducing Ion
author: Chris Hodapp
date: September 23, 2016
tags: haskell, haskellembedded
---
Just a quick note: I finally released my Ion library (it was long
overdue), and wrote a post about it over at
[HaskellEmbedded](https://haskellembedded.github.io/posts/2016-09-23-introducing-ion.html).


@ -0,0 +1,101 @@
---
title: Raspberry Pi pan-tilt mount for huge images, part 1
author: Chris Hodapp
date: September 23, 2016
tags: photography, electronics, raspberrypi
---
Earlier this year I was turning around ideas in my head - perhaps
inspired by Dr. Essa's excellent class,
[CS6475: Computational Photography][cs6475] - about the possibility of
making an inexpensive, relatively turn-key rig for creating very
high-detail photographs, ideally in HDR, and taking advantage of
algorithms, automation, and redundancy to work with cheap optics and
cheap sensors. What I had in mind had a commonly seen starting point
for making panoramas - something like a telephoto lens mounted on a
pan-tilt gimbal, with software behind it responsible for shooting the
right pattern of photographs, handling correct exposures, capturing
all the data, and stitching it.

My aim wasn't so much to produce panoramas as it was to produce very
high-detail images, of which panoramas are one type. I'd like it to
work for narrow angles of view too.

Most of my thoughts landed at the same inevitable conclusion: this
would require lots of custom hardware and electronics, and perhaps a
mobile app on top of that to handle all of the heavy computation.

Interestingly, this whole time I had several Raspberry Pis, an
[ArduCam][] board, work history that had familiarized me with some of
the cheaper telephoto M12 and CS mount lenses, and access to a
[hackerspace][hive13] with laser cutters and CNC machines. Eventually
I hit on the rather obvious idea that the Pi and ArduCam would
probably do exactly what I needed.

A few other designs (like [this][makezine] and [this][scraptopower])
offered some inspiration, and after iterating on a design a few times
I eventually had something built mostly out of laser-cut plywood,
hardware store parts, and [cheap steppers][steppers]. It looks
something like this, mounted on a small tripod:

[![](../images/2016-09-25-pi-pan-tilt-1/IMG_20160912_144539.jpg){width=100%}](../images/2016-09-25-pi-pan-tilt-1/IMG_20160912_144539.jpg)

I am able to move the steppers thanks to [Matt's code][raspi-spy] and
capture images with [raspistill][]. I put together some code to move
the steppers in a 2D grid pattern of a certain size and number of
points. (Side note: raspistill can
[capture 10-bit raw Bayer data][forum-raw-images] with the `--raw`
option, which is very nice. I'm not doing this yet, however.)

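
The grid-scan ordering is simple enough to sketch. The sketch below is
not the actual code - the helper name is hypothetical, and the real
rig also steps the motors (via RPi.GPIO, as in Matt's code) and shells
out to raspistill at each position - but a serpentine ordering is one
reasonable choice, since it keeps the pan axis from rewinding across
its full range between rows:

```python
def grid_positions(pan_count, tilt_count, pan_step, tilt_step):
    """Yield absolute (pan, tilt) stepper targets covering a 2D grid.

    Serpentine ordering: every other row is traversed in reverse, so
    the pan axis never has to sweep back across its whole range.
    """
    for row in range(tilt_count):
        cols = range(pan_count)
        if row % 2:
            cols = reversed(cols)
        for col in cols:
            yield (col * pan_step, row * tilt_step)

# At each target the rig would step the motors into place and then
# shell out to raspistill, roughly:
#   subprocess.run(["raspistill", "-o", f"grid_{i:03d}.jpg"])
for i, (pan, tilt) in enumerate(grid_positions(3, 2, 512, 256)):
    print(i, pan, tilt)
```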
It's still rather rough to use, but it worked well enough that I
picked up a [25mm M12 lens][25mm-lens] - still an angle of view of
about 10 degrees on this sensor - and set it up in the park for a test
run:

[![My shot's not slanted, the ground is](../images/2016-09-25-pi-pan-tilt-1/IMG_20160918_160857.jpg){width=100%}](../images/2016-09-25-pi-pan-tilt-1/IMG_20160918_160857.jpg)

The laptop is mainly there so that I can SSH into the Pi to control
things and use [RPi-Cam-Web-Interface][] to focus the lens. The red
cord is just Cat 6 connecting their NICs directly; the Pi is running
off of battery here. If I'd had a wireless adapter on hand (or a
Raspberry Pi 3) I could probably have set up a WiFi hotspot on the Pi
and done all of this from a phone.

I collected 40 or 50 images as the steppers moved through the grid.
While I fixed the exposure and ISO values with raspistill, I didn't
attempt any bracketing for HDR, and I left white balance at whatever
the camera module felt like doing, which almost certainly varied from
picture to picture. Automatic white balance won't matter once I start
using the raw Bayer data, but for this first attempt at stitching I
used only the JPEGs, which already had white balance applied.

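
For reference, "fixing the exposure and ISO" just means passing
explicit raspistill flags. The flags below are real raspistill
options; the helper and the specific values are made up for
illustration, not what I actually ran:

```python
def capture_cmd(path, shutter_us=4000, iso=100, raw=False):
    """Build a raspistill command line with exposure and gain fixed.

    -ss is shutter speed in microseconds, -ISO fixes sensor gain, and
    -awb auto leaves white balance to the camera (as in this run).
    --raw appends the 10-bit Bayer data to the JPEG's metadata.
    """
    cmd = ["raspistill", "-o", path,
           "-ss", str(shutter_us),
           "-ISO", str(iso),
           "-awb", "auto"]
    if raw:
        cmd.append("--raw")
    return cmd

print(" ".join(capture_cmd("grid_000.jpg", raw=True)))
```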
I stitched everything in [Hugin][] on my desktop PC. I would like to
eventually make stitching possible on the Raspberry Pi itself, which
isn't *that* farfetched considering that I stitched my first panoramas
on a box that wasn't much more powerful than a Pi. I also had to
discard some of the images because, for whatever reason, Hugin's
optimization failed when they were present. However, being able to
look at Hugin's computed pitch, yaw, and roll values and see
everything lining up nicely with the motion of the steppers is a good
sign.

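
On that note, Hugin also ships command-line tools, so stitching on the
Pi itself could plausibly be scripted. The tool names below are real
Hugin utilities; the options are typical defaults rather than a tuned
pipeline, and the helper is just a sketch:

```python
def stitch_commands(jpegs, project="scan.pto"):
    """Sketch of a headless Hugin pipeline as a list of argv lists.

    pto_gen creates the project from the images, cpfind finds control
    points (--multirow suits a grid scan), autooptimiser solves for
    yaw/pitch/roll and output projection, and nona remaps each image
    into layers ready for blending (e.g. with enblend).
    """
    return [
        ["pto_gen", "-o", project] + list(jpegs),
        ["cpfind", "--multirow", "-o", project, project],
        ["autooptimiser", "-a", "-l", "-s", "-o", project, project],
        ["nona", "-m", "TIFF_m", "-o", "remapped_", project],
    ]

for cmd in stitch_commands(["grid_000.jpg", "grid_001.jpg"]):
    print(" ".join(cmd))
```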
The first results look decent, but fuzzy - about what $10 optics are
prone to produce:

[![](http://i.imgur.com/zwIJpFn.jpg){width=100%}](http://i.imgur.com/zwIJpFn.jpg)

More posts will follow soon on this!

[cs6475]: https://www.omscs.gatech.edu/cs-6475-computational-photography
[ArduCam]: http://www.arducam.com/camera-modules/raspberrypi-camera/
[hive13]: http://hive13.org/
[makezine]: http://makezine.com/projects/high-resolution-panorama-photography-rig/
[scraptopower]: http://www.scraptopower.co.uk/Raspberry-Pi/raspberry-pi-diy-pan-tilt-plans
[steppers]: https://www.amazon.com/Elegoo-28BYJ-48-ULN2003-Stepper-Arduino/dp/B01CP18J4A
[raspi-spy]: http://www.raspberrypi-spy.co.uk/2012/07/stepper-motor-control-in-python/
[forum-raw-images]: https://www.raspberrypi.org/forums/viewtopic.php?p=357138
[raspistill]: https://www.raspberrypi.org/documentation/raspbian/applications/camera.md
[RPi-Cam-Web-Interface]: http://elinux.org/RPi-Cam-Web-Interface
[25mm-lens]: https://www.amazon.com/gp/product/B00N3ZPTE6
[Hugin]: http://wiki.panotools.org/Hugin