Wednesday 21 January 2015

Exploring the NASA-GIBS Service with Leaflet and Shiny

NASA publishes some excellent imagery and gridded data in near real time through the Global Imagery Browse Services (GIBS).  I want to explore this service, and in the process develop my proficiency with Leaflet and Shiny.

RStudio has released a package that binds Leaflet to a Shiny page, built on their htmlwidgets framework.  Using this package, it is trivial to write a Shiny app that contains a Leaflet map.  To begin, we need to install the leaflet package from GitHub:
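A minimal sketch of such an app; the rstudio/leaflet repository name and the starting view are assumptions rather than code from the original post:

# Install the Shiny binding for Leaflet from GitHub (repository name assumed)
# devtools::install_github("rstudio/leaflet")

library(shiny)
library(leaflet)

ui <- fluidPage(
  leafletOutput("map", height = 600)
)

server <- function(input, output, session) {
  output$map <- renderLeaflet({
    leaflet() %>%
      addTiles() %>%                      # OpenStreetMap tiles by default
      setView(lng = -114.0, lat = 51.0, zoom = 5)
  })
}

shinyApp(ui, server)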


Shown above is a basic Shiny program to display a slippy map using OpenStreetMap tiles; the live example can be explored here.  So how hard is it to get those GIBS layers in there?  Looking at the NASA Earthdata map viewer, we can note some issues:
  • most layers have a date component to the URI
  • not all layers have the same available date range
  • not all layers are available in all projections
  • some layers are intended to be overlays

Sticking with Web Mercator (EPSG:3857), the first task is to sort out all the available layers. Here is some code to read the XML capabilities file and parse out the useful bits:
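A sketch of that parsing step using the xml2 package; the capabilities URL and the element paths are assumptions based on the WMTS standard, not code from the original post:

library(xml2)

# GetCapabilities endpoint for the EPSG:3857 (Web Mercator) tile set
# (URL is an assumption; check the GIBS documentation for the current one)
cap_url <- paste0("https://gibs.earthdata.nasa.gov/wmts/epsg3857/best/",
                  "wmts.cgi?SERVICE=WMTS&REQUEST=GetCapabilities")

doc <- read_xml(cap_url)
xml_ns_strip(doc)                  # drop namespaces to keep the XPath simple

layer_nodes <- xml_find_all(doc, ".//Contents/Layer")

layers <- data.frame(
  id     = xml_text(xml_find_first(layer_nodes, "./Identifier")),
  format = xml_text(xml_find_first(layer_nodes, "./Format")),
  matrix = xml_text(xml_find_first(layer_nodes, "./TileMatrixSetLink/TileMatrixSet")),
  dates  = xml_text(xml_find_first(layer_nodes, "./Dimension/Value")),  # NA for static layers
  stringsAsFactors = FALSE
)

# jpeg layers are treated as base layers, png layers as overlays
layers$overlay <- grepl("png", layers$format)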
It appears that the file type (png or jpeg) is the key to determining whether a layer is a base layer or an overlay. The other bits that need to be pulled out are the maximum zoom factor and the available date ranges. Using these fields, we can specify the layer(s) we want to show with the following Shiny code segments:
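A sketch of those segments, hard-coding one daily base layer plus the coastlines overlay; the layer identifiers, TileMatrixSet name, URL template, and start date are assumptions and would normally come from the parsed capabilities:

# ui fragment: a date picker limited to the available range
dateInput("date", "Imagery date",
          value = Sys.Date() - 1,
          min   = as.Date("2012-05-08"),
          max   = Sys.Date() - 1)

# server fragment: build GIBS tile URLs for the selected date
gibs_url <- function(layer, date, matrix, ext) {
  paste0("https://gibs.earthdata.nasa.gov/wmts/epsg3857/best/",
         layer, "/default/", format(date, "%Y-%m-%d"), "/",
         matrix, "/{z}/{y}/{x}.", ext)
}

output$map <- renderLeaflet({
  leaflet() %>%
    addTiles(urlTemplate = gibs_url("MODIS_Terra_CorrectedReflectance_TrueColor",
                                    input$date, "GoogleMapsCompatible_Level9", "jpg"),
             options = tileOptions(maxZoom = 9)) %>%
    addTiles(urlTemplate = gibs_url("Coastlines",
                                    input$date, "GoogleMapsCompatible_Level9", "png"),
             options = tileOptions(maxZoom = 9)) %>%
    setView(lng = 0, lat = 40, zoom = 3)
})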
Some hard-coding was done here to select the coastlines overlay, and the calendar range was set by inspecting the current dataset. Here is a live version of the application.

The source code for this application is available on GitHub:

Sunday 8 September 2013

Some successful flying

This weekend we got some encouraging flight results with the hexacopter.  After making some modifications to the camera mount and spending some time building confidence with the flight simulator, we were hoping to get some usable photos and maybe even work up the courage to try a flight with the autopilot in control.

Site:  Carstairs farm
Conditions:  Sunny, light breeze from the west.
Camera:  Canon Powershot A2300, shutter speed set to 1/2000, ISO 100
GPS:  uBlox LEA-6H

The first flight went well, with the exception of the landing, which was done with only five propellers.  Surprisingly, there was little damage to the copter (one arm needed to be replaced, no props broke).

The photos are not terribly interesting, unless you like looking at targets on the lawn.  Here is a close-up of the resolution target from an altitude of roughly 15 m:
On the second flight we tried the autopilot, with a modest program of four waypoints.  The results were not that good, likely because I entered a radius of 2 m for each waypoint, which is too tight for the system to lock onto.
The photos have some more interesting subjects, but we had to curtail the flight since the autopilot was not cooperating.  Looking at the flight logs, it's clear that the waypoint radius was indeed too small.


The third flight didn't really have any goals, other than to do some more flying.  I suppose if we had been more ambitious we could have reprogrammed the waypoints and tried the auto mode again.   But instead we just let the autopilot fly in loiter mode, into the trees.
oops
The photos are more of the same.

Some concluding thoughts:
  • I think the camera mount is providing reasonable results given the quality of the camera.  It would be nice to improve the CHDK script so that shutter speed selection is handled automatically.
  • The next flight with the autopilot needs a better program; we need to spend some time understanding the options and their impact on the system, specifically the waypoint radius and delay values.
  • The flight simulator is a great tool for improving pilot confidence, but more practice is definitely required, particularly for landings.

Tuesday 27 August 2013

FTDI Board? We don't need no stinking FTDI board!

Connecting to GPS receivers to test, configure and log data seems to be a daily thing this month.  The way we have been doing it is to connect from the computer's USB port to the UART on the GPS using the handy Sparkfun FTDI breakout board.


For example, we can connect to the uBlox LEA-6H using the uCenter software to set the configuration, following these instructions.

But the goal is to get the GPS working with the APM, which has plenty of UARTs.  So, how about we just talk directly to the GPS using the APM?  Easy:
1)  Install the patched version of the Arduino development environment, following the instructions here.
2)  Connect to the APM via USB.
3)  Open the Arduino IDE and enter the following code:
/*
  Mega multiple serial test
  Receives from the main serial port, sends to the others.
  Receives from serial port 1, sends to the main serial (Serial 0).
  This example works only on the Arduino Mega.
  The circuit:
  * Any serial device attached to Serial port 1
  * Serial monitor open on Serial port 0
  created 30 Dec. 2008
  modified 20 May 2012
  by Tom Igoe & Jed Roach
  This example code is in the public domain.
*/
void setup() {
  // initialize both serial ports:
  Serial.begin(38400);
  Serial1.begin(38400);
}
void loop() {
  // read from port 1, send to port 0:
  if (Serial1.available()) {
    int inByte = Serial1.read();
    Serial.write(inByte); 
  }
  // read from port 0, send to port 1:
  if (Serial.available()) {
    int inByte = Serial.read();
    Serial1.write(inByte); 
  }
}
Note the baud rates are set to 38.4K, which is the standard for the uBlox connection to the APM.
4)  Click the 'Upload' button to compile the code and run it on the APM.
5)  Now open uCenter and connect to the appropriate COM port at 38.4K baud.
6)  Do your GPS configuring.

Note that you will need to reinstall the firmware on the APM using Mission Planner when you are done.

Tuesday 26 February 2013

MODIS Image Processing Stream


Inputs:

  • MODIS mosaics (eg Arctic, Antarctic)
  • Data Flow?
  • Image granules from Rapidfire
Products:

  • Time lapse movies
  • Custom images/mosaics
  • Cloud cover masks
  • Motion vectors
Components:

  • Data retriever, per collection (may require metadata collector)
  • MetaDatabase & Metadata producer
  • Cloud mask tool
  • Movie composer
  • Optical flow detector/filter
Currently:

  • Data retriever is a batch file/php for the arctic mosaic, and Watchdog/WatchFTP for granules. Good enough for now, but we need to have a tool to decide if new data should be downloaded and how.
  • Metadata producer/Database - nothing
  • Cloud mask tool - I have an R script that will process arctic mosaic tiles and create a fair result.  For granules, the MOD35_L2 product is available.
  • Movie composer - can be done with batch scripts of varying levels of sophistication using ImageMagick, MapServer, GDAL, and ffmpeg (also sed); a minimal sketch follows this list.  It would be useful to have a web front end for specifying and managing requests and products.
  • Optical flow detector/filter - combination of OpenCV and R at this point, could be batched but needs work.
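As a rough illustration of the movie-composer step, here is a minimal R sketch that assembles a time-lapse from a directory of PNG frames by shelling out to ffmpeg; the directory layout, frame naming, and frame rate are assumptions:

# Minimal movie-composer sketch (assumes one PNG frame per day in 'frames/',
# named so that alphabetical order equals time order, and ffmpeg on the PATH)
compose_timelapse <- function(frame_dir = "frames",
                              out_file  = "timelapse.mp4",
                              fps       = 10) {
  frames <- sort(list.files(frame_dir, pattern = "\\.png$", full.names = TRUE))
  if (length(frames) == 0) stop("no frames found in ", frame_dir)

  # the glob input pattern requires an ffmpeg build with glob support
  cmd <- sprintf(
    "ffmpeg -y -framerate %d -pattern_type glob -i '%s/*.png' -pix_fmt yuv420p %s",
    fps, frame_dir, out_file)
  system(cmd)
}

compose_timelapse()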

Friday 22 February 2013

Canon A2300 Mount

Here is a simple mount for the Canon A2300 camera, constructed from an aluminum bracket salvaged from an old scanner plus a piece of angle bracket.  Weight is 186g, including the bracket, fixing bolt, camera and release cable.

I used some foam weatherstrip to pad the part that touches the camera, but I doubt it will reduce any vibrations between the bracket and the camera. 

Still to be addressed:  the connection to the airframe.  It would be interesting to use some sort of shock-absorbing link there; I'm not sure yet what would work...

Canon Camera Shutter Release Cable

Using a Canon handheld camera with the APM 2.5 looks to be an easy, low-cost method of collecting some aerial imagery.  We have a couple of suitable cameras to try out:

Canon Powershot A2300:  16MP, $100 at Walmart
Canon Powershot SD800IS:  7.1MP, $25 off Craigslist

Both cameras are supported by the Canon Hack Development Kit (CHDK) which adds two valuable features to these low cost devices:  scripting and a remote shutter release via the USB port.

The remote shutter release simply monitors the presence of a voltage on the USB power pin.  Following the instructions here, I made up a test cable using a battery and a momentary contact switch.  Testing with various voltages, it seems to need at least 4.1V to reliably trigger the shutter.
CHDK Remote Cable Test Rig W/4.8V Battery


To connect to the APM, we'll need to use one of the digital I/O pins (3.3V) as a trigger, with voltage supplied by the 5V supply pin.  Testing with the battery indicates that the current drain is negligible, so here is the circuit we came up with, based on a 4049 hex inverting buffer:
CHDK Camera Circuit
Here's the completed circuit; it still needs some strain relief and heatshrink.
A small ceramic capacitor is bridged across 5V-GND as a buffer.  On initial testing, it works!  Next steps are to test with the APM and fine-tune the scripting on the camera and autopilot.
Many thanks to my friend paulb for the expert advice, parts, soldering, and correction of errors.  I did, however, supply the beer.

Here is a discussion and some code for scripting the camera.