Today our guest blogger, David Garrison, continues his series on MathWorks' involvement in the 2017 solar eclipse.
- Part 1: The Citizen CATE Experiment
- Part 2: Training the Volunteers
- Part 3: Rehearsing for the Eclipse
- Part 4: Imaging the Eclipse
Here is Part 3 of the series.
In Part 1 of this series, I discussed MathWorks' participation in the Citizen CATE Experiment - a citizen science project to image the 2017 solar eclipse. In Part 2, I described the volunteers, the equipment they will be using, and how they are being trained.
In this post, I will be joined by Andrei Ursache - another of our MathWorks Citizen CATE volunteers. Andrei is an Application Engineer who works with Image Acquisition Toolbox and Image Processing Toolbox. I will start by giving a brief description of what the volunteer teams are doing to prepare for the eclipse. Andrei will then discuss the MATLAB Solar Eclipse App that will be used by the volunteers and what will happen during and after totality.
The volunteer teams have been working hard to prepare for the eclipse. Each team has been asked to practice capturing images for both solar and lunar observation. The purpose of these practice sessions is to become familiar with the equipment, the software used to capture the images, and the observation protocol.
The complete hardware and software setup consists of the following:
- Windows 10 laptop
- 5 megapixel Point Grey (FLIR) USB3 monochrome camera (Grasshopper3 GS3-U3-51S5M with a 12-bit Sony IMX250 sensor and 2448 x 2048 resolution)
- Daystar 80 mm diameter, 500 mm focal length refractor telescope
- Celestron CG4 equatorial mount with motor drives for tracking
- Arduino microcontroller
- GPS module with antenna
- MATLAB Solar Eclipse App
The software is described in the sections below.
Hi, this is Andrei. As a volunteer for the Citizen CATE Experiment, I wrote the MATLAB Solar Eclipse App which will be used by the volunteer teams to capture the August 2017 total solar eclipse at each observation site on the totality path.
The software is a MATLAB app and uses functionality from Image Acquisition Toolbox, Image Processing Toolbox, and Parallel Computing Toolbox. A standalone executable application, built with MATLAB Compiler, has been installed on each of the laptop computers used by the volunteer teams. The app's user interface is workflow-oriented, following the solar observation protocol put together by the Citizen CATE scientists. The graphical user interface consists of multiple tabs, each focused on a specific task, such as Alignment, Focus, Calibration, and Totality. As an example, here is a screenshot of the Focus tab, which is used to fine-tune the telescope focus:
The app displays a live view of the video stream captured by the telescope camera, which is connected to the laptop via a USB3 cable. The live view will allow the volunteers to zoom in on a region of interest in order to fine-tune the telescope focus. Image Acquisition Toolbox videoinput functionality will control the camera, transfer the acquired images into the MATLAB workspace, and provide a preview. A live image histogram, calculated with the imhist function in Image Processing Toolbox, will provide visual feedback for optimizing the exposure time. In the histogram, the x-axis shows the pixel intensity values (0-65535 for 16-bit scaled data) and the y-axis shows the number of pixels at each intensity value. The histogram reveals whether there are overexposed pixels, allowing the volunteers to choose an appropriate exposure time. A focus quality indicator and a line profile will provide additional visual feedback for fine-tuning the telescope focus.
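As a rough sketch of this exposure feedback (not the app's actual code), checking a single frame for saturation might look like the following. Here the saturation threshold of 65520 is an illustrative choice for 16-bit scaled data, and `v` is assumed to be a configured videoinput object for the camera:

```matlab
% Sketch of the exposure feedback, assuming v is a videoinput object
img = getsnapshot(v);        % grab one frame from the camera
imhist(img);                 % histogram of pixel intensities
% Flag pixels at or near the sensor's saturation level
% (65520 is an illustrative threshold for 16-bit scaled data)
saturated = nnz(img >= 65520) / numel(img);
fprintf('Saturated pixels: %.2f%%\n', 100*saturated);
```

If the saturated fraction is more than a handful of pixels, the exposure time would be reduced before the next check.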
During the 2-3 minutes of totality, the camera will capture a stream of images of the Sun's corona. Because the corona's brightness varies significantly as you move away from the Sun's surface, high dynamic range (HDR) images are required to cover its full brightness range. The camera's exposure time will be controlled by a variable-pulse-width TTL signal output by the Arduino, as shown in the picture below. The eight exposure times are 0.4, 1.3, 4, 13, 40, 130, 400, and 1300 milliseconds. This multi-exposure sequence will be repeated continuously during totality, and each sequence will then be combined into a single HDR image. For 2.5 minutes of totality, each volunteer site will create about 75 HDR images.
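A quick back-of-the-envelope check of that frame count:

```matlab
exposures_ms = [0.4 1.3 4 13 40 130 400 1300];  % the eight exposure times
cycle_s = sum(exposures_ms)/1000;               % about 1.89 s of exposure per cycle
% Adding camera readout and trigger overhead brings each cycle to
% roughly 2 s, so 2.5 minutes (150 s) of totality gives about
% 150/2 = 75 complete sequences, i.e. about 75 HDR images.
```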
Capturing this hardware-triggered multi-exposure sequence might be a complicated programming exercise in other languages, but in MATLAB it can be achieved with a few lines of code:
v = videoinput('pointgrey', 1, 'F7_Mono16_2448x2048_Mode7');
triggerconfig(v, 'hardware', 'risingEdge', 'externalTriggerMode1-Source0');
start(v)
frames = getdata(v);
montage(frames(:,:,:,1:8), 'Size', [2 4])
Here is a series of images from a recent lunar observation practice session.
Acquired image frames are saved to disk as TIF format files using the imwrite function:
for ii = 1:size(frames,4)
    filename = sprintf('frame_%d.tif', ii);
    imwrite(frames(:,:,:,ii), filename, 'tiff');
end
In the app, we do not want the image saving operation to delay the execution of other code. Saving to disk is done in a parallel worker using the parfeval function from Parallel Computing Toolbox. For an example of how to simultaneously acquire images and save them to disk, see this MATLAB Answers post.
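A minimal sketch of the idea looks like the following; the app's actual scheme is more elaborate, as described in the linked MATLAB Answers post:

```matlab
% Sketch of asynchronous saving with parfeval (not the app's actual code).
% Assumes frames is a 4-D array of acquired image frames.
pool = gcp();    % get (or start) a parallel pool
for ii = 1:size(frames, 4)
    filename = sprintf('frame_%d.tif', ii);
    % imwrite runs on a worker (zero outputs requested), so the
    % acquisition loop is not blocked waiting on disk I/O
    parfeval(pool, @imwrite, 0, frames(:,:,:,ii), filename, 'tiff');
end
```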
While acquiring the images, the software also logs the frame timestamps. In order to synchronize the timestamps from different observation sites, the software gets GPS time information from the GPS module. The GPS module transmits NMEA strings (lines of ASCII text) to the computer via a virtual COM port, as in the example below. The NMEA sentences contain information such as the GPS date and time, latitude, and longitude.
$GNZDA,180755.000,20,07,2017,,*47
$GPGGA,180755.000,4217.9848,N,07121.0476,W,1,18,0.6,83.6,M,-33.8,M,,0000*5A
$GNRMC,180755.000,A,4217.9848,N,07121.0476,W,0.00,0.03,200717,,,A*61
We make use of the MATLAB serial function to communicate with and transfer data from the GPS module. For an example of how to log GPS NMEA strings to a text file, see this MATLAB Answers post.
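Reading and parsing a ZDA time sentence could be sketched as follows. The port name 'COM3' and the baud rate are illustrative assumptions, not the app's actual settings:

```matlab
% Illustrative sketch; 'COM3' and the baud rate are assumptions
s = serial('COM3', 'BaudRate', 9600);
fopen(s);
sentence = fgetl(s);     % e.g. '$GNZDA,180755.000,20,07,2017,,*47'
fclose(s);
delete(s);
fields = strsplit(sentence, ',');
if strcmp(fields{1}, '$GNZDA')
    utc = fields{2};     % UTC time as hhmmss.sss
    fprintf('GPS UTC time %s:%s:%s on %s/%s/%s\n', ...
        utc(1:2), utc(3:4), utc(5:end), fields{3}, fields{4}, fields{5});
end
```

A production version would also verify the checksum after the `*` before trusting the sentence.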
After the eclipse, I will post the app to the MATLAB File Exchange for anyone who wants to see how it all works.
After totality ends, the multiple-exposure images taken during the eclipse will be processed to create a set of high dynamic range (HDR) images, as described above. We use the makehdr function in Image Processing Toolbox to combine each multi-exposure sequence into a single HDR image, and the tonemap function to compress the HDR image for viewing on a computer screen, which has a much lower dynamic range.
filenames = cellstr("frame_" + (1:8) + ".tif");
exposures = [0.4 1.3 4 13 40 130 400 1300];
hdr = makehdr(filenames, 'RelativeExposure', exposures/exposures(3), ...
    'MinimumLimit', 1000, 'MaximumLimit', 54000);
hdr8 = tonemap(hdr);
imshow(hdr8)
Here are two HDR images: the first was obtained by combining a cycle of 8 multiple-exposure images from a lunar practice session, and the second was created during the March 2016 eclipse in Indonesia.
After the eclipse, all the data taken by the volunteer teams will be uploaded to a server at the National Solar Observatory. The HDR images from all volunteer sites will then be put together to create a 90-minute video of totality.
If you have an interest in amateur astronomy or have any questions or comments about the Citizen CATE Experiment, please let us know here.
That's all for now. In the last post in this series, I'll tell you about the day of the eclipse. I'll describe what happened during totality and show you some images of the corona taken on that day.
One final note. The project sponsors and volunteers have prepared very carefully for the big event. Working with the volunteer teams, they've tried to account for all contingencies. However, there is one variable that we cannot control - the weather. We just have to hope that Mother Nature cooperates on that day.