Hurricane Irene, which hit the East Coast of the U.S. over the weekend of August 27–28, 2011, produced severe flooding in some inland areas, such as the state of Vermont (its worst natural disaster since 1927), while leaving other areas, such as the Connecticut River valley of Massachusetts, relatively unscathed. The Berkshire Mountains of western Massachusetts received between 6 and 8 inches of rain from Irene. In particular, the picturesque town of Shelburne Falls on the Deerfield River, which averages 4.4 inches of rain for the entire month of August, had to be evacuated due to flooding.
There are already lots of maps out there depicting Hurricane Irene and its effects, such as this New York Times map, but many of us would like to play with the data itself. Prompted by a question about local precipitation data, I located downloadable shapefiles from the National Weather Service's Advanced Hydrologic Prediction Service and put them together with storm-track information from the National Hurricane Center to produce this map:
The precipitation data from the National Weather Service is provided as point shapefiles on a 4.3 km grid (rather than as rasters, possibly because this lets them leave out points with zero precipitation). The files with the shortest time period are for the "hydrologic day", ending at 12 noon GMT. Two days of data were merged with ArcToolbox > Data Management Tools > General > Merge, and then the precipitation from coincident locations (sharing unique IDs and coordinates) was summed using ArcToolbox > Data Management Tools > Generalization > Dissolve (note that this short-changes North Carolina totals). The hurricane data is provided as a Google Earth KML document, which was converted to a table using GPS Visualizer and imported. The hurricane symbols are built from Character Marker Symbols using ESRI's Climate and Precipitation and Geometric Symbols fonts.
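The merge-and-dissolve step is, in essence, a group-by-sum on the grid-point IDs. A minimal pure-Python sketch of that logic, with invented IDs and values standing in for the NWS records:

```python
from collections import defaultdict

# Two hydrologic days of (grid-point ID, precipitation in inches),
# as they might come from the merged point shapefiles.
day1 = [("p001", 2.5), ("p002", 0.0), ("p003", 1.2)]
day2 = [("p001", 3.1), ("p003", 4.0), ("p004", 0.7)]

# "Dissolve" on the shared ID, summing precipitation at coincident points.
totals = defaultdict(float)
for point_id, precip in day1 + day2:
    totals[point_id] += precip

print(sorted(totals.items()))
```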
Building Historical Maps for Cityscapes, An Online Discovery Tool for Urban and Cultural Studies
For the past year and a half, my department, Academic Technology Services, has been working on a mapping project that we call Cityscapes. It's a “Web 2.0” tool to allow students to collaborate in their studies of urban neighborhoods, where geography should be an organizing theme. Think of Google Maps, then think of groups of students adding their own location markers and decorating them with photos, videos, and blogs.
The two sites we've created so far can be seen here:
The Tokyo site is not completely open due to copyright considerations; if you would like an account, contact me.
My part of this project was preparing the historical maps that you see in the image below. This included not only georeferencing them but also turning them into properly positioned Google Maps tiles.
Adventures in Scripting ArcGIS, Part 1: The Tool Dialog
I have recently been writing my first script to handle a geoprocessing task related to our Cityscapes project, where we place georeferenced historical maps in a Google Maps framework that can be used in urban studies classes (more to come!). Since the beginning I have been converting the georeferenced maps into Google Maps tiles by hand, a process that takes a couple of hours. This is small compared to the georeferencing time, so even though it was a prime candidate for scripting I had been putting it off in order to complete more maps.
For about a year now I have been learning the scripting language Python, as it has become the new, open way to script ArcGIS (and also has many fans in other areas of scientific computing). As my summer Cityscapes efforts ended, and as I prepare for my Cityscapes presentation at the NEArc meeting on November 8, my education has finally shifted over to using Python with ArcGIS (9.3). The documentation I have been using includes the following items:
- A nice, short overview from a course at San Francisco State University;
- A longer overview from ESRI, Writing Geoprocessing Scripts with ArcGIS;
- The ArcGIS Desktop Help Reference, in particular Writing Geoprocessing Scripts and The Geoprocessor Object.
- A chart of the ArcGIS Geoprocessing Model.
The script is being implemented through ArcToolbox, which makes it easier to provide script parameters, such as the input raster, through a GUI dialog. But because I want it to be usable by others, I have to write it in a general way, which means displaying information about the input raster, suggesting reasonable values for subsequent inputs, and checking for illegal values. Hence I have been learning about the ToolValidator class, apparently a new feature in version 9.3, and I've been struggling to understand some undocumented characteristics. Some of my observations follow.
The ToolValidator Class and Toolbox Dialogs
First off, the ToolValidator class is only very loosely connected to its Tool dialog. In particular, it is not instantiated just once when the dialog opens, which would allow state to be preserved within the class as the user changes the dialog; instead it is instantiated every time the dialog parameters change, before each of its methods is called. Hence the dialog parameters themselves are the only place to store information as the user fills in the fields and they are validated.
Accessing Tool Parameters
The ToolValidator template provided for each script accesses the dialog parameters in its initialization method:
import arcgisscripting as ARC
self.GP = ARC.create(9.3)
self.params = self.GP.GetParameterInfo()
The parameter values themselves are accessed through, for example,
self.params[0].Value, but be forewarned that sometimes the property
Value is a Value object, rather than just a string of characters (the poor documentation on this point is discussed here). This seems to be the case when the parameter refers to a file or other data structure, and if you want its name as a character string you must either reference the appropriate property of the Value object or use the expression
str( self.params[0].Value )
The latter conversion will occur automatically in contexts requiring a string, such as concatenation.
The initializeParameters() Method
When a dialog is first opened, the ToolValidator method initializeParameters() is called. It can be used to insert initial values into the parameter fields, and also to set some characteristics of those fields in the dialog. For example, to disable a parameter field (make it uneditable), you can include code such as:
self.params[0].Enabled = False
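The same idea can be exercised outside ArcGIS with stand-in parameter objects; everything below (the FakeParam class, the field indices, the default value of 256) is hypothetical, but the Value/Enabled mechanics mirror the template's:

```python
class FakeParam:
    """Stand-in for a geoprocessor parameter object (fields are hypothetical)."""
    def __init__(self):
        self.Value = None
        self.Enabled = True

def initializeParameters(params):
    # Suggest a reasonable default for the second field...
    params[1].Value = 256          # e.g. a default tile size in pixels
    # ...and gray out a field the script fills in by itself.
    params[2].Enabled = False

params = [FakeParam() for _ in range(3)]
initializeParameters(params)
print(params[1].Value, params[2].Enabled)  # 256 False
```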
When a dialog field has been modified (which may include when the dialog first opens), verification that a parameter value is usable occurs at several places:
- If the user enters an inappropriate value into a dialog, for example a letter when a number is expected or a nonexistent folder when one is required, the Toolbox provides some very basic validation based on the field type, beeping and preventing one from "tabbing" out of the field. The user can still click out of it; in the first example the field is reset, and in the second an error message is displayed (probably due to steps 3 & 4, though).
- The ToolValidator method updateParameters() comes next, and can be used to verify many characteristics of the input values and avoid processing bad ones. It's also possible to correct the values directly, but in most cases it's better to send a message to the user as described in step 4 and let them make the correction.
- Next, the Geoprocessor performs a basic validation, I guess so that it can catch script errors. If any are found, it sets error messages that will display unless cleared in the next step.
- Finally, the ToolValidator method updateMessages() is called, where the messages returned from step 3 can be inspected and, if you want, replaced with your own. These messages can direct users to fix their errors.
The result of the separation of steps 2 and 4 is that sometimes the same tests must take place, first to avoid processing bad data and second to send a message about it.
The updateParameters() Method
The ToolValidator method updateParameters() can be used to process new parameter values (e.g. to set related parameters). No information is provided about which of the dialog fields was modified, though, so you must check each parameter of interest to see if it is the one that needs to be validated:
if not self.params[0].HasBeenValidated :
elif not self.params[1].HasBeenValidated :
Once you find the modified parameter, you can ignore the update if it's due to the script itself (e.g. initialization) rather than being altered by the user:
if not self.params[0].altered : return
Before processing an input dataset, you should test for its existence:
if not self.GP.Exists( str( self.params[0].Value ) ) : return
If the data isn't actually there, this method will raise an error, but it won't be reported until later, hence the need to return at this point.
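Putting the checks above together, the control flow can be sketched with stand-in objects in place of the geoprocessor (FakeParam is hypothetical, and os.path.exists substitutes for self.GP.Exists):

```python
import os

class FakeParam:
    """Stand-in for a geoprocessor parameter (fields mirror the real ones)."""
    def __init__(self, value=None):
        self.Value = value
        self.HasBeenValidated = True   # reset by the dialog when a field changes
        self.altered = False           # set once the user has edited the field

def updateParameters(params):
    # Find the parameter that changed: the one not yet validated.
    if not params[0].HasBeenValidated:
        # Ignore programmatic changes such as initialization.
        if not params[0].altered:
            return
        # Bail out if the input dataset doesn't actually exist yet;
        # os.path.exists stands in for self.GP.Exists here.
        if not os.path.exists(str(params[0].Value)):
            return
        # ...derive dependent parameters from params[0] here...
        params[1].Value = str(params[0].Value) + "_tiles"

params = [FakeParam("no_such_raster"), FakeParam()]
params[0].HasBeenValidated = False
params[0].altered = True
updateParameters(params)
print(params[1].Value)  # still None: the input raster doesn't exist
```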
The updateMessages() Method
The ToolValidator method updateMessages() can be used to send the user a message about problems with the value in a dialog field. The result is a red ⊗ next to the input field, and clicking on it will display the message. Once again no information is provided about which of the dialog fields has the problem, so you must check each parameter of interest (even if this was done previously), e.g.:
if self.params[0].Value and self.params[0].Value <= 0 :
    self.params[0].SetErrorMessage("This parameter must be positive.")
Providing basic documentation for the tool and its parameters is fairly easy, once you know that the simplest way to create and modify it is with the Documentation Editor. The ArcGIS documentation provides two options for opening the Editor, the first of which doesn't work; the second is available by opening ArcCatalog and choosing View > Toolbars > Metadata to bring up that toolbar. Then click on the script, click on its Metadata tab, and finally, in the Metadata toolbar, click on the button Edit metadata.
When the Documentation Editor opens, the left side lists the various types of information you can provide. The main tool description, visible when it first opens, is added in the section General Information, in the item Abstract. The various parameters are described in the section Help in the item Parameters, and they will all be listed to the right under Contents. Open the one you want to describe, click on Dialog Reference, click on the [A] Paragraph button, and then click in the box to the right to add text.
Warning: if you change a parameter name after creating a description for it, the latter will be lost!
A Close Encounter with Mars
The planet Mars is currently enjoying its biennial dalliance with the Earth. Marching through the constellation Cancer, the Red Planet's nearness and opposition to the Sun produce an unusually bright spessartine gem against the black velvet sky. It now hangs above the eastern horizon in the evening, where it will remain for several months.
Mars has been the subject of intense human scrutiny for centuries, and a battery of NASA probes have visited the planet since the dawn of the space age. The first wave began in the 1960s with the Mariner flybys and orbiters, and ended in the early 1980s with the Viking orbiters and landers. Starting in the late 1990s, a new series of sophisticated craft invaded Mars. Probably most famous are the three robotic rovers deployed to directly explore and analyze its surface (Sojourner, Spirit, and Opportunity). Of more interest for geography are the three orbiting satellites that have remotely imaged the entirety of Mars, returning unprecedented amounts of information and transforming our understanding of the planet.
The Mars Global Surveyor, operating from 1999 to 2006, carried a number of scientific instruments, including a high-resolution camera with a horizontal resolution as small as 50 cm, a laser altimeter with a vertical resolution of 30 cm, a thermal emission spectrometer, and a magnetometer, and provided evidence for relatively recent water flow!
Mars Odyssey began studying the material composition of the planet in 2002, using another thermal emission imaging system and a gamma ray spectrometer, and confirmed the existence of subsurface frozen water.
The Mars Reconnaissance Orbiter reached Mars in 2006, and includes a telescopic camera with a resolution of around half a meter per pixel. Improved elevation measurements are also being calculated from its stereo imaging.
Currently the highest resolution topographic data is only available in certain areas. There is so much data that it isn't being processed in its entirety, but only where scientists have been particularly interested!
The best global topographic data currently available comes from the Mars Global Surveyor's Mars Orbiter Laser Altimeter (MOLA), which covers the planet in 1/128th-degree (28 arc second) measurements (and four times more detailed in the polar regions). For example, this "slice" of the planet extends from the north pole on the left to the south pole on the right along the prime meridian, revealing the generally higher elevations in the southern hemisphere:
One representation of MOLA data as a Digital Elevation Model (DEM) is in the Planetary Data System (PDS) IMG raster format, which can be viewed with the NASAView software:
PDS/IMG is somewhat obscure; NASAView can save it as GIF or JPEG format, which is one way to quickly use it in other tools, but a lot of information is lost. Specifically, Martian elevation varies from -8,068 m to +21,134 m (a larger range than on Earth!), so the data is stored as 16-bit signed integers, while GIF and JPEG can only handle 8-bit unsigned integers (grayscale), forcing the output to be scaled and offset into values from 0 to 255.
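The coarsening is easy to quantify: a sketch of the scale-and-offset (not NASAView's exact algorithm) shows that roughly 115 m of relief collapse into each 8-bit gray level:

```python
# Full Martian elevation range, stored as 16-bit signed integers.
lo, hi = -8068, 21134

def to_8bit(elev_m):
    """Scale and offset an elevation into an 8-bit gray value (0-255)."""
    return round((elev_m - lo) * 255 / (hi - lo))

print((hi - lo) / 255)           # about 114.5 m of elevation per gray level
print(to_8bit(20), to_8bit(60))  # 40 m apart, yet the same gray value
```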
Not surprisingly, the specialized image analysis program ENVI understands PDS/IMG as an external/generic format, which can be opened and resaved in another format such as ESRI's Grid for use with ArcGIS.
More generally, PDS/IMG is actually a pretty simple format: a file of uncompressed data with a separate human-readable label file (.lbl) describing its layout. The data can be purely sequential or interleaved (if it's color/multi-band). It's therefore the same as what ESRI's ArcGIS and ITT-VIS's ENVI call Band Sequential (.bsq), Band Interleaved by Pixel (.bip), or Band Interleaved by Line (.bil). (Note that these formats are identical for grayscale/single-band data.) The only difference, it appears, is that ArcGIS expects a "header" file (.hdr) with a different layout of information. So only the latter file needs to be created, and the data file can simply have its file extension changed. A basic file is described below, but there is also a detailed description of the .hdr format here:
The procedure to view the Mars data is as follows:
- The MOLA data is stored here:
As an example, the lowest-resolution topography can be studied by clicking on the link MEG004, downloading megt90n000cb.img and megt90n000cb.lbl, and renaming the former to megt90n000cb.bsq.
- Open the label file with a text editor and read its description. For these files, the data is sampled in geographic coordinates, at 0.25°/pixel, and its values are referenced to an estimate of the "areoid" (like sea level, but without the water!). The datum used is spherical, with a radius of 3396 km. With this information you can create an "areographic coordinate system". Note that the Mars standard is to take the Prime Meridian at Meridiani Planum (coincidence?), and to measure longitude from 0° to 360°, which puts the Prime Meridian at the far left of the typical Mars image. In ArcGIS this means assigning the prime meridian to -180°.
The necessary projection file (e.g. megt90n000cb.prj) will therefore look like this one line:
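Based on those parameters (a spherical datum of radius 3396 km, decimal degrees, and the prime meridian assigned to -180°), the line might look something like this sketch; the coordinate system, datum, and spheroid names here are illustrative, not required:

```
GEOGCS["GCS_Mars_Sphere",DATUM["D_Mars_Sphere",SPHEROID["Mars_Sphere",3396000.0,0.0]],PRIMEM["Reference_Meridian",-180.0],UNIT["Degree",0.0174532925199433]]
```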
- Using information from the label file, the header file (e.g. megt90n000cb.hdr) can be created with a text editor; each of its entries corresponds to one or more parameters in the label file, as described in this table:
Corresponding Descriptions of Raster Characteristics (.hdr entry, with the corresponding .lbl parameter(s))
- Number of rows of pixels in the image:
lines = 720
- Number of pixels per row (columns) in the image:
line_samples = 1440
- Number of bytes of header information to skip over to get to the image data; it is padded to the end of a row, so it is always an integer multiple of the number of bytes per row, record_bytes * (file_records - lines):
record_bytes = 2880, file_records = 720, lines = 720
- Number of bands in each pixel (e.g. 1 for grayscale & 3 for RGB):
bands = 1
- Number of bits in each band (e.g. 32 bits for four-byte values):
sample_bits = 16
- Pixel type:
sample_type = *_unsigned_integer, *_signed_integer, or *_real, where * represents a byte order (described next).
- Byte order:
sample_type = lsb_* or pc_* (least significant byte first = Intel), or msb_* or sun_* (most significant byte first = Motorola), where * represents a pixel type (described previously).
- Band layout:
band_storage_type = band_sequential, sample_interleaved, or line_interleaved; these are all the same when bands = 1.
- NoData value:
missing_constant = 16#FF7FFFFB#
- Width of an easting pixel in map units, = 1 / map_resolution:
map_resolution = 4.0 <PIX/DEG>
- Width of a northing pixel in map units, = 1 / map_resolution:
map_resolution = 4.0 <PIX/DEG>
- Map coordinate of the center of the first column, = westernmost_longitude + 1 / map_resolution / 2:
westernmost_longitude = 0 <DEG>, map_resolution = 4.0 <PIX/DEG>
- Map coordinate of the center of the first row, = maximum_latitude - 1 / map_resolution / 2:
maximum_latitude = 90 <DEG>, map_resolution = 4.0 <PIX/DEG>

This provides the same information as in a world file, so one isn't necessary for this raster.
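Assembled from the table, a header file for megt90n000cb.bsq might look something like this sketch; it assumes ESRI's raw-raster header keywords and big-endian signed 16-bit data, both of which should be checked against the actual label:

```
NROWS     720
NCOLS     1440
NBANDS    1
NBITS     16
PIXELTYPE SIGNEDINT
BYTEORDER M
LAYOUT    BSQ
SKIPBYTES 0
XDIM      0.25
YDIM      0.25
ULXMAP    0.125
ULYMAP    89.875
```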
- At this point you can preview the image in ArcCatalog! Optionally you can build statistics for the image, which will establish the maximum and minimum values and reduce the necessary color range, thereby enhancing the grayscale/color representation. In ArcCatalog, double-click on the data file (e.g. megt90n000cb.bsq) to open its Properties dialog, scroll down to Statistics, click on the button Options, and select the command to build statistics.
- Finally, using 3D Analyst, create an elevation profile of an interesting location, e.g. the gigantic extinct volcano Olympus Mons. Add the raster you just created to ArcGIS, open the Extensions… dialog, and turn on 3D Analyst. Then display the 3D Analyst toolbar, click on the button Interpolate Line, and create a polyline along the path where you want to see the profile, clicking once at each vertex and ending with a double-click. Then click on the button Create Profile Graph.
The resulting profile for Olympus Mons is below:
An alternative source of data is the Precision Experiment Data Records (PEDR), a point set (Digital Terrain Model/DTM) from which the raster DEM is derived:
As noted above, there are also some bits and pieces of newer, high-resolution Mars DTM available from the Mars Reconnaissance Orbiter, available here:
This data has an embedded header, which ArcGIS 9.3 is able to partially recognize, but it makes a few mistakes along the way. In particular:
- ArcMap understands that the data has an Equirectangular (Plate Carrée) projection, but it sets the linear unit incorrectly, and it misinterprets the value described as "CENTER_LATITUDE" as the latitude of origin, rather than as the standard parallel (the secant line of the projection cylinder). Creating a projection file that looks like this will fix both problems:
Note that these values will vary from image to image; be sure to check not only Central_Meridian, but also the spheroid radius.
- ArcMap doesn't properly interpret the NoData value, which is expressed as a hexadecimal value in the LBL file. To fix this, load the BSQ (renamed from IMG) file, and then export the layer as a TIFF file; in that dialog, in the field NoData as:, assign the following value:
- Finally, Arc somehow sees a maximum value that's far too large. The LBL file lists the actual maximum, and you can use that to change the symbology stretch to a proper minimum-maximum range.
The cleanest way to import this data is to use a utility like tail (standard on Unix, an add-on for Windows) to remove the label from the beginning of the data file; with properly constructed .hdr and .prj files as described above, the data is ready to use.
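The same trimming can be done in a few lines of Python instead of tail; RECORD_BYTES and LABEL_RECORDS are the PDS label keywords giving the embedded label's size, while the file names and sizes below are invented for the example:

```python
# Write a tiny stand-in IMG file: a two-record embedded "label"
# followed by raster bytes (the sizes here are invented).
record_bytes, label_records = 1024, 2   # RECORD_BYTES, LABEL_RECORDS from the label
label = b"PDS_VERSION_ID = PDS3".ljust(record_bytes * label_records)
raster = bytes(range(16))               # stand-in image data
with open("fake.img", "wb") as f:
    f.write(label + raster)

# Strip the embedded label, leaving only the raw raster for the .hdr/.prj.
skip = record_bytes * label_records
with open("fake.img", "rb") as src, open("fake.bsq", "wb") as dst:
    src.seek(skip)                      # jump past the label records
    dst.write(src.read())               # copy the rest verbatim

with open("fake.bsq", "rb") as f:
    assert f.read() == raster           # only the image data remains
```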
An alternative, more black-box approach is described at ISIS support.