Hurricane Irene, which hit the East Coast of the U.S. over the weekend of August 27–28, 2011, produced severe flooding in some inland areas, such as the state of Vermont (its worst natural disaster since 1927), while leaving other areas, such as the Connecticut River valley of Massachusetts, relatively unscathed. The Berkshire Mountains of western Massachusetts received between 6 and 8 inches of rain from Irene. In particular, the picturesque town of Shelburne Falls on the Deerfield River, which averages 4.4 inches of rain for the entire month of August, had to be evacuated due to flooding.
There are already lots of maps out there depicting Hurricane Irene and its effects, such as this New York Times map, but many of us would like to play with the data itself. Prompted by a question about local precipitation data, I located downloadable shapefiles from the National Weather Service's Advanced Hydrologic Prediction Service and put them together with storm track information from the National Hurricane Center to produce this map:
The precipitation data from the National Weather Service is provided as point shapefiles on a 4.3 km grid (rather than as rasters, possibly because this lets them leave out points with zero precipitation). The files with the shortest time period are for the "hydrologic day", ending at 12 noon GMT. Two days of data were merged with ArcToolbox > Data Management Tools > General > Merge, and then the precipitation values from coincident locations (sharing unique IDs and coordinates) were summed using ArcToolbox > Data Management Tools > Generalization > Dissolve (note that this short-changes the North Carolina totals). The hurricane track data is provided as a Google Earth KML document, which was converted to a table using GPS Visualizer and imported. The hurricane symbols are built from Character Marker Symbols using ESRI's Climate and Precipitation and Geometric Symbols fonts.
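(For those who would rather script these steps, here is a minimal sketch of the same merge-and-dissolve workflow using the arcgisscripting geoprocessor discussed elsewhere on this blog; the paths, file names, and the "Globvalue" field name are hypothetical stand-ins for the actual NWS shapefiles.)

    import arcgisscripting

    # Create a 9.3 geoprocessor; paths and names below are hypothetical
    gp = arcgisscripting.create(9.3)
    gp.workspace = "C:/data/irene"
    gp.overwriteoutput = True

    # Merge the two hydrologic days of precipitation points into one shapefile
    gp.Merge_management("precip_day1.shp;precip_day2.shp", "precip_merged.shp")

    # Dissolve on the grid-point ID, summing precipitation at coincident points
    # ("Globvalue" stands in for the actual precipitation field name)
    gp.Dissolve_management("precip_merged.shp", "precip_total.shp",
                           "Id", "Globvalue SUM")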
For the past year and a half, my department, Academic Technology Services, has been working on a mapping project that we call Cityscapes. It's a “Web 2.0” tool that allows students to collaborate in their studies of urban neighborhoods, with geography as the organizing theme. Think of Google Maps, then think of groups of students adding their own location markers and decorating them with photos, videos, and blogs.
The two sites we've created so far can be seen here:
http://www.ats.amherst.edu/tokyodemo
http://www.ats.amherst.edu/parisdemo
The Tokyo site is not completely open due to copyright considerations; if you would like an account, contact me.
My part of this project was preparing the historical maps that you see in the image below. This included not only georeferencing them, but also turning them into properly positioned Google Maps tiles.
I have recently been writing my first script to handle a geoprocessing task related to our Cityscapes project, where we place georeferenced historical maps in a Google Maps framework that can be used in urban studies classes (more to come!). Since the beginning I have been converting the georeferenced maps into Google Maps tiles by hand, a process that takes a couple of hours. This is small compared to the georeferencing time, so even though it was a prime candidate for scripting I had been putting it off in order to complete more maps.
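(For the curious, the arithmetic that positions those tiles is compact: Google Maps slices a spherical-Mercator world into a 2^z by 2^z grid at each zoom level z. Below is a minimal sketch of the standard lat/lon-to-tile-index calculation, not the Cityscapes script itself.)

    import math

    def latlon_to_tile(lat_deg, lon_deg, zoom):
        # Standard spherical-Mercator tile scheme: x increases eastward,
        # y increases southward from the northwest corner of the map.
        n = 2 ** zoom
        x = int((lon_deg + 180.0) / 360.0 * n)
        lat = math.radians(lat_deg)
        y = int((1.0 - math.log(math.tan(lat) + 1.0 / math.cos(lat)) / math.pi) / 2.0 * n)
        return x, y

    # For example, central Tokyo at zoom level 10:
    print(latlon_to_tile(35.68, 139.77, 10))   # -> (909, 403)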
For about a year now I have been learning the scripting language Python, as it has become the new, open way to script ArcGIS (and also has many fans in other areas of scientific computing). As my summer Cityscapes efforts ended, and as I prepare for my Cityscapes presentation at the NEArc meeting on November 8, my education has finally shifted over to using Python with ArcGIS (9.3). The documentation I have been using includes the following items:
The script is being implemented through ArcToolbox, which makes it easier to provide script parameters, such as the input raster, through a GUI dialog. But because I want it to be usable by others, I have to write it in a general way, which means displaying information about the input raster, suggesting reasonable values for subsequent inputs, and checking for illegal values. Hence I have been learning about the ToolValidator class, apparently a new feature in version 9.3, and I've been struggling to understand some undocumented characteristics. Some of my observations follow.
First off, the ToolValidator class is only very loosely connected to its tool dialog. In particular, it is not instantiated just once when the dialog opens, which would allow state to be preserved within the class as the user changes the dialog; instead, it is re-instantiated every time the dialog parameters are changed, before each of its methods is called. Hence the dialog parameters themselves are the only way to store information as the user fills in the fields and they are validated.
The ToolValidator template provided for each script accesses the dialog parameters in its initialization method:

    def __init__(self):
        # Create a version 9.3 geoprocessor and fetch the parameter objects
        import arcgisscripting as ARC
        self.GP = ARC.create(9.3)
        self.params = self.GP.GetParameterInfo()
The parameter values themselves are accessed through, for example, self.params[2].Value, but be forewarned that sometimes the property Value is a Value object rather than just a string of characters (the poor documentation on this point is discussed here). This seems to be the case when the parameter refers to a file or other data structure; if you want its name as a character string, you must reference the property

    self.params[0].Value.Value

or use the expression

    str( self.params[0].Value )
The latter conversion will occur automatically in contexts requiring a string, such as concatenation.
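(A defensive way to cope with both possibilities is a tiny helper; this is my own convenience function, not part of the arcgisscripting API:)

    def param_as_string(param):
        # Return the parameter's value as a string, whether .Value is
        # already a string or a Value object; str() converts either one.
        if param.Value is None:
            return ""
        return str(param.Value)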
When a dialog is first opened, the ToolValidator method initializeParameters() is called. It can be used to insert initial values into the parameter fields, and also to set some characteristics of those fields in the dialog. For example, to disable a parameter field (make it uneditable), you can include code such as:

    self.params[2].Enabled = False
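(A sketch of a complete initializeParameters(), seeding a default value and disabling a dependent field; the parameter indices and values are of course specific to your own tool:)

    def initializeParameters(self):
        # Suggest a reasonable default for the second parameter
        self.params[1].Value = 256
        # Disable the third field until an input raster has been chosen
        self.params[2].Enabled = False
        return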
When a dialog field has been modified (which may include when the dialog first opens), verification that a parameter value is usable occurs at several places:

1. The dialog itself performs some internal validation of the new value (for example, checking its data type).
2. The ToolValidator method updateParameters() comes next, and can be used to verify many characteristics of input values and avoid processing bad ones. It's also possible to correct the values directly, but in most cases it's better to send a message to the user as described in step 4 and let them correct the values themselves.
3. The dialog performs further internal validation, which may generate warning or error messages.
4. The ToolValidator method updateMessages() is called, where the messages returned from step 3 can be inspected and, if you want, replaced with your own. These messages can direct users to fix their errors.

The result of the separation of steps 2 and 4 is that sometimes the same tests must take place twice, first to avoid processing bad data and second to send a message about it.
The ToolValidator method updateParameters() can be used to process new parameter values (e.g. to set related parameters). No information is provided about which of the dialog fields was modified, though, so you must check each parameter of interest to see if it is the one that needs to be validated:

    if not self.params[0].HasBeenValidated :
        …
    elif not self.params[1].HasBeenValidated :
        …
Once you find the modified parameter, you can ignore the update if it's due to the script itself (e.g. initialization) rather than to the user altering the value:

    if not self.params[1].Altered : return
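(Putting these tests together, here is a sketch of an updateParameters() for a hypothetical tool whose first parameter is an input raster and whose second is suggested from it:)

    def updateParameters(self):
        # React only if the input raster is the parameter that changed
        if not self.params[0].HasBeenValidated:
            # Ignore changes made by the script itself (e.g. initialization)
            if not self.params[0].Altered:
                return
            # Don't touch the dataset until we know it exists; otherwise an
            # error is raised here but not reported until later
            if not self.GP.Exists(str(self.params[0].Value)):
                return
            # Suggest a value for the dependent parameter (hypothetical logic)
            desc = self.GP.Describe(str(self.params[0].Value))
            self.params[1].Value = desc.MeanCellWidth
        return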
Before processing an input dataset, you should also test for its existence:

    if not self.GP.Exists( str( self.params[0].Value ) ) : return

If the data isn't actually there, the method will raise an error when it tries to process it, but the error won't be reported until later, hence the need to return at this point.
The ToolValidator method updateMessages() can be used to send the user a message about problems with the value in a dialog field. The result is a red ⊗ next to the input field, and clicking on it will display the message. Once again, no information is provided about which of the dialog fields has the problem, so you must check each parameter of interest (even if this was done previously), e.g.:

    if self.params[1].Value and self.params[1].Value <= 0 :
        self.params[1].ClearMessage()
        self.params[1].SetErrorMessage("This parameter must be positive.")
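(In context, the test sits inside the method itself; this sketch is a variation on the fragment above that also clears a stale error once the value becomes acceptable again:)

    def updateMessages(self):
        # Flag a non-positive value, and clear the flag once it's corrected
        if self.params[1].Value is not None:
            if self.params[1].Value <= 0:
                self.params[1].SetErrorMessage("This parameter must be positive.")
            else:
                self.params[1].ClearMessage()
        return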
Providing basic documentation for the tool and its parameters is fairly easy, once you know that the simplest way to create and modify it is with the Documentation Editor. The ArcGIS documentation provides two options for opening the Editor, the first of which doesn't work; the second is available by opening ArcCatalog and choosing the menu View > Toolbars > Metadata to bring up that toolbar. Then click on the script, click on its Metadata tab, and finally, in the Metadata toolbar, click on the Edit metadata button.
When the Documentation Editor opens, the left side lists the various types of information you can provide. The main tool description, visible when it first opens, is added in the section General Information, in the item Abstract. The various parameters are described in the section Help in the item Parameters, and they will all be listed to the right under Contents. Open the one you want to describe, click on Dialog Reference, click on the [A] Paragraph button, and then click in the box to the right to add text.
Warning: if you change a parameter name after creating a description for it, the latter will be lost!
The planet Mars is currently enjoying its biennial dalliance with the Earth. Marching through the constellation Cancer, the Red Planet's nearness and opposition to the Sun produce an unusually bright spessartine gem against the black velvet sky. It now hangs above the eastern horizon in the evening, where it will remain for several months.
Mars has been the subject of intense human scrutiny for centuries, and a battery of NASA probes have visited the planet since the dawn of the space age. The first wave began in the 1960s with the Mariner flybys and orbiters, and ended in the early 1980s with the Viking orbiters and landers. Starting in the late 1990s, a new series of sophisticated craft invaded Mars. Probably most famous are the three robotic rovers deployed to directly explore and analyze its surface (Sojourner, Spirit, and Opportunity). Of more interest for geography are the three orbiting satellites that have remotely imaged the entirety of Mars, returning unprecedented amounts of information and transforming our understanding of the planet.
The Mars Global Surveyor, operating from 1999 to 2006, carried a number of scientific instruments, including a high-resolution camera with a horizontal resolution as small as 50 cm, a laser altimeter with a vertical resolution of 30 cm, a thermal emission spectrometer, and a magnetometer, and provided evidence for relatively recent water flow!
Mars Odyssey began studying the material composition of the planet in 2002, using another thermal emission imaging system and a gamma ray spectrometer, and confirmed the existence of subsurface frozen water.
The Mars Reconnaissance Orbiter reached Mars in 2006, and includes a telescopic camera with a resolution around half a meter per pixel. Using stereoimaging, improved measurements of elevation are also being calculated.
Currently the highest resolution topographic data is only available in certain areas. There is so much data that it isn't being processed in its entirety, but only where scientists have been particularly interested!
The best global topographic data currently available comes from the Mars Global Surveyor's Mars Orbiter Laser Altimeter (MOLA), which covers the planet in 1/128th-degree (28 arc second) measurements (and four times more detailed in the polar regions). For example, this "slice" of the planet extends from the north pole on the left to the south pole on the right along the prime meridian, revealing the generally higher elevations in the southern hemisphere:
One representation of MOLA data as a Digital Elevation Model (DEM) is in the Planetary Data System (PDS) IMG raster format, which can be viewed with the NASAView software:
http://starbeam.jpl.nasa.gov/tools/nasa-view.shtml
PDS/IMG is somewhat obscure; NASAView can save it as GIF or JPEG format, which is one way to quickly use it in other tools, but a lot of information is lost. Specifically, Martian elevation varies from -8,068 m to +21,134 m (a larger range than on Earth!), so the data is stored as 16-bit signed integers, while GIF and JPEG can only handle 8-bit unsigned integers (grayscale), forcing the output to be scaled and offset into values from 0 to 255.
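(The arithmetic of that loss is easy to see; a quick sketch in Python, using the elevation range quoted above:)

    # Squeezing Mars's 16-bit elevation range into 8-bit grayscale
    zmin, zmax = -8068.0, 21134.0          # elevations in meters
    step = (zmax - zmin) / 255.0           # meters of elevation per gray level

    def to_gray(elevation_m):
        # Offset and scale a signed elevation into an unsigned 0-255 value
        return int(round((elevation_m - zmin) / step))

    print(to_gray(-8068))    # -> 0
    print(to_gray(21134))    # -> 255
    print(round(step, 1))    # -> 114.5, i.e. each gray level spans ~115 m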
Not surprisingly, the specialized image analysis program ENVI understands PDS/IMG as an external/generic format, which can be opened and resaved in another format such as ESRI's Grid for use with ArcGIS.
More generally, PDS/IMG is actually a pretty simple format: a file of uncompressed data with a separate human-readable label file (.lbl) describing its layout. The data can be purely sequential or interleaved (if it's color/multi-band). It's therefore the same as what ESRI's ArcGIS and ENVI call Band Sequential (.bsq), Band Interleaved by Pixel (.bip), or Band Interleaved by Line (.bil). (Note that these formats are identical for grayscale/single-band data.) The only difference, it appears, is that ArcGIS expects a "header" file (.hdr) with a different layout of information. So only the latter file needs to be created, and the data file can simply have its file extension changed. A basic file is described below, but there is also a detailed description of the .hdr format here:
http://webhelp.esri.com/arcgisdesktop/9.3/index.cfm?topicname=BIL,_BIP,_and_BSQ_raster_files
The procedure to view the Mars data is as follows:

1. Give the data file the extension .bsq, and create a .prj file defining the Martian coordinate system, containing:

    GEOGCS["ACS_Mars_MGS_MOLA",DATUM["D_<custom>",SPHEROID["_custom",3396000.0,0.0]],PRIMEM["Meridian Planum",-180.0],UNIT["Degree",0.017453292519943295]]

2. Create an accompanying .hdr file describing the data layout. The following table gives the .hdr parameters for the 1/4-degree MOLA DEM, along with the corresponding parameters from the PDS .lbl file:
Corresponding Descriptions of Raster Characteristics

.hdr parameter | Corresponding .lbl parameter(s)
---|---
nrows 720 | lines = 720
ncols 1440 | line_samples = 1440
skipbytes 0 | record_bytes = 2880
nbands 1 | bands = 1
nbits 16 | sample_bits = 16
pixeltype unsignedint | sample_type = *_unsigned_integer
byteorder I | sample_type = lsb_*
layout BSQ | band_storage_type = band_sequential (for BIP, sample_interleaved; for BIL, line_interleaved)
nodata -3.4028226550889045e+38 | missing_constant = 16#FF7FFFFB#
xdim 0.25 | map_resolution = 4.0 <PIX/DEG>
ydim 0.25 | map_resolution = 4.0 <PIX/DEG>
ulxmap 0.125 | westernmost_longitude = 0 <DEG>
ulymap 89.875 | maximum_latitude = 90 <DEG>
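Assembled from the left-hand column of the table, the complete .hdr file is just:

    nrows 720
    ncols 1440
    skipbytes 0
    nbands 1
    nbits 16
    pixeltype unsignedint
    byteorder I
    layout BSQ
    nodata -3.4028226550889045e+38
    xdim 0.25
    ydim 0.25
    ulxmap 0.125
    ulymap 89.875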
3. Build statistics for the raster so that ArcGIS can display it properly: open the raster's Properties dialog, scroll down to Statistics, click on the button Options, and select Build Statistics….

4. To draw elevation profiles, menu Tools => Extensions… and turn on 3D Analyst. Then menu View => Toolbars => 3D Analyst.

5. In the 3D Analyst toolbar, click on the button Interpolate Line, and create a polyline along the path where you want to see the profile, clicking once at each vertex and ending with a double-click. Then click on the button Create Profile Graph.

An alternative source of data is the Precision Experiment Data Records (PEDR), a point set (Digital Terrain Model/DTM) from which the raster DEM is derived:
http://pds-geosciences.wustl.edu/missions/mgs/pedr.html
As noted above, there are also some bits and pieces of newer, high-resolution Mars DTM data from the Mars Reconnaissance Orbiter, available here:
This data has an embedded header, which ArcGIS 9.3 is able to partially recognize, but it makes a few mistakes along the way. In particular:

- The map projection is not read correctly; a corrected .prj file should contain:

    PROJCS["Plate_Carree",GEOGCS["EQUIRECTANGULAR MARS",
    DATUM["D_MARS",SPHEROID["MARS",3393830.0,0.0]],PRIMEM["Reference_Meridian",-180.0],
    UNIT["Degree",0.0174532925199433]],PROJECTION["Plate_Carree"],
    PARAMETER["false_easting",0.0],PARAMETER["false_northing",0.0],
    PARAMETER["central_meridian",341.0599975585938],
    PARAMETER["Standard_Parallel_1",20.0],UNIT["Millimeter",0.001]]

    Note that this supplies not only the Standard_Parallel_1 and the Central_Meridian, but also the spheroid radius.

- The NoData value is not recognized as such; assign the following value: -3.4028226550889045e+38.
.The cleanest way to import this data is to use a utility like tail (standard on Unix, an add-on for Windows) to remove the label from the beginning of the data file; with properly constructed .hdr and .prj files as described above, the data is ready to use.
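(If tail isn't available, a few lines of Python will do the same job. This sketch relies on the fact that a PDS3 label states its own size through the RECORD_BYTES and LABEL_RECORDS keywords, so the label occupies label_records × record_bytes bytes at the front of the file; the file names are hypothetical.)

    # Strip the embedded PDS label from the front of an IMG file,
    # leaving pure raster data for use with the .hdr file described above.
    src = open("dtm.img", "rb")
    label = src.read(32768).decode("ascii", "ignore")  # ample room for the label

    record_bytes = label_records = None
    for line in label.splitlines():
        parts = [p.strip() for p in line.split("=")]
        if parts[0] == "RECORD_BYTES":
            record_bytes = int(parts[1])
        elif parts[0] == "LABEL_RECORDS":
            label_records = int(parts[1])
    assert record_bytes and label_records, "label size keywords not found"

    src.seek(record_bytes * label_records)   # jump past the label...
    out = open("dtm.bsq", "wb")
    out.write(src.read())                    # ...and copy the raster data
    out.close()
    src.close()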
An alternative, more black-box approach is described at ISIS support.