Mashups: Leafletjs and OpenLayers map apps with Esri REST, XYZ Tiles & WMS


I’ve been experimenting with OpenLayers v2 & mobile, as well as Leafletjs map apps, lately. I’m a little late to the game, but very enthused about the possibilities, as well as the simplicity of enabling JavaScript & jQuery capabilities. Here are a few examples I’ve been playing with that mash up USGS The National Map data sources with OGC WMS and tilesets.

If you are not using USGS basemaps from The National Map in your US-centric map apps, perhaps you should consider adding them in your next update. TNM basemap caches cover the entire US and territories down to 1:18,000 scale. TNM dynamic map services contain a rich variety of data at higher-resolution scales and, most importantly, are available without charge from USGS.

Mashup #1: OpenLayers v2 with Esri REST Dynamic, Esri Tile Cache, XYZ Tile Cache & OGC WMS

OpenLayers v2 mashup example


This mashup is focused mainly on USGS The National Map Esri services, with an additional XYZ tileset published by the National Park Service (hosted on Amazon Web Services infrastructure), and finally also integrates OGC WMS weather overlays from Iowa State’s Environmental Mesonet. Google basemaps serve as the basemaps for this implementation. OpenLayers v2 is a sweet app that is robust in terms of mashing up different data sources, but it seems to have reached its peak in terms of a modern, clean look. If you’re looking to support mobile browsers, scroll down and look at Mashup #2.
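
To give a feel for the wiring, here is a minimal sketch of how such a layer stack goes together in OpenLayers v2. The endpoint URLs are placeholders (only the Mesonet radar WMS path reflects the commonly published Iowa State endpoint); substitute the actual TNM and NPS service URLs:

```javascript
// Sketch: OpenLayers v2 mashup of an XYZ tileset and an OGC WMS overlay.
// Assumes the OpenLayers v2 script is loaded in the page; URLs are examples.
function buildMashup(OpenLayers) {
  var map = new OpenLayers.Map('map');

  // National Park Service XYZ tileset hosted on AWS (placeholder URL template)
  var npsTiles = new OpenLayers.Layer.XYZ(
    'NPS Tiles',
    'http://example-nps-tiles.s3.amazonaws.com/${z}/${x}/${y}.png',
    { isBaseLayer: false }
  );

  // Iowa State Environmental Mesonet NEXRAD radar via OGC WMS
  var radar = new OpenLayers.Layer.WMS(
    'NEXRAD Radar',
    'http://mesonet.agron.iastate.edu/cgi-bin/wms/nexrad/n0r.cgi',
    { layers: 'nexrad-n0r', transparent: true, format: 'image/png' },
    { isBaseLayer: false, opacity: 0.6 }
  );

  map.addLayers([npsTiles, radar]);
  return map;
}
```

In the real page you would also add the Google basemap layers and a LayerSwitcher control to toggle the overlays.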

Credits: OpenLayers examples


Mashup #2: OpenLayers Mobile/jQuery with Esri REST Dynamic, Esri Tile Cache + XYZ Tile Cache

OpenLayers Mobile w/ jQuery example


Again, this mashup focuses mainly on USGS The National Map Esri services (Dynamic REST + Tile Cache), along with the additional tileset published by the National Park Service (hosted on Amazon Web Services), and uses Bing Maps as the basemap of choice. The mobile version of OpenLayers relies on jQuery for fluid interaction that is easily visible and intuitive in mobile browsers. It’s clean and modern looking; however, it takes some forethought to plan out your mashup hierarchies. This example splices in OpenLayers.Layer.ArcGIS93Rest functionality in order to support Esri REST Dynamic map services.
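
The ArcGIS93Rest piece mentioned above can be sketched like this (the service URL is a placeholder for a TNM dynamic service’s export endpoint; the layer IDs and export format are examples):

```javascript
// Sketch: adding an Esri REST Dynamic map service to an OpenLayers v2 map
// via OpenLayers.Layer.ArcGIS93Rest. Assumes OpenLayers v2 is loaded; the
// URL, layer list, and format below are placeholders.
function addDynamicLayer(OpenLayers, map) {
  var dynamic = new OpenLayers.Layer.ArcGIS93Rest(
    'TNM Dynamic',
    'http://example-server/ArcGIS/rest/services/TNM/MapServer/export',
    { layers: 'show:0,1,2', transparent: 'true', format: 'png24' },
    { isBaseLayer: false }
  );
  map.addLayer(dynamic);
  return dynamic;
}
```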

Credits: OpenLayers examples and OpenLayers jQuery mobile


Mashup #3: Leafletjs with Esri REST Dynamic, Esri Tile Cache, XYZ Tile Cache, and OGC WMS

Leafletjs mashup example


This mashup is, in my opinion, the cleanest, leanest open-source map mashup app I have worked with so far. The code is simple, the interface is clean, and it can be used in both mobile and desktop browsers – using the same code. I’ve created this example to demonstrate the use of USGS The National Map tile cache basemaps and REST dynamic basemaps (published using Esri technology), as well as National Park Service XYZ tilesets (hosted on Amazon Web Services), and I also integrated OGC WMS weather data from Iowa State. I love this example and plan to expand it further – including vector layers, GeoJSON, and additional CSS/HTML effects.
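
The Leaflet side really is lean. Here is a minimal sketch of the same kind of stack: an XYZ tile basemap plus a WMS weather overlay. The TNM tile URL is a placeholder; the radar layer name follows the Iowa State Mesonet WMS as used in the Leaflet tutorials:

```javascript
// Sketch: Leaflet mashup of an XYZ tile basemap and an OGC WMS overlay.
// Assumes Leaflet is loaded as the global L; URLs are examples.
function buildLeafletMashup(L) {
  var map = L.map('map').setView([39.0, -105.5], 7);

  // USGS TNM tile cache served as XYZ tiles (placeholder URL template)
  L.tileLayer('http://example-tnm-cache/tile/{z}/{y}/{x}', {
    attribution: 'USGS The National Map'
  }).addTo(map);

  // Iowa State Mesonet NEXRAD radar via OGC WMS
  L.tileLayer.wms('http://mesonet.agron.iastate.edu/cgi-bin/wms/nexrad/n0r.cgi', {
    layers: 'nexrad-n0r-900913',
    format: 'image/png',
    transparent: true,
    opacity: 0.6
  }).addTo(map);

  return map;
}
```

Swapping in a different basemap is a one-line change, which is a big part of why this app is so pleasant to extend.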

Credits: Leafletjs – Leaflet Tutorials


Batch clipping multiple rasters in QGIS – a very basic approach to a repetitive process

I have a current project that is based on precipitation data from the PRISM Climate Group – monthly Precipitation values (4km resolution grids).  My first task in my larger workflow with this data, which covers the entire United States, is to clip each of these monthly PPT grids (.asc) to the state of Colorado.  This equates to running 270 geoprocessing commands – clip each monthly grid to the state boundary and output to a file which (in my best practice) keeps the year/month designation in the output name.

There are certainly other, more advanced, ways to approach this task.  My approach is a longer path to finding this answer – note that it only applies to clipping N rasters to a single simple polygon. But, at least it’s one way to do this fairly large and repetitive task.

I have chosen to use the open-source QGIS/OSGeo4W environment for this task. My process steps are outlined below.

  • place all the rasters to be clipped in one directory (your workspace)
  • note the location of your clip polygon
  • in QGIS: Raster Menu > Extraction > Clipper — load your first raster, set the output, and select the clip polygon. Then copy the full gdalwarp command from the box at the bottom of the clip GUI. It should look like this:

    gdalwarp -q -cutline D:/PeakGIS2/DATA/Colorado/STATE_WGS84.shp -crop_to_cutline -of GTiff D:/TEMP/US_PPT_4km/us_ppt_1980.10.asc D:/TEMP/CO_PPT_4km/PPT_CO_PRISM_1980_10.tif

Then, I created a spreadsheet with fields that break the command above into repeatable sections. In one of the fields I list just the raster names; in the other fields I list my output names. Using Excel makes it easy to create numeric endings for your output raster names, like *Grid_1*, *Grid_2*, etc. – just drag the cell down and it will automatically enumerate the ending.

The fields may look like this: breaking gdalwarp command into sections

Then I use Excel’s (or OpenOffice Calc’s) CONCATENATE function to generate the full gdalwarp command for each of my input rasters – concatenating each field into one single field (in a separate worksheet for simplicity).

  • copy the output cells to a text file
  • save the text file to .bat (will look something like this): gdalwarp batch commands

I then use the OSGeo4W command line in QGIS and run this .bat file which contains only the ‘gdalwarp’ commands. This will run through each command until it has exhausted the batch, creating my desired output grids.
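
For what it’s worth, the spreadsheet-concatenation step can also be scripted. Here is a minimal Node.js sketch that builds the same gdalwarp commands; the paths mirror the example above, and the naming helper assumes the PRISM us_ppt_YYYY.MM file pattern:

```javascript
// Build one gdalwarp command per monthly PRISM grid, mirroring the
// spreadsheet-concatenation approach. Paths are the examples from this
// post -- adjust to your own workspace and file naming.
function buildGdalwarpCommands(rasterNames, cutline, inDir, outDir) {
  return rasterNames.map(function (name) {
    // e.g. "us_ppt_1980.10" -> output name "PPT_CO_PRISM_1980_10.tif"
    var yearMonth = name.replace('us_ppt_', '').replace('.', '_');
    return 'gdalwarp -q -cutline ' + cutline +
           ' -crop_to_cutline -of GTiff ' +
           inDir + '/' + name + '.asc ' +
           outDir + '/PPT_CO_PRISM_' + yearMonth + '.tif';
  });
}

// Example: write all commands to a .bat file for the OSGeo4W shell
// var fs = require('fs');
// fs.writeFileSync('clip_all.bat',
//   buildGdalwarpCommands(['us_ppt_1980.10', 'us_ppt_1980.11'],
//     'D:/PeakGIS2/DATA/Colorado/STATE_WGS84.shp',
//     'D:/TEMP/US_PPT_4km', 'D:/TEMP/CO_PPT_4km').join('\r\n'));
```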

Like I said, it’s a long-winded way of doing the job and I am positive there are several steps that could be saved here – but at least it’s a beginner’s start, and I hope it’s easy to follow.



The Open-Source GIS Workflow: Raster Calculations with QGIS

To follow up: in my last post, I provided an overview of a GIS workflow that was somewhat complex but still less expensive than using only Esri tools from start to finish. After reading this article, I figured I needed to try out the open-source workflow for comparison.

Using the same PRISM precipitation data (30-year Average by month) we downloaded before, I opened a new window of Quantum GIS (QGIS) Lisboa 1.8 and loaded the ASCII files.


I spot-checked to make sure the precipitation values came through, using the Identify Features tool. Once I was certain the precip values came through, I simply opened the Raster menu and clicked Raster Calculator.

The ‘Addition’ raster calculation is quite simple in QGIS: just create a statement that adds GRID1 + GRID2 + GRID3 and so on…  Then name your output grid and click OK.


The new grid is added to the TOC and you can apply a stretch by right-clicking and choosing “Stretch using Current Extent“. In the layer properties you can also change the grayscale stretch to other color-ramps: pseudo-color and (my favorite) “freak out“.



From here, your grid is created and ready to go. You can take it a step further and convert the grid to a shapefile (polygon) by going to the Raster menu again and choosing “Conversion”, then “Polygonize“. Choose your raster to convert, and name the field for your output gridded value (in this case I chose PTOTAL) to carry the precipitation total value forward into the shapefile attribute table.


Easy, simple, quick, intuitive, FREE.  I QA/QC’d the output grid from QGIS against the output grids from my Esri/Global Mapper workflow and a manual calculation – they all match exactly.  But this QGIS output seems a little bit sweeter, a little bit more rewarding… like sweet music to my ears, considering the minimal effort it required from start to finish.


The Semi-Poor Man’s GIS Workflow: Raster Calculations using Global Mapper and Esri ArcView

NOTE: This is a workflow discussion using proprietary software (Esri ArcGIS Desktop and Global Mapper) to run raster calculations.  There are other tools to do this job (even free), most notably in the OpenSource Quantum GIS (QGIS) Package.


I’ve been working with a client to generate visuals of historical precipitation data, in order to help visualize patterns of change over the last 30 years and help with future projections.  We started out with the Oregon State PRISM datasets (from the PRISM Climate Group) and wanted to generate winter precipitation maps in order to begin looking at 30-year averages.

In figuring out a workflow, we ran into the first hurdle which is the PRISM data is available as either 30-year average precipitation (Annual) or 30-year average precipitation (by Month). Since we only want to visualize the winter month precipitation, we could not use the Annual data and simply overlay it on a map.  We had some work to do in order to prepare the data using the monthly data – which would require us to run raster calculations (Addition) in order to total up the values for winter months only.

I’ve been using Esri ArcGIS since 2001 and also spend a significant amount of time in Global Mapper and QGIS for my daily GIS workflow needs.  Because of this, when I am working with rasters, many times I don’t need to invest in Esri’s additional (expensive) Analysis Extensions, when I can usually get the same analysis output with my simple Esri ArcView license and a copy of Global Mapper together, for a much lower cost. For this particular workflow, I am using Esri 9.3.1 View license with no extensions, and Global Mapper 14.2.

The task at hand is to take the 30-year average (by month) datasets from PRISM and generate a visual for winter precipitation – winter months only. This requires downloading the 1971-2000 Monthly Average (800m) data for the months of Oct/Nov/Dec/Jan/Feb/Mar/Apr/May. Once these are downloaded and unzipped to their native ASCII GRID formats, the next step is to generate a final output grid which contains precip totals over the winter months. This is where using both Global Mapper and Esri tools come in handy, for my particular workflow.

First I loaded all of the winter month ASC GRIDS into Global Mapper and displayed them accordingly. The intention is to add the precipitation values from October to November, and then add that total to December, January, February, and so on…

In Esri, you can easily do this IF you have the Spatial Analyst or 3D Analyst tools enabled. I have not purchased those extensions, so I am using Global Mapper for this particular workflow. In Global Mapper, I click the “Analysis” menu and then choose “Compare/Combine Terrain Layers…” from the dropdown.

This particular tool will allow me to add the values from one grid to another. In this case, the grids are the same resolution, geography, and format, so this should be a simple operation. In the dialog box, I make sure to name the first combined grid “CombinedGrid1” or something like that. Next, I choose “Addition” as the operation to perform, which adds the value from Grid 1 (October) to Grid 2 (November) and generates an output grid with the total for each grid cell.

Then I go back in and choose the same tool again: Analysis > Compare/Combine Terrain Layers… and add my newly generated output grid (CombinedGrid1) to Grid #3 (December). TIP: Make sure you choose “ADDITION” from the Operation Type dropdown again, because if you just whiz right through the dialog, it will default to “Subtraction”, which is the opposite of what we want to do (for this workflow).

Cycling through this Global Mapper stage of the workflow, adding each winter month to the previous total, I finally get my resulting output grid: The addition of precipitation values for the winter months of October-May in one final grid.

In Global Mapper, this resulting final grid is stored temporarily, so we need to take it and export to a “real” GRID format to move into the next step. To do this, Click on your final output (temporary) grid and then go to File > Export > Export Elevation Grid Format.

In this case, since I am going to bring the resulting output grid into Esri, I choose “Arc ASCII GRID” as the format. I click OK, and then save my grid in my local workspace accordingly.

Now, in Esri desktop, I load the ASCII GRID and open ArcToolbox. My next step is to convert this ASCII GRID to an integer grid so that I can generate a final shapefile output, with the “VALUE” field being my total winter month precipitation number.

So, in ArcToolbox (View license or higher), I expand “Conversion Tools” and choose “To Raster > ASCII to Raster”. I enter my input grid and the output name (saving as an Esri GRID at this point) and run the tool. Now I am ready to convert this grid to a shapefile, which will have an individual polygon for each grid cell, carrying the VALUE field – my totaled average precipitation for the winter months over the 30 years of 1971-2000.

To do this, I once again go back to ArcToolbox > Conversion Tools. This time, I expand the “From Raster” toolset and choose “Raster to Polygon”. I choose my output Esri GRID from the step above and name my output shapefile accordingly. In order to maintain the exact grid cell dimensions from my grid, I make sure to UNCHECK “Simplify Polygons” for the output polygon shapefile. I run the tool and my final shapefile output is ready for mapping and visuals.

Note: It’s always a good idea to run some random spot checks to make sure the final output is as expected – i.e., that the raster calculations (Addition) for each month carried through as expected.  To do this, I load the individual month grids plus the final output shapefile in Esri desktop, then use the Identify tool.  The check I use is to identify the VALUE for all of the grid layers over a specific grid cell, and then add them up using a calculator.  They should match the GRIDCODE attribute contained in the final output shapefile for that specific polygon/grid cell.  If not, you have to go back and either start over, or review the interim grids created by Global Mapper to see where the issue may lie.


The resulting grid.

I hope this workflow helps you, especially if you don’t have Esri’s 3D or Spatial Analyst extensions, but still need to run raster calculations (Addition, Subtraction, Multiplication etc).

Brett J Marraccini, GISP

Making your WFS truly interoperable is a myth...or is it?

WFS and true Interoperability – is the key really GML Simple Features (GMLSF) Profile?


For quite some time I’ve been working with OGC Web Feature Services (WFS), trying to better understand the technology, protocols and use cases so that I can provide my clients sound advice when choosing an implementation for feature services. Finding a public-facing WFS to review isn’t difficult, but it isn’t easy either. When one of my clients asked me to determine the differences between public-facing WFS and those published internally using Esri technology, I began running faster down the trail of recognizing the key differences between the two. What I may have found is that a specific support parameter, the GML Simple Features Profile, may be the key to publishing a WFS that is truly compatible with various desktop applications, regardless of whether you’re using Esri, MapServer, GeoServer or another application to host the web service.

I welcome your input and thoughts on my hypothesis.  Below is a summary of my findings to date, and I am happy to share this info with you.

Let’s start with the PUBLIC WFS links I used for review:

Norwegian Petroleum Directorate (EPSG 4326) GetCapabilities | DescribeFeatureType | GetFeature

DM Solutions Canada (EPSG 42304) GetCapabilities | DescribeFeatureType | GetFeature

Oklahoma GIS (EPSG 4326) GetCapabilities | DescribeFeatureType | GetFeature

Arizona Geologic Map (EPSG 26912) GetCapabilities | DescribeFeatureType | GetFeature

When I started looking at the differences, I compiled the GetCapabilities/DescribeFeatureType XML for each and tried to parse through the code to determine whether there was functionality, a schema, or a version that would make a huge difference in why one was supported in open-source desktop applications like QGIS and Gaia but not Esri Desktop – or vice versa.  Sure, there are subtle differences, but I found no consistent element across the different services that stood out as the key identifier.

Digging deeper, I found a couple of forum threads on Esri’s forums and StackExchange that focused on something a bit nondescript to my newbie eyes: the GML Simple Features Profile.

The question in these forums is whether Esri can handle the GML Simple Features Profile.  A way to determine whether the WFS you are reviewing supports it is to look for the following code in your GetCapabilities response (example below), specifically:

<ows:Parameter name="outputFormat">

which should specify:

<ows:Value>text/xml; subType=gml/3.1.1/profiles/gmlsf/1.0.0/0</ows:Value> or similar, with "gmlsf" being the key.

How do you really find this particular parameter?  The easiest way is to form your GetCapabilities request with the WFS version specified as 1.1.0. If you request an older version (such as 1.0.0), the response uses a different capabilities schema and the OWS outputFormat parameter under DescribeFeatureType will not display:
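
As a quick programmatic shortcut, you can fetch the GetCapabilities document and simply search the advertised outputFormat values for the gmlsf token. A minimal sketch (the endpoint URL in the comment is a placeholder):

```javascript
// Scan a WFS GetCapabilities response for GML Simple Features support by
// looking for the "gmlsf" token in the advertised outputFormat values.
function supportsGmlsf(capabilitiesXml) {
  // The GMLSF profile shows up in values like:
  //   text/xml; subType=gml/3.1.1/profiles/gmlsf/1.0.0/0
  return /gmlsf/i.test(capabilitiesXml);
}

// Example request to fetch the document (placeholder endpoint; version=1.1.0
// so the ows:Parameter elements appear in the response):
// http://example-server/wfs?service=WFS&version=1.1.0&request=GetCapabilities
```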

In order to evaluate this question about GML Simple Features, I tested the four PUBLIC WFS noted above in the following Desktop GIS platforms:

  • Esri – ArcGIS 9.3.1
  • Esri – ArcGIS 10.x
  • Quantum GIS  (QGIS) 1.8
  • The Carbon Project – Gaia 3.4
  • Schlumberger – Petrel 2012.5
  • Blue Marble – Global Mapper 14.2

I found that only one of the sample PUBLIC WFS above worked in ALL of the platforms above: The Arizona Geologic Map WFS published by Arizona Geological Survey.  The rest were inconsistent across the board.  This could be due to configuration, source of data or otherwise, but I was able to unearth a single specific key element in the AZ Geologic Map WFS that the others didn’t have – support for GML Simple Features.

In the GetCapabilities response from the Arizona Geologic Map WFS, the parameters supporting GML Simple Features are highlighted below.

Testing against the other WFS (DM Solutions, Oklahoma GIS, Norwegian Petroleum Directorate), this parameter either existed but did not contain support for GMLSF, or it didn’t exist at all.  An example of this is in the DM Solutions Canada WFS GetCapabilities:

Note that there is no support for GMLSF in this case.

The Arizona Geologic Map WFS is the only PUBLIC WFS I found that is currently supported in all six of the desktop GIS and geologic mapping platforms I tested.  The question remained: what is it about GMLSF that is so important for interoperability? A couple of reasons below:

1. We all know that a significant number of WFS are published using Esri technology, and Esri themselves have published a recommendation for adding GML Simple Features support:

ESRI recommends use of the GML Simple Features profile for open information exchange. For users to openly share information across systems using an open profile, the chosen profile must be adopted by many organizations. The GML Simple Features profile was designed as a common format for interoperability by many GIS software organizations that are committed to providing support. We recommend the use of GML Simple Features where possible. ArcGIS includes direct converters to read, write, and serve GML Simple Features.


2. OGC’s GML Simple Features Profile Implementation Standard sums up the need and reasoning for GMLSF as a means to support interoperability without heavy software investment:

The GML specification declares a large number of XML elements and attributes meant to support a wide variety of capabilities. For example, the GML specification can encode dynamic features, spatial and temporal topology, complex geometric property types and coverages. With such a wide scope, interoperability can only be achieved by defining profiles of GML that deal with a restricted subset of GML capabilities. Such profiles limit the number of GML object types that can appear in compliant schemas and consequently are easier to process.

… This profile defines a restricted but useful subset of XML-Schema and GML to lower the “implementation bar” of time and resources required for an organization to commit for developing software that supports GML. It is hoped that by lowering the effort required to manipulate XML encoded feature data, organizations will be encouraged to invest more time and effort to take greater advantage of GML’s rich functionality.

Development of this profile does not reduce the need for distinct communities of users to develop application schemas (data models) for information sharing. However, to the extent that users’ application schemas fit within the scope of GML-SF capabilities, this profile facilitates the ability to use WFS for interoperable feature data exchange with much less software development investment.

I am positive there are other reasons why the GML Simple Features Profile can be a significant key to WFS interoperability across a variety of desktops, and I’d love to hear about them in the comments. It’s from this finding that I am encouraged to recommend to my client (who is using Esri technology, among others, to publish WFS) supporting the GML Simple Features Profile in order to successfully reach the wide variety of desktop applications that can consume OGC WFS.

Author’s note: Special thanks to the following web resources which were used to gain both knowledge and insight about this particular question: