StormSurgeViz

A MATLAB-based tool for visualization and analysis of CF-compliant storm surge and inundation model output

Joining the StormSurgeViz Collection

StormSurgeViz is a MATLAB-based analysis and visualization application for storm surge model output on either structured/rectangular (CGRID) or unstructured/non-rectangular (UGRID) horizontal grids. The key to this flexibility is conformance of model output files to requirements that enable StormSurgeViz to recognize and process the model output in a consistent way. Currently, StormSurgeViz accesses and visualizes ADCIRC+SWAN model output, with draft extensions to handle rectangular CGRID model output. Examples of ADCIRC output can be found here.

There are four requirements that model output must conform to in order for StormSurgeViz to ingest, process, and visualize the data.

  1. Files must be in NetCDF format.
  2. Dimension and variable attributes (metadata) must conform to the Climate and Forecast (CF) standards and conventions for metadata description. For model output already in NetCDF format, and for which other client applications already handle that model's output, it is possible to mark up a NetCDF file so that it appears CF-compliant to a client-side request. This approach leaves the native NetCDF file unaltered for those applications, while providing CF compliance through an intermediate XML file written in the NetCDF Markup Language (NcML).
  3. Files must be posted on DAP-enabled data servers that can serve data requests via the OPeNDAP protocol.
  4. A model output index must be created and maintained on the data server.

These requirements are described below.


1) NetCDF Formatting

StormSurgeViz depends on model output stored in the NetCDF file format. NetCDF is a standard binary file format for array data maintained by the University Corporation for Atmospheric Research (UCAR) and supported by the National Science Foundation (NSF). The NetCDF format supports global attributes, multidimensional data arrays (variables), and variable dimensions and attributes.

NetCDF files may be read and written by free and open source programs (ncdump and ncgen) supplied by Unidata, a UCAR subsidiary, or by user-supplied programs that take advantage of NetCDF software libraries. Free and open source NetCDF libraries are available from both Unidata and other third parties. NetCDF libraries are available with application programming interfaces (APIs) in C, Java, Python, Perl, R, Fortran, and MATLAB. The ncdump and ncgen programs from Unidata transform NetCDF files to and from human-readable text representations of NetCDF using a description format known as the Common Data Language (CDL).
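
To make the CDL round-trip concrete, here is a hypothetical, minimal CDL sketch of a storm surge output file (all dimension sizes and the variable name zeta are illustrative). Running `ncgen -o example.nc example.cdl` produces a binary NetCDF file, and `ncdump example.nc` recovers a text representation like this:

```
netcdf example {
dimensions:
    time = UNLIMITED ;   // the "unlimited" dimension can grow as time levels are appended
    y = 3 ;
    x = 4 ;
variables:
    double time(time) ;
        time:standard_name = "time" ;
        time:units = "s" ;
    double zeta(time, y, x) ;
        zeta:standard_name = "sea_surface_height_above_geoid" ;
        zeta:units = "m" ;

// global attributes:
    :Conventions = "CF-1.6" ;
}
```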

Many models already output files in NetCDF format because of the format's many benefits. For models that do not natively output NetCDF, there are two options:

  1. Instrument the model source code with NetCDF library calls to open, describe, and populate the files with model output variables.
  2. Write a post-processor that converts the native model output into NetCDF.

The latter option is usually easier. Either way, getting model output into NetCDF is beyond the scope of this document. We are happy, however, to provide examples of either approach.

2) Climate and Forecast (CF) Compliance

Climatologists, meteorologists, and oceanographers have come together over the past two decades to standardize a comprehensive naming convention for variables of interest and associated metadata. The standard, known as the Climate and Forecast (CF) Conventions, is widely accepted today and community-maintained. The CF Conventions are designed to take advantage of the NetCDF format.

Models generally output the solution (dependent variables) on a computational grid (independent variables) and along the temporal dimension. In NetCDF parlance for dynamic models, time is usually the "unlimited" dimension, meaning that it can grow by appending new time levels to the NetCDF file. The names and formats for specifying the variables and dimensions vary from model to model. For a tool such as StormSurgeViz to recognize variables of interest and correctly locate them in space and time, conventions for naming and describing the variables are necessary. By requiring conformance to naming conventions, software can access and properly display model output without error-prone "guessing" of metadata.

CF takes an interesting approach to naming conventions. Rather than requiring specific variable names, CF associates prescribed lists of metadata with each variable so that tools can derive the meaning of each variable from its metadata. Metadata consist of name/value pairs called "attributes." CF prescribes the allowable attribute names and the vocabulary of attribute values for a given attribute name. Some attributes are required and some are optional.

StormSurgeViz requires a minimum degree of CF-compliance. This document does not describe CF in detail; it outlines the parts of CF conventions necessary for StormSurgeViz to operate in a community and interoperable way.

CF specifies some attributes, known as "global" attributes, which apply to all variables contained in the model output. These attributes, such as the title and Conventions attributes, are placed in the model output only once and are described in the CF Conventions here.

CF recommends several attributes for individual variables rather than for the dataset as a whole. The most useful of these, and the one required by StormSurgeViz, is standard_name. CF lists a vocabulary of acceptable values for standard_name here. StormSurgeViz searches model output for CF standard names in order to identify the variables of interest listed below.

Two other attributes are associated with a given standard name value, long_name and units. Long names are free form text descriptions and are not standardized for particular standard names. Units are values taken from the UDUNITS software package. Particular standard names have prescribed lists of acceptable units values.

Coordinate data (latitude, longitude, vertical coordinate, and time) have special treatment within the CF standard. Coordinate data are identified by an axis attribute with one of the values X, Y, Z, or T associated with variables typically named x, y, z, and time. Applicable standard names are latitude, longitude, time, and depth or height.
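
The metadata-driven lookup described above can be sketched in a few lines. The attribute names (standard_name, axis) are from CF, but the variable names and the dictionary below are hypothetical stand-ins for what a client would read from a real NetCDF file:

```python
# Sketch: identify variables by CF metadata rather than by variable name.
# The dict is a stand-in for attributes read from a NetCDF file; the
# variable names ("t", "zeta", etc.) are hypothetical.
variables = {
    "t":    {"standard_name": "time",      "axis": "T", "units": "s"},
    "x":    {"standard_name": "longitude", "axis": "X", "units": "degrees_east"},
    "y":    {"standard_name": "latitude",  "axis": "Y", "units": "degrees_north"},
    "zeta": {"standard_name": "sea_surface_height_above_geoid", "units": "m"},
}

def find_by_standard_name(variables, standard_name):
    """Return the names of variables whose standard_name attribute matches."""
    return [name for name, attrs in variables.items()
            if attrs.get("standard_name") == standard_name]

def coordinate_variables(variables):
    """Return a mapping of axis letter (X, Y, Z, T) to variable name."""
    return {attrs["axis"]: name for name, attrs in variables.items()
            if "axis" in attrs}

print(find_by_standard_name(variables, "sea_surface_height_above_geoid"))  # ['zeta']
print(coordinate_variables(variables))  # {'T': 't', 'X': 'x', 'Y': 'y'}
```

This is why CF does not need to dictate variable names: a client never matches on "zeta", only on the standard_name and axis attributes.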

2a) CGRID-specific Requirements For CF Compliance

Structured-grid (CGRID) model output requires the following CF attributes for processing by StormSurgeViz:

Global Attributes

  • Conventions - The value should be "CF-1.6", which may be modified to reflect a more recent version of the CF conventions as specified here.

Variable Attributes

There are two types of variables with affected attributes: coordinate variable arrays, which provide coordinate data for corresponding dimensional indices, and data variable arrays, which provide observed or modeled data that is dimensionally indexed. All coordinate variables are required. Data variables are optional in the sense that StormSurgeViz displays only the variables included in the NetCDF file.

Coordinate Variables
  • time - A required dimensionally-unlimited coordinate variable named "time" is a 1D array of 64-bit Reals and should have the following required attributes as partially described here:
    • standard_name - The value should be "time"
    • long_name - The recommended value is "model_time"
    • units - The value should be "s"
    • base_date - The value should be "YYYY-MM-DD HH:MM:SS -HH:MM" in the date/time stamp format specified by udunits and relative to Coordinated Universal Time (UTC). All time variable data are relative to this date/time stamp.
  • x - A required dimensional coordinate variable named "x" is a 1D array of 64-bit Reals and should have the following required attributes as described here:
    • standard_name - The value should be "longitude"
    • long_name - The recommended value is "longitude"
    • units - The value should be "degrees_east"
    • positive - The value should be "east"
  • y - A required dimensional coordinate variable named "y" is a 1D array of 64-bit Reals and should have the following required attributes as described here:
    • standard_name - The value should be "latitude"
    • long_name - The recommended value is "latitude"
    • units - The value should be "degrees_north"
    • positive - The value should be "north"
Data Variable Attributes
  • significant wave height - A data variable preferably named "significant_wave_height" is a 3D array of 64-bit Reals and should have the following required attributes:
    • standard_name - The value should be "sea_surface_wave_significant_height"
    • long_name - The recommended value is "significant wave height"
    • units - The value should be "m"
    • coordinates - The value should be "time y x"
    • _FillValue - The recommended value is "-99999.0"
  • sea surface height - A data variable preferably named "zeta" is a 3D array of 64-bit Reals and should have the following required attributes:
    • standard_name - The value should be "sea_surface_height_above_geoid"
    • long_name - The recommended value is "water surface elevation above geoid"
    • units - The value should be "m"
    • coordinates - The value should be "time y x"
    • _FillValue - The recommended value is "-99999.0"
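
The requirements above can be collected into a single CDL sketch of a minimal compliant CGRID file. The dimension sizes and the base_date value are hypothetical; only the zeta data variable is shown:

```
netcdf cgrid_example {
dimensions:
    time = UNLIMITED ;
    y = 2 ;
    x = 2 ;
variables:
    double time(time) ;
        time:standard_name = "time" ;
        time:long_name = "model_time" ;
        time:units = "s" ;
        time:base_date = "2012-10-22 00:00:00 -00:00" ;
    double x(x) ;
        x:standard_name = "longitude" ;
        x:long_name = "longitude" ;
        x:units = "degrees_east" ;
        x:positive = "east" ;
    double y(y) ;
        y:standard_name = "latitude" ;
        y:long_name = "latitude" ;
        y:units = "degrees_north" ;
        y:positive = "north" ;
    double zeta(time, y, x) ;
        zeta:standard_name = "sea_surface_height_above_geoid" ;
        zeta:long_name = "water surface elevation above geoid" ;
        zeta:units = "m" ;
        zeta:coordinates = "time y x" ;
        zeta:_FillValue = -99999. ;

// global attributes:
    :Conventions = "CF-1.6" ;
}
```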

2b) UGRID Extensions to CF

If the model output is represented on an unstructured grid, StormSurgeViz recognizes a proposed extension to the CF Conventions known as UGRID.

2c) Using NcML for CF-compliance

NetCDF files whose metadata do not meet the CF standards and conventions can be "marked up" so that they are compliant. The NetCDF Markup Language (NcML) provides an XML representation of the NetCDF file, and client applications can use this marked-up version. We typically use NcML for exactly this purpose: making non-compliant NetCDF files into compliant NetCDF files.

An example of an NcML file that makes a non-compliant ADCIRC file compliant is shown below. In this example, the ADCIRC NetCDF file variables adcirc_mesh and element are augmented to include the cf_role, start_index, and long_name attributes, among others. Additional information is added to the global attributes section of the NetCDF file, such as title, Conventions, and cdm_data_type.

<netcdf xmlns="http://www.unidata.ucar.edu/namespaces/netcdf/ncml-2.2">
  <attribute name="title" type="string" value="ASGS NCFS Sandy Advisory 31"/>
  <attribute name="Conventions" value="CF-1.6, UGRID-0.9" />
  <attribute name="id" value="UND_ADCIRC.ASGS.NCFS.Sandy.31" />
  <attribute name="cdm_data_type" value="ugrid" />
  <attribute name="summary" value="The ADCIRC Surge Guidance System Forecast for Hurricane Sandy (2012) Advisory 31 on the nc6b mesh." />

  <variable name="adcirc_mesh" shape="mesh" type="int">
    <attribute name="long_name" value="mesh_topology" />
    <attribute name="node_coordinates" value="x y" />
    <attribute name="face_node_connectivity" value="element" />
    <attribute name="cf_role" value="mesh_topology" />
    <attribute name="topology_dimension" type="int" value="2" />
  </variable>
  <variable name="element" >
    <attribute name="long_name" value="element" />
    <attribute name="start_index" type="int" value="1" />
    <attribute name="units" value="1" />
    <attribute name="cf_role" value="face_node_connectivity" />
  </variable>

  <aggregation type="union">
    <scan location="." regExp=".*\.nc$" subdirs="false"/>
  </aggregation>

  <!-- aggregation type="union">
    <netcdf location="http://opendap.renci.org:1935/thredds/dodsC/ASGS/sandy/31/nc6b/blueridge.renci.org/ncfs/nhcConsensus/fort.63.nc"/>
    <netcdf location="http://opendap.renci.org:1935/thredds/dodsC/ASGS/sandy/31/nc6b/blueridge.renci.org/ncfs/nhcConsensus/swan_TPS.63.nc"/>
  </aggregation>
  -->
</netcdf>

3) OPeNDAP Data Servers and the DAP Protocol

StormSurgeViz can view NetCDF files stored on a filesystem local to the StormSurgeViz program itself. More typically, however, storm surge model output stored in NetCDF is large and resides on remote servers, and moving large amounts of data over network connections is inefficient. An efficient transport protocol for NetCDF, the Data Access Protocol (DAP), formerly known as the Distributed Oceanographic Data System (DODS), was therefore developed with NASA support and is maintained by an NSF-supported non-profit, OPeNDAP.org.

The DAP protocol allows NetCDF files to be queried for subsets of data by constraints along space and time dimensions. By requesting only necessary subsets of data, network transfers of data are minimized. Programs which make DAP requests are known as DAP clients and programs which provide responses to DAP requests are known as DAP servers. StormSurgeViz uses a DAP client in the NetCDF-Java library, via nctoolbox, a MATLAB toolbox for working with common model datasets.
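
A DAP2 subset request is just a constraint expression appended to the dataset URL: one [start:stride:stop] index range per dimension of the requested variable. The sketch below builds such a URL; the server name, dataset path, and index ranges are hypothetical:

```python
# Sketch: build a DAP2 constraint-expression URL that requests only a
# subset of a variable. Server, path, and index ranges are hypothetical.
def dap_subset_url(base_url, variable, *index_ranges):
    """Append a DAP2 constraint: one [start:stride:stop] per dimension."""
    constraint = variable + "".join(
        "[%d:%d:%d]" % (start, stride, stop)
        for start, stride, stop in index_ranges)
    return base_url + "?" + constraint

url = dap_subset_url(
    "http://example.org/thredds/dodsC/model/fort.63.nc",  # hypothetical dataset
    "zeta",
    (0, 1, 10),      # time levels 0..10
    (100, 1, 200),   # y indices
    (100, 1, 200),   # x indices
)
print(url)
```

In practice StormSurgeViz never builds these strings by hand; the NetCDF-Java DAP client underneath nctoolbox generates the constraints from array-index requests.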

There are several implementations of DAP servers. Unidata's THREDDS Data Server and OPeNDAP.org's Hyrax are typically used both in data centers and for local/laboratory data servers. Both work well with StormSurgeViz. Hyrax is the recommended DAP server for StormSurgeViz, as it provides the most up-to-date standards compatibility and the best extensibility. Hyrax also implements the catalog convention known as THREDDS: a THREDDS-enabled DAP server provides a customizable catalog of the dataset holdings on the server and the DAP services available for each of those holdings.
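
A THREDDS catalog is an XML document that a client can walk to discover datasets. The sketch below extracts the urlPath of each dataset from a trimmed-down, hypothetical catalog (real catalogs nest datasets and declare services, which this ignores):

```python
import xml.etree.ElementTree as ET

# Hypothetical, trimmed-down THREDDS catalog; real catalogs also declare
# services and may nest dataset elements.
CATALOG = """<catalog xmlns="http://www.unidata.ucar.edu/namespaces/thredds/InvCatalog/v1.0">
  <dataset name="fort.63.nc" urlPath="ASGS/sandy/31/fort.63.nc"/>
  <dataset name="swan_TPS.63.nc" urlPath="ASGS/sandy/31/swan_TPS.63.nc"/>
</catalog>"""

THREDDS_NS = "http://www.unidata.ucar.edu/namespaces/thredds/InvCatalog/v1.0"

def dataset_paths(catalog_xml):
    """Return the urlPath of every dataset element in the catalog."""
    root = ET.fromstring(catalog_xml)
    return [d.get("urlPath") for d in root.iter("{%s}dataset" % THREDDS_NS)
            if d.get("urlPath")]

print(dataset_paths(CATALOG))
```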

4) StormSurgeViz Index

Storm surge model output is typically stored in NetCDF files on DAP servers, with individual files for storm events, model ensembles, model runs, and possibly even individual variables. Output may also be served from more than one DAP server. StormSurgeViz therefore requires an index of the remote NetCDF resources available for viewing. These indices are constructed by running scripts on the data server that compile lists of DAP URLs corresponding to model output datasets. Construction of StormSurgeViz indices is described here.
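
The actual index format is defined by the project's server-side scripts; as a sketch only, the general idea of compiling DAP URLs for a set of model-output files looks like this (server root and dataset paths are hypothetical):

```python
# Sketch only: illustrates compiling a list of DAP URLs for an index.
# The real StormSurgeViz index format is defined by its server-side
# scripts; the server root and paths below are hypothetical.
def compile_index(server_root, relative_paths):
    """Join a DAP service root with each dataset's relative path."""
    return [server_root.rstrip("/") + "/" + p.lstrip("/")
            for p in relative_paths]

index = compile_index(
    "http://example.org/thredds/dodsC",  # hypothetical DAP service root
    ["ASGS/sandy/31/fort.63.nc", "ASGS/sandy/31/swan_TPS.63.nc"],
)
print(index)
```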