Marin NHD Local Workflow - all steps in single post

Workflow for Marin NHD Hydrologically Modeled Flow Line Development




Since this clip will be reviewed interactively, rebuild its pyramids and statistics using ArcToolbox Data Management Tools > Raster > Raster Properties > Build Pyramids and Statistics on the newly trimmed analysis surface. To set the input to this tool, it may be necessary to navigate using the folder button. Select your new tbsm1m_xxx_shed.img and leave both Build Pyramids and Calculate Statistics checked, while un-checking Skip Existing (10.1 only).
Use Environments... and expand Raster Storage; choose BILINEAR for Pyramid resampling technique.

Properly built statistics and pyramids will enable effective use of the strongest stretch in display. After rebuilding statistics and pyramids, open the Properties of tbsm1m_xxx_shed.img and visit the Symbology tab. Under Stretch, set Type to Histogram Equalize, then be certain to scroll down in the Properties dialog and, in the Statistics drop-down list, select From Current Display Extent.

This method of symbolizing the DEM surface will dramatically highlight local minima once zoomed in to a detailed work area. With this stretch enabled, both zooming with the scroll wheel and panning with the click-scroll wheel will instantly modify the stretch and reveal details.

QA: display the analysis_shed and verify that its data area has a watershed-like shape and does not completely fill the clipping rectangle. Verify that this grid is in the working directory for this analysis area.


Calculate into the ET_M field, or else open the Attribute Table for the 60 cm HE points, add a field of type “Double” named HE_Z, and use the Field Calculator:

HE_Z = ([Z_Min] - 0.8) + ([ET_ORDER] * 0.5 )

For hydrologic modeling, use this formula; this step is then complete and you can JUMP AHEAD TO STEP 9a.
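As a sanity check, the Field Calculator arithmetic can be sketched in plain Python (he_z is a hypothetical helper, not an ArcGIS tool; the Z_Min and ET_ORDER values are illustrative):

```python
def he_z(z_min, et_order):
    """Quick-formula HE elevation: start 0.8 m below the line's minimum
    surface elevation (Z_Min) and rise 0.5 m over the full run of the line."""
    return (z_min - 0.8) + (et_order * 0.5)

# Downhill end (ET_ORDER = 0) sits 0.8 m below Z_Min; uphill end
# (ET_ORDER = 1) sits 0.3 m below, so the whole path stays below the
# surface while sloping in the drawn (uphill) direction.
print(he_z(10.0, 0.0))
print(he_z(10.0, 1.0))
```

Because ET_ORDER increases uphill, the result always has a small positive slope in the intended flow direction.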

Alternately, to better support hydraulic modeling it is necessary to approximate the actual invert elevations for HE features. This can be far more work, and does not improve the horizontal location of features for hydrologic modeling purposes, and as such may not be time well spent. Avoid these hand-edits if possible and just use the formula above.

To ensure flow line compliance with HE features, the HE features must describe elevations below the analysis shed surface everywhere along their length, so they serve as guide trenches to the modeled flow. To guide flow in the proper direction, they must also everywhere slope downhill along the intended path. Without using a simple but unrealistic formula like that above, HE point tuning can be time-consuming. For efficiency here it is helpful to limit concern for fidelity with the actual terrain surface or actual invert elevations in storm pipes.

Modeled flow lines conform to enforcement features when HE paths are
(a) below or well below ground surface, and
(b) sloping, even trivially, in the desired flow direction.

Starting at the lowest spot in the analysis shed (its pour point), define a base elevation that is safely below all upstream features. From there, select an HE line to adjust, use that selected line to select HE point features that it intersects, and ensure that only those points associated with the desired HE line are selected.
Work upstream to calculate values for the entire set of selected points at once with a formula like this one (an edit session will not be required during this work):

uppermost elevation = top
bottommost elevation = bot
ET_Z = bot + ([ET_ORDER] * (top - bot) )

In practice, it is helpful to set up the formula in a spreadsheet with an easily readable version of the formula printed out in full, and then manually enter the top and bot values into the spreadsheet, while editing the ArcGIS Desktop dialog for Field Calculator to reflect the desired formula.
With a sufficient lack of concern for fidelity with terrain, it becomes possible to work upstream and select all HE lines arriving at a junction and calculate them all at once using the same formula. The payoff is that HE will work properly, and the HE_point_tuning effort will complete more quickly.
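The top/bot formula above can be sketched the same way in plain Python (et_z is a hypothetical helper; the 4.0 m and 12.0 m endpoints are illustrative values an analyst might enter in the spreadsheet):

```python
def et_z(bot, top, et_order):
    """Linear HE elevation between the junction elevation (bot) and the
    upstream-end elevation (top), using ET_ORDER in [0, 1] along the line."""
    return bot + et_order * (top - bot)

# A line whose downhill end should sit at 4.0 m and uphill end at 12.0 m:
for order in (0.0, 0.25, 1.0):
    print(order, et_z(4.0, 12.0, order))
```

Every point on the selected line gets a Z strictly between bot and top, so the whole reach slopes monotonically toward the junction.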

QA: Sort the values that have been written into HE_Z, and verify none are <Null>.

Make the output grid xxhe1_he1mb.img.
When complete, verify with the raster’s Properties > Source Tab that the output is single-precision.

Make certain that ArcMap has not expanded the _he1mb.img surface to a grid of double-precision (64-bit) float cells. If necessary, use ArcTools > Data Management Tools > Raster > Raster Dataset > Copy Raster to make a copy of the latest _he1m image that has a NoData Value of ‘-150’ and Pixel Type 32_BIT_FLOAT.
If it is necessary to create this copy, consider a name like xxx_he1mc.img.

In the new raster use ArcTools > Data Management Tools > Raster > Raster Properties > Build Pyramids and Statistics to support QA of this hydro-enforcement only grid.



Note: if using 60 cm points and the formula from Step 8, then jump to Step 10.

b. (not necessary with grid from 60cm vertex-derived HE point features) Use morphology functions to widen the HE pathways slightly in two steps.
First use Spatial Analyst Tools > Neighborhood > Focal Statistics with xxhe1_he1m as input and xxhe1a_he1m as output; Neighborhood: Rectangle, Height 1, Width 2, Units: Cell; Statistics type: MINIMUM; Ignore NoData checked, to dilate the HE paths.
Second, use the same Focal Statistics tool with xxhe1a_he1m as input and xxhe1b_he1m as output; Neighborhood: Rectangle, Height 2, Width 1, Units: Cell; Statistics: MINIMUM; Ignore NoData checked. This performs a westerly-cell dilation followed by a northerly-cell dilation, assuring 4-connected ways along the HE paths (not relying on an 8-connected diagonal jump during steepest-descent analysis).
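For intuition, the two-pass minimum dilation can be sketched in plain Python on a tiny grid, with None standing in for NoData (focal_min_pair is a hypothetical simplification of Focal Statistics that considers only the cell and one neighbor):

```python
NODATA = None  # stands in for raster NoData in this sketch

def focal_min_pair(grid, d_row, d_col):
    """One-cell minimum dilation: each cell becomes the minimum of itself
    and the neighbor at (row - d_row, col - d_col), ignoring NoData."""
    rows, cols = len(grid), len(grid[0])
    out = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            nr, nc = r - d_row, c - d_col
            if 0 <= nr < rows and 0 <= nc < cols:
                vals = [v for v in (grid[r][c], grid[nr][nc]) if v is not NODATA]
                out[r][c] = min(vals) if vals else NODATA
    return out

# a diagonal (8-connected) HE path, elevations sloping downhill to 3.0
he = [[5.0, None, None],
      [None, 4.0, None],
      [None, None, 3.0]]
step1 = focal_min_pair(he, 0, 1)    # westerly-cell dilation (values spread east)
step2 = focal_min_pair(step1, 1, 0)  # northerly-cell dilation (values spread south)
```

After both passes the diagonal path gains 4-connected steps (e.g. cells below 5.0 and below 4.0 are filled), so steepest-descent flow need not rely on a diagonal jump.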

c. Calculate the screening indicator to test whether all HE paths are below analysis_shed elevations. Use ArcToolbox > Spatial Analyst Tools > Map Algebra > Raster Calculator with a conditional statement like the one below, writing to an output grid xxhe1_lt1m:

Con ( ( “xxhe1b_he1m” < “analysis_shed” ) , 1, 0 )

This will create an indicator Less-Than image with value 1 where the HE path is below analysis_shed, and 0 if it rises above the surface. Most of this image will be non-enforced and set to NoData. If one is completely successful, the resulting image will appear with only ‘1’ values, and zero ‘0’ pixels. The pixel count can be viewed by right-clicking the output raster in ArcGIS Table of Contents > Open Attribute Table.
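The Con screening logic amounts to a cell-wise comparison, sketched here in plain Python with None for NoData (lt_indicator is a hypothetical stand-in for the Raster Calculator expression; the values are illustrative):

```python
NODATA = None  # stands in for raster NoData in this sketch

def lt_indicator(he, shed):
    """Cell-wise screening: 1 where the dilated HE path is below the terrain,
    0 where it rises above it, NoData where there is no HE path at all."""
    return [
        [None if h is NODATA else (1 if h < s else 0)
         for h, s in zip(he_row, shed_row)]
        for he_row, shed_row in zip(he, shed)
    ]

he   = [[9.2, None, 10.6]]   # HE path elevations (None = no enforcement)
shed = [[10.0, 10.2, 10.5]]  # analysis_shed terrain elevations
print(lt_indicator(he, shed))
```

Any '0' cell marks a spot where the HE path pokes above the terrain and needs further HE_point_tuning.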

QA: For screening purposes, when tracking the location of problem areas it can be useful to display the ‘0’ cells in a bright red color, to identify areas in need of adjustment with further HE_point_tuning. Ideally, there will not be any such cells in the analysis shed extent. If there are, return to HE_point_tuning above.



  1. Identify next work unit. Marin NHD work units are sub-watersheds intended to yield fewer than 40,000 stream segments for input to USGS NHD Conflation tools. For the production run of 2013.06, there are 25 work units countywide, aligned with watershed boundaries plus a 200-meter buffer. The work units can be selected from the Feature Class Basin25_boundary_20130613_pg, which contains bounding coordinates in US National Grid. It is suggested to create a working directory and open a new map document to handle the terrain clipping and QA of the grid that will become this work unit’s analysis_shed.

    This step is also the last opportunity to modify the grid resolution of the analyzed surface. With a 50cm source available countywide, the simplest approach is to resample at a 100cm gridding. Although 70.7cm and 141.4cm grids have been evaluated, resampling artifacts degrade the smoothness of ArcHydro flow lines derived from grids that are not integer multiples of the source. For urban use it is recommended to analyze a 100cm grid; for larger rural areas, consider a 150cm grid for flow line analysis.

    Create a new work directory based on the name of the analysis unit, using the name in the basin25 boundary as guidance.

    Create a File Geodatabase in the new work directory, and a new Feature Dataset within it to ensure projection and position alignment of the work product. Consider a descriptive name that includes the analysis unit, a work status, and a date to make the File GDB easier to identify as work progresses, sometimes over days, like “Novato_Ck_build_20130107”

    Open a new map document and save it in the same folder as this new File GDB. In ArcGIS Desktop, open File > Map Document Properties to define this work’s metadata. Provide Title that describes the analysis unit, Summary and Description as appropriate. Consider including the analyst’s name as Author.
    - point the Default Geodatabase to the File GDB just created
    - ensure that the Pathnames check box for relative path names has been checked.
    - click OK
    These steps can help path navigation when using ArcGIS Desktop’s Home button to get to the directory where the map document is saved, and finding the active File GDB there.

    Marin NHD data development takes place in WGS 1984 US National Grid (UTM zone 10 north meters) projection and NAVD 1988 (Geoid 2003 meters) vertical position. Consider a name for the Feature Dataset that indicates these such as “U84_N88m_data”.

    QA: verify that this working directory is in a well-known location with sufficient space to perform the many steps following. Marin NHD work has produced analysis directories using 8 GB to 20 GB of storage, per analysis shed.


  2. Clip and Smooth (or Clip a Smoothed) new analysis_shed

    Current practice is to clip using the extent from a single USGS WBD HUC-12 polygon, buffered by 150 meters. A pre-smoothed surface is clipped. For contouring, the surface is derived from the tbsm50cm grid mean filtered over a 7x7, 29-cell circular kernel, then resampled at 1m gridding. For ArcHydro analysis, the surface may be mean filtered over a 3x3 square. The mean filtering improves the signal-to-noise ratio of the surface, while the loss of high frequencies is offset by coarser resampling, which also helps to speed up and improve stability of the ArcHydro analysis.

    As a practical note, if one has a single countywide set of HUC12 polygons, it can be helpful to create a single-feature Feature Class for the analysis shed. Even if the countywide collection has been edited down to just one HUC12, export that to a new Feature Class so the full extent is set to the bounding box of that single HUC12, rather than the entire collection. This will provide the correct bounding rectangle for the next step.
    ArcTools > Data Management Tools > Raster > Raster Processing > Clip

    NOTE: If it is necessary to overlay two terrain sources to cover the desired analysis shed, then it is possible to complete all actions in one step using ArcTools > Spatial Analyst Tools > Map Algebra > Raster Calculator with an expression like

    Con(IsNull( newest_raster ) , older_background_raster, newest_raster )

    Then ensure in the Environment settings that the desired clip polygon is referenced for extent, perhaps round the bounding coordinates outward to nearest 25m, point to an adjacent tbsm1m analysis shed for Snap Raster, then in Raster Analysis reference that same tbsm1m raster to obtain Cell Size, and use the clipping polygon as Mask; in Raster Storage set pyramid scheme to Bilinear, Resampling method to Bilinear, and NoData to ‘-150’.
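Rounding the bounding coordinates outward to the nearest 25 m can be sketched in plain Python (round_extent_out is a hypothetical helper; the coordinates are illustrative US National Grid values):

```python
import math

def round_extent_out(xmin, ymin, xmax, ymax, step=25.0):
    """Round a bounding box outward to the nearest `step` meters, so the
    clip extent snaps to a clean 25 m lattice in US National Grid."""
    return (math.floor(xmin / step) * step,
            math.floor(ymin / step) * step,
            math.ceil(xmax / step) * step,
            math.ceil(ymax / step) * step)

ext = round_extent_out(537412.3, 4197260.9, 545881.0, 4203118.4)
print(ext)
```

Rounding outward (floor on minimums, ceiling on maximums) guarantees the original extent is fully contained.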

    QA: With pyramids and statistics built for the clipped terrain, it should be possible to display with symbology of Histogram Equalize in the Current Display Extent. If the single-step method was used, be sure to generate a slope image and review the boundaries of the newest_raster to older_background_raster transition.

    Using the ERDAS Imagine 2010 Classic Interface menu Main > Data Preparation > Subset Image... dialog, use a local copy of the best-available countywide surface, currently
    as input, direct the clipped output into the new work directory, and round the clip extent outwards to the nearest 25-meter value in US National Grid.

    The reduced resolution surfaces (1m, 150cm, 2m) are derived from a 5x5 low-pass filtering of the original 50cm surface of 30 November 2012.
    Consider a name for your clip that includes the work unit such as xxx_clip_tbsm1m.img

    (if you have only a noisy version of a LiDAR surface, do the following)
    Boost the signal-to-noise ratio of the 50cm terrain by smoothing it before resampling at 1-meter gridding. Use ERDAS menu Main > Image Interpreter > Spatial Enhancement... > Convolution... and select your xxx_clip_tbsm50cm.img clip as Input File.

    The default Kernel Library will contain a 5x5 Low Pass kernel; be very certain to check Normalize the Kernel to avoid scaling the terrain values. Consider an output name such as xxx_5x5low_tbsm50cm.img that describes the convolution performed. Click OK to run the convolution filter.

    Next, resample the smoothed terrain to 1-meter gridding using ERDAS menu Main > Data Preparation > Mosaic Images > Mosaic Pro; go to menu Edit > Add Images and navigate to the 50cm clip just created.

    a) In the Image Area Options tab, verify the radio button for Use Entire Image is selected

    b) Under Edit > Output Options... dialog, modify Output Cell Sizes to desired dimension, preferably 1.0 and 1.0 meters. Click OK.

    c) In the menu Process > Run Mosaic... dialog, Output Options tab, specify the desired background value. For the 2012.12 Marin NHD production run, background elevation is -150 meters, so set Ignore Input Values: to -150, set Output Background Value: to -150, and set Stats Ignore Value to -150.

    d) In the File tab, specify a name for your 1-meter version of the clipped terrain. Consider a name that includes the work unit, like xxx_clip_tbsm1m.img, then click OK to verify the File name, and click OK once again to actually run the mosaic. When exiting, it is not necessary to save the mosaic definition. Exit ERDAS Imagine.
    QA: With pyramids and statistics built for both, compare the original and smoothed DEM using Histogram Equalize symbology and statistics drawn from Current Viewing Extent. Note that scanning rilles (diagonal scratch marks) from LiDAR should be eliminated in the filtered surface.



  3. Trim the base DSM using the analysis_shed
    Using the pre-smoothed surface, it is possible to leverage the buffered HUC12 area to both clip and mask the extracted analysis area. Use ArcToolbox > Data Management Tools > Raster > Raster Processing > Clip with Input of the smoothed seamless surface, Output Extent of the single-feature clipping Feature Class, and be certain to check the Use Input Features for Clipping Geometry check box.

    Steer the output to the working directory created for this analysis extent; typically we have a subdirectory named “grids” to hold the various raster data. Be certain to include the “.img” extension if ERDAS Imagine format is desired; otherwise, with no extension, the clip will produce an ArcInfo GRID, which cannot have a name longer than 13 characters and cannot begin with a numeric digit. The GRID format might have the best ArcHydro performance. For Marin County, set the NoData Value to -150 (meters).

    Verify the clip has pyramids and statistics by viewing with Histogram Equalize stretch and Statistics from Current Display Extent. A Zoom to Layer should display the proper clipped extent, surrounded by blank NoData areas. If this is all working, jump to Step 5.

    In the new ArcGIS map document, add the work unit’s analysis clip polygon and the 1-meter terrain clip just created. In the Properties for the terrain clip, visit the Source tab and verify that the Uncompressed Size of the clip is not too large. Much faster ArcHydro analysis can be enjoyed with analysis terrains less than 500 MB. The terrain should be a grid of single-precision floating point cells (each four bytes).

    Build a raster mask from the work unit’s envelope polygon with ArcMap using ArcToolbox > Conversion Tools > To Raster > Polygon to Raster. For input, select the 200-meter buffered basin, Value field should be OBJECTID, consider output name that includes the analysis unit, such as xxx_shed_clip.img, Cell assignment to CELL_CENTER, no Priority.
    - Click the Environments... settings button,
    - Expand Processing Extent.
    - Choose Extent from the drop-down list to match your xxx_clip_tbsm1m.img terrain clip.
    - Choose that clip’s layer in the Snap Raster drop-down as well.
    - Expand Raster Analysis
    - In Cell Size drop-down list, choose your xxx_clip_tbsm1m.img terrain clip
    - Click OK to return to the main Polygon to Raster dialog.
    - Click OK again to run the process
    The output mask grid should have a value of 1 in your analysis shed, and appear blank with NoData elsewhere. The mask’s grid cells should align perfectly with each pixel in your 1m terrain clip.

    Set the areas of your 1m terrain outside of your analysis_shed to the background value using ArcToolbox Spatial Analyst Tools > Map Algebra > Raster Calculator with a formula like this:

    Con ( IsNull ( “xxx_shed_clip.img” ) , -150 , “xxx_clip_tbsm1m.img” )

    Consider an Output raster that includes the work unit name like tbsm1m_xxx_shed.img . Ensure that the output is in the work unit’s working directory that you recently created.

  4. Verify analysis_shed. Ensure the base tbsm1m has been clipped to the analysis shed, with a moderate buffer. Marin NHD has been using 200-meter buffering on an ArcHydro analysis-derived shed; analog-derived sheds might need additional buffering to completely cover all areas draining into the analysis extent. The base DSM should have its surrounding NoData areas filled with a distinct, but fairly close, value; Marin NHD has been using -150 meters for the background value at this time.

    Verify that the analysis shed has 1-meter gridding. If using a tbsm50cm source, then use ArcTools > Data Management Tools > Raster > Raster Processing > Resample to downsample---with larger analysis areas of multiple HUC-12 extents, this is necessary to keep the ArcHydro and ET GeoWizards tools from failing. Use Environments > Processing Extent > Snap Raster with the most recent adjacent tbsm1m grid to improve edge matching.

    QA: display the analysis_shed and verify that its data area has a watershed-like shape and does not completely fill the clipping rectangle. Verify that this grid is in the working directory for this analysis area.

  5. Create new horizontal HE line features (or better, clip from current countywide set)
    If HE features need to be created, see the workflow at the start of this document.
    WARNING: This can be the most resource-intensive step of the entire workflow, as HE feature development effort grows when meeting larger map accuracy scales.
    Several techniques to mitigate this effort are described in this document.

    If a collection of HE features already exists, just use the clip polygons to extract features for this analysis shed alone. In ArcToolbox > Analysis Tools > Extract > Clip, identify the countywide collection as Input Features and the 150m-buffered USGS WBD HUC12 as the Clip Features.

    5A - Create Initial Hydrologic Enforcement (HE) main channel

    Create a new Feature Class of type polyline-Z. It may not be necessary to have attributes besides a reach name, and even that may not get populated during edits.

    Ensure that features are drawn consistently---Marin NHD used uphill drawing as standard (when in Edit Vertices mode, red vertex should be at uphill end of the HE line) because most manual tracing of drainage is easiest starting low, where fewer main channels are found, and working higher above confluences and into more numerous flow branches.

    Use the best available LiDAR-derived surface directly, not a hillshade. Symbolize the surface using grayscale with black for low and white for high areas, as this creates the best contrast for the low areas on which HE development focuses. Use Histogram Equalization based on statistics of the Current Display Extent for maximized stretch. Explore and develop a feeling for how zooming and panning will affect the feature you will follow. Start with open channel then apply Rule I up through trees to gain a sense of the channel’s appearance under trees. Eventually, with this method of symbolizing the LiDAR-derived surface, creek channels will show up as if they were in an X-ray image. In many cases, switching between a stretched view of the elevation surface and a companion slope view can sharply delineate narrow channels under trees.

    5B - Create Tributary HE channels

    Following snapping Rule V, extend the main channels upward.

    5C - Trace Major Storm Drain System Components

    Following snapping Rule V, extend the main channels outward. If the reference CAD features you have been provided overlap, breathe deeply and reflect on Rule II.

    Create horizontal HE line features (Skip to Step 11 if no HE) Complete edits on horizontally controlled hydro-enforcement (HE) line features. A new Feature Class of type polyline-Z can be created. At first, it may not be necessary to have many attributes aside from a reach name.
    Ensure that features are drawn consistently---Marin NHD used uphill drawing as standard (when in Edit Vertices mode, the red vertex should be at the uphill end of the HE line).

    While vertex-snapping of tributaries arriving at a larger stream is desirable, no compelling reason has been found to split the main HE features at every confluence. In fact, when working HE_point_tuning below, it is much more convenient to have longer and fewer HE lines. Since HE_point_tuning is easiest with linear slopes, it is helpful to break HE lines at significant breaks in slope. For example, break HE lines when departing a tidal floodplain onto an alluvial fan, again when the HE line departs the fan and enters a V-shaped valley, and again for HE lines running up steep tributaries.

    This work flow (MNHD flow line workflow Step 5) has been expanded in a separate section at the start of this document
    Marin NHD hydrologic enforcement feature creation procedure

    QA: multi-step
    - Ensure that HE features have been created to represent all elements of the storm drain system, possibly including curb gutters.
    - Verify that all HE features have been drawn uphill, by reviewing any long gutters or ditches that may be built (and were traced) across a divide. If these features were subsequently split, then one part of the feature would be drawn downhill, and must be Flipped.
    - Use the highest-resolution new orthophoto to refine locations of features such as drainage grates, curbs, and stream passage under bridges.


  6. HE line vertex densification (three parts). In order to create an enforcement raster from the HE line features, the HE lines must be converted to a series of points at 1m or finer spacing---the 3D lines cannot themselves be converted to raster short of Terrain Dataset work, which we avoid here.

    IMPORTANT REVISION 2014.07.06; this saves an astounding amount of effort and removes need for ET GeoWizards:

    - Clip HE set to within (not touching edge) of analysis shed
    - with plain (not line-Z) HE features, use 3D Analyst > Functional Surface > Add Surface Information, saving Z_min and Z_max
    - create two new attributes HE_min and HE_max, both Double type; populate HE_min = (Z_min - 0.5) and HE_max = (Z_min - 0.2)
    - use 3D Analyst > 3D Features > Feature to 3D by Attribute and use HE_min for Height Field --- and HE_max for To Height Field (very important)
    - once the proper HE slope and elevation have been assigned, densify vertices with Editing Tools > Densify to e.g. 125cm for 173cm gridding
    - once densified, convert to point-Z features with Data Management Tools > Features > Feature Vertices to Points
    - to verify successful sloping of the HE points, use 3D Analyst > 3D Features > Add Z Information
    - for performance, dissolve to produce a multipoint-Z Feature Class with Data Management Tools > Generalization > Dissolve on ORIG_FID to yield a separate multipoint-Z for each HE line; be very certain to have checked "create multipart features"
    - the resulting multipoints should draw quickly for graphic QA, and be acceptable input for Conversion Tools > To Raster > Feature to Raster
    - use that tool to create an HE raster that snaps to grid and shares cell size with the input tbsm173cm, and uses the 10m buffer polygon as a mask
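The densify-and-slope chain in the list above (Densify, Feature To 3D By Attribute, Feature Vertices To Points) can be sketched in plain Python, assuming lines are drawn uphill so HE_min belongs to the first vertex (densify_with_z is a hypothetical helper; the line and elevations are illustrative):

```python
import math

def densify_with_z(line_xy, z_from, z_to, max_seg=1.25):
    """Densify a 2-D HE line to at most max_seg vertex spacing and assign Z
    linearly by distance, z_from at the first (downhill) vertex to z_to at
    the last (uphill) vertex. Assumes the line has nonzero length."""
    total = sum(math.dist(a, b) for a, b in zip(line_xy, line_xy[1:]))
    pts, run = [], 0.0
    for (x0, y0), (x1, y1) in zip(line_xy, line_xy[1:]):
        length = math.dist((x0, y0), (x1, y1))
        n = max(1, math.ceil(length / max_seg))  # pieces for this segment
        for i in range(n):
            t = i / n
            d = run + t * length                 # distance along the line
            z = z_from + (d / total) * (z_to - z_from)
            pts.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0), z))
        run += length
    pts.append((*line_xy[-1], z_to))             # close with the uphill end
    return pts

# a 10 m line densified to 1.25 m spacing, sloping from 4.5 m up to 4.8 m
pts = densify_with_z([(0.0, 0.0), (10.0, 0.0)], z_from=4.5, z_to=4.8)
```

Every emitted point carries a Z that rises monotonically in the drawn direction, which is all the enforcement raster requires.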

    To be safe, clip the HE features using the analysis shed clip polygon.

    6a) First, use ET GeoWizards > Polyline > Densify; select an output location that is at File GDB root level, not in a Feature Dataset; give it a name like xxx_HE_dens60cm_li; use curve simplification by distance (set to 0); and use a maximum segment length of 0.6 METER to help ensure at least one vertex in key cells at sharp turns. For analysis of a 50cm-gridded surface, use a maximum segment length of 0.35 METER. Set the deviation value to 0.

    QA: ET GeoWizards should announce successful densification of vertices.

    If the analysis shed has been clipped with NoData cells surrounding the area, then it will be necessary to first clip the HE features to be fully within the analysis_shed. This will ensure that Interpolate Shape does not eliminate those features that extend (even a trivial amount) into NoData regions. For this, use Analysis Tools > Extract > Clip on the xxx_HE_dens60cm_li features, to output a ..._dens60cm_li Feature Class.

    6b) Next, ensure that the HE lines are given a reasonable initial (vertical) position using ArcGIS ArcTools > 3D Analyst Tools > Functional Surface > Interpolate Shape on the analysis shed to drape the lines on the analysis_shed. The result can be saved within the Feature Dataset by adding a ‘z’ to the end of the li feature data source name, xxx_HE_dens60cm_liz. For vertex spacing less than the sampling grid (as here), check Interpolate Vertices Only to evaluate every vertex.
    For complex HE features and sub-meter vertex spacing, this takes quite a few seconds.

    6c) Finally, to support more automated means of setting Z-values after conversion to points, run ArcGIS ArcTools > 3D Analyst Tools > Functional Surface > Add Surface Information to the vertex-densified HE line features. Use Input Surface of the tbsm1m analysis_shed, and be certain to include Z_MIN, Z_MAX, Z_MEAN at least. This step can be fairly fast, because it works on HE line segments rather than the vertices themselves. Large differences between MAX_SLOPE and AVG_SLOPE might be useful to identify HE features that have not been split at a break in slope.

    It may be desirable to use ArcGIS Desktop > Data Management Tools > Features > Adjust 3D Z to globally set the draped features -0.5 meters (below) the analysis_shed surface. (Not necessary if using the quick formula method described below)


  7. HE point-Z generation

    The ET GeoWizards > Convert > Polyline to Point (or Polyline Z(M) to Point for ETGW 10.0) tool assigns useful attributes to the resulting point features. Choose a destination at root level in the File GDB. It is important to check the Vertices radio button and the box “Calculate point position along polylines” to calculate the ET_ORDER attribute. The resulting features will not be 3D points, but will include Z attributes and, crucially, an ET_ORDER attribute that linearly interpolates in the range [0--1] along every HE line feature in the direction its vertices were drawn.

    If the conversion to points is performed directly on the polyline-Z features from the previous step, then in the interest of completing the task quickly, be certain NOT to include the Z(M) properties of each individual point, unless you want each point to carry its unique Z value for other reasons. Significant speed improvements can also be had by running the point conversion on the densified polyline features as they existed before the Interpolate Shape function, then joining on the ET_ID attribute to bring over Z_Min; be certain that only the basic attributes are included before starting the conversion. Fewer attributes will also speed up the conversion to points.

    Consider naming the output point features with the densification distance such as shedName_HE_v60cm_pt for 60cm-densified vertices converted to points.

    With complex HE features and sub-meter vertex spacing, be patient. Although ET GeoWizards is typically very fast, this process can take several minutes (or even a very long time) and it won’t be too informative about its progress at first. With fairly intensive HE features, up to about one hour processing per HUC-12 extent could be anticipated.

    QA: ET GeoWizards should announce successful conversion of vertices to points.
    Also, verify in the attribute table that ET_ORDER values range from 0 to 1 and are nowhere constant for a given input feature (sharing the same ET_IDP values).
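This QA check can be sketched in plain Python, treating the converted points as (parent line id, ET_ORDER) pairs (qa_et_order is a hypothetical helper; real values would come from the attribute table):

```python
def qa_et_order(points):
    """QA sketch: points is a list of (parent_line_id, et_order) pairs from
    the polyline-to-point conversion. Flag any parent line whose ET_ORDER
    values do not span the full 0..1 range (e.g. a constant value)."""
    by_line = {}
    for line_id, order in points:
        by_line.setdefault(line_id, []).append(order)
    return sorted(line_id for line_id, orders in by_line.items()
                  if min(orders) != 0.0 or max(orders) != 1.0)

good = [(1, 0.0), (1, 0.5), (1, 1.0)]   # ET_ORDER spans 0..1 as expected
bad  = [(2, 0.4), (2, 0.4)]             # constant ET_ORDER -- a problem line
print(qa_et_order(good + bad))
```

Any line id returned marks points that would receive a flat (non-sloping) HE_Z from the tuning formulas.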

  8. HE point-Z elevation tuning

    This step is where our flowline-centric hydrologic analysis will diverge from the needs of hydraulic modeling. For us, no benefit is gained by modeling the accurate slope of every pipe and gutter. Instead, enforcement features can be set unrealistically below ground surface, as long as they slope (and thus guide flow) in the intended downhill direction.

    The most efficient approach will use the statistics for each line feature and a simple formula to enforce Z values that are everywhere below the terrain surface (to capture and hold flow lines) and have a nonzero slope that flows in the intended direction. It should be possible to have very few HE features that require the hand-tuning of slope described below. One way to sever all ties to the actual surface gradient, yet very quickly set up functional HE features, is a formula that produces HE features always remaining at least 20cm below the HE feature’s terrain Z_Min value.

  9. Convert HE points to raster. To focus attention efficiently on HE reaches in need of tuning, generate a putative HE surface and then analyze an indicator of whether that surface is below the analysis_shed.

    a. Make certain all selections have been cleared and no edit session is running. Then, Run ArcGIS Desktop’s ArcToolbox > Conversion Tools > To Raster > Point to Raster using the current HE_point set. If features are densified to less than 1-meter spacing, it should be fine to use Point to Raster with HE_Z as value field and MINIMUM as Cell assignment type.

    Configure the Environment Settings:
    - Processing Extent > Extent should match your analysis_shed
    - Processing Extent > Snap Raster should also match analysis_shed
    - Raster Analysis > Cell Size should match analysis_shed, and for good measure
    - Raster Storage > Pyramid resampling technique is best if set to BILINEAR
    - Raster Storage > NoData (set to -150)

    Whichever of these methods is used, expect it to create a 1-pixel-wide expression of the HE paths. A versioned name of this output (up in the file system, not down in the geodatabase; place output in a folder at the same level as the geodatabase) for site “xx” and 1-meter analysis would be “xxhe1_he1m.img”.
    For larger areas, this might take roughly 45 minutes per HUC-12.
    Note that some background cells may be NoData and others ‘0’ (the raster may have a Low Value of negative infinity). If this is the case, it must be corrected. To do so, launch Raster Calculator with Spatial Analyst Tools > Map Algebra > Raster Calculator.

    SetNull ( ( “xxhe1_he1m.img” == 0 ) , “xxhe1_he1m.img” )

    Clip the streamlines to the _shed1m_paek5m_simp30cm_pg polygon derived from the dissolved 100k catchment polygons. Use ArcTools > Analysis Tools > Extract > Clip with input of xxhe1_str2k_li. Consider an output feature class name like xxhe1_str2k_cat_clip_li.


    Before opening an edit session again on the File GDB, use Open Attribute Table on the xxhe1_str1k_li flow line features. Ensure that these features are selectable, then run Selection > Select By Location to select features from xxhe1_str1k_li that, with the mvhe1_cat_m1m_pg analysis_shed as the Source layer, satisfy the spatial selection “intersect the source layer feature,” with “Apply a search distance” unchecked. This should ensure that flow lines are within, and not touching the edge of, the catchment. Simply using an “Apply search distance” of -1 meter can take several times longer, and doing this during an edit session will extend the times even more.

    - With the complex selection completed on xxhe1_str1k_li, open an edit session on the File GDB, SWITCH THE SELECTION (so as not to delete the analysis results), and delete the extraneous drainage line features.


    QA: Take a moment to display the flow lines being retained over the clipping “_shed1m_paek5m_simp30cm_pg” polygon. Verify that no extraneous flow lines remain outside of the analysis_shed polygon, and that no flow lines paralleling the edges have been broken within the clipping rectangle.


  10. Generate gridded Hydro-Enforced Surface for ArcHydro analysis With a clean set of HE paths expressed as an elevation grid from the previous step, simply replace the elevation values in analysis_shed with the HE values wherever they exist. Do this with Spatial Analyst Tools > Map Algebra > Raster Calculator, writing to an output grid xxhe1_hes1m.

    Con(IsNull("xxhe1_he1mb"), "analysis_shed", "xxhe1_he1mb")

    This uses analysis_shed everywhere except along HE paths, where it uses the dilated HE paths created during the previous step. The output is the full hydro-enforced surface (HES) used to drive ArcHydro steepest-descent analysis.
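The overlay logic of this Con(IsNull(...)) expression can be sketched in plain Python (hypothetical values; None stands in for NoData):

```python
# Plain-Python sketch of Con(IsNull(he), shed, he): wherever the HE raster
# has a value it wins; elsewhere the terrain surface passes through.
NODATA = None

def con_isnull(he, shed):
    return [[s if h is NODATA else h
             for h, s in zip(hrow, srow)]
            for hrow, srow in zip(he, shed)]

he = [[NODATA, 9.1, NODATA],
      [NODATA, 8.9, NODATA]]
shed = [[10.2, 10.0, 10.3],
        [10.1, 9.9,  10.2]]
hes = con_isnull(he, shed)
# hes keeps terrain everywhere except along the HE path:
# [[10.2, 9.1, 10.3], [10.1, 8.9, 10.2]]
```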

    Grids produced by Raster Calculator may benefit from manual generation of pyramids and statistics. Do this using Data Management Tools > Raster > Raster Properties > Build Pyramids and Statistics on the just-created HES. It may be necessary to navigate to the grid, as sometimes drag-and-drop does not work into this dialog. Check boxes for Build Pyramids and Calculate Statistics, and un-check Skip Existing to ensure that these are built fresh. In Pyramid Options, use Pyramid resampling technique BILINEAR.

    QA: With pyramids and statistics built, examine the candidate surface in grayscale, with Histogram Equalize symbology and statistics based on Current Display Extent. Verify that darker (deeper) channels have been cut into the surface for enforced features. (n.b. Some value of negative infinity may make the full extent appear black; zoom in on some channels to see better.)

  11. Run ArcHydro (1 of 8) Fill Sinks With an appropriately new version of ArcHydro installed from the ArcGIS Hydrology Resources page, and the ArcHydro Tools toolbar turned on and parked, start using the new HES. If this is the first time your ArcGIS Desktop install has run ArcHydro, it is worth an initial visit to ArcHydro Tools > ApUtilities > Set Target Locations > HydroConfig to set Map Level paths to your preferred locations for Raster Data and Vector Data. Given the limits of ArcHydro’s ability to work with large grids, it might be beneficial to allow it use of native Arc GRID format, and simply set a directory path for raster data. The Vector Data path is best set to a local (not network) File GDB.

    (This would be a good time to save the map, close ArcMap and relaunch it.)

    With paths thus set, use ArcHydro Tools > Terrain Preprocessing > DEM Manipulation > Fill Sinks. If the application directory paths are set properly, your HES should be available on the DEM pick list. No Deranged Polygons are typically used here. Keeping in mind that ArcGIS GRID output names should not begin with a number nor exceed 13 characters, provide a desired Hydro DEM name for the filled version of your HES. With the example input xxhe1_hes1m for HES, the ArcHydro filled hydro-DEM could be simply he1_fil; choose the Fill All radio button.

    This step could take about 15 minutes per HUC-12 extent.

    QA: ArcHydro should announce successful filling of sinks in the hydro-enforced surface.
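For intuition about what sink filling does, here is a minimal priority-flood sketch in plain Python. This is an illustration of the concept only, not ArcHydro's actual implementation: every interior cell is raised to at least the lowest spill elevation reachable from the grid edge, so no local minimum survives to trap steepest-descent flow.

```python
import heapq

# Conceptual priority-flood sketch (illustration only, not ArcHydro's
# algorithm): flood inward from the grid boundary, always expanding from the
# lowest frontier cell, raising each visited cell to at least that level.
def fill_sinks(dem):
    rows, cols = len(dem), len(dem[0])
    filled = [row[:] for row in dem]
    seen = [[False] * cols for _ in range(rows)]
    heap = []
    for r in range(rows):            # seed the queue with the grid boundary
        for c in range(cols):
            if r in (0, rows - 1) or c in (0, cols - 1):
                seen[r][c] = True
                heapq.heappush(heap, (filled[r][c], r, c))
    while heap:                      # expand from the lowest frontier cell
        z, r, c = heapq.heappop(heap)
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols and not seen[rr][cc]:
                    seen[rr][cc] = True
                    filled[rr][cc] = max(filled[rr][cc], z)
                    heapq.heappush(heap, (filled[rr][cc], rr, cc))
    return filled

pit = [[5, 5, 5],
       [2, 1, 5],
       [5, 5, 5]]
# the center pit (1) is raised only to its spill level through the edge cell (2)
```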

  12. ArcHydro (2 of 8) Flow Direction This step uses the (local minima-filled) hydro-DEM input to calculate, for all cells, the steepest-descent flow direction in an 8-connected sense. Use ArcHydro Tools > Terrain Preprocessing > Flow Direction and select an output like he1_fdir.

    This flow_direction grid will be used several times in subsequent analysis steps.

    This step could take about one minute per HUC-12 extent.

    QA: ArcHydro should announce successful generation of the Flow Direction grid and display an 8-value grid (1, 2, 4, 8, 16, 32, 64, 128).
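The eight values in that QA check are the ESRI D8 direction codes (1=E, 2=SE, 4=S, 8=SW, 16=W, 32=NW, 64=N, 128=NE). A plain-Python sketch of how a steepest-descent code is chosen for one cell (illustrative only, with a made-up 3x3 DEM):

```python
# Illustrative sketch (not ArcHydro's implementation) of choosing a D8 code
# for one cell: flow goes to the neighbor with the steepest drop, dividing by
# sqrt(2) for diagonal neighbors. ESRI codes: 1=E, 2=SE, 4=S, 8=SW, 16=W,
# 32=NW, 64=N, 128=NE.
D8 = {(0, 1): 1, (1, 1): 2, (1, 0): 4, (1, -1): 8,
      (0, -1): 16, (-1, -1): 32, (-1, 0): 64, (-1, 1): 128}

def d8_direction(dem, r, c):
    """Return the D8 code of the steepest-descent neighbor of cell (r, c),
    or 0 for a flat cell or pit (which Fill Sinks should have removed)."""
    best, code = 0.0, 0
    for (dr, dc), bit in D8.items():
        rr, cc = r + dr, c + dc
        if 0 <= rr < len(dem) and 0 <= cc < len(dem[0]):
            drop = (dem[r][c] - dem[rr][cc]) / (1.4142135 if dr and dc else 1.0)
            if drop > best:
                best, code = drop, bit
    return code

dem = [[9, 8, 7],
       [8, 6, 5],
       [7, 5, 3]]
# the center cell (value 6) drains SE toward the value-3 corner: code 2
```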

  13. ArcHydro (3 of 8) Flow Accumulation This evaluates the flow_direction grid to sum up the catchment areas for each cell in the analysis_shed extent. This analysis takes several times (5X) longer than flow_direction, but is the foundation for much of the final line work. Use ArcHydro Tools > Terrain Preprocessing > Flow Accumulation and select an output like he1_facc. (Note: if you find ArcMap ‘Not Responding’ - just let it run)

    This flow_accumulation grid will be used several times in subsequent analysis.

    For QA purposes, especially when only a few cycles of HE_point_tuning refinement have been completed, it may be worthwhile to set the display of the flow_accumulation grid to make the main pathways visible, and to overlay the HE line features with transparency. If there are discrepancies that should be corrected, this is an opportunity to return to the HE_point_tuning task, focusing only on the reaches where flow has not been adequately captured by the HE pathways. If the flow_accumulation grid is too difficult to symbolize clearly, consider the same QA step using the stream_definition grids generated in the following step.

    This step can take up to 45 minutes per HUC-12 extent.

    QA: ArcHydro should announce successful generation of the Flow Accumulation grid.
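Conceptually, flow accumulation counts the upstream cells draining through each cell; with 1-meter cells that count equals contributing area in square meters. A small plain-Python sketch over a hypothetical D8 grid (ArcHydro does this far more efficiently; this only illustrates the definition):

```python
# Plain-Python sketch of flow accumulation: walk downstream from every cell,
# crediting each receiving cell on the way. D8 codes as in the previous step.
OFFSET = {1: (0, 1), 2: (1, 1), 4: (1, 0), 8: (1, -1),
          16: (0, -1), 32: (-1, -1), 64: (-1, 0), 128: (-1, 1)}

def flow_accumulation(fdir):
    rows, cols = len(fdir), len(fdir[0])
    acc = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            rr, cc, steps = r, c, 0
            # step cap guards against a malformed grid containing a cycle
            while fdir[rr][cc] in OFFSET and steps < rows * cols:
                dr, dc = OFFSET[fdir[rr][cc]]
                rr, cc = rr + dr, cc + dc
                if not (0 <= rr < rows and 0 <= cc < cols):
                    break
                acc[rr][cc] += 1
                steps += 1
    return acc

fdir = [[2, 4, 8],
        [1, 2, 4],
        [1, 1, 0]]   # 0 marks the outlet cell
acc = flow_accumulation(fdir)
# all eight other cells drain through the outlet: acc[2][2] == 8
```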

  14. Flow Accumulation Grid Inflow Patching (for downstream analysis_shed areas)
    (Not always performed - if there are no through-going channels proceed to the next step)
    (Also, an alternative approach accumulates catchment area later, in the vector features.)

    In some circumstances, the analysis shed will be downstream of one or more previously analyzed drainages. In these cases, it is necessary to augment the flow accumulation on any through-going channels. Presumably these major channels will already have HE features defining them, and so this step creates a raster to supplement the flow accumulation grid by the inflowing flow accumulation from any through-going flow lines. In Marin County, most onshore drainage areas will not fit this case, but offshore artificial paths may need to be augmented in order to account for through-going flow. It is expected that only one or very few flow lines, if any, would be through-going in most cases.

    In a new map document (not the current working one), add these layers:
    - the current terrain_clip_pg, analysis shed tbsm_1m raster
    - the flow accumulation raster (e.g. “pet3_facc”) from the upstream analysis shed
    - the current flow accumulation raster from the previous step
    - the current HE line features

    Keep the new map document open, but close the current working map document to avoid conflicts with feature class access.

    Explore at the individual pixel level where the outflow from the upstream shed intersects the current analysis clip. Identify the flow accumulation value where the path crosses into the current analysis shed, and record this value for use later in this step. It is likely a big number, so be certain to get every digit [e.g., the Petaluma outflow was 189,271,760 square meters]. For the current analysis shed, consider this value to be named “inflow”
    (e.g. inflow = 189271760 )

    Following a simplified version of the approach used to create the HE contribution to the hydro-enforced surface, create a new feature class from just the through-going HE features; these are lines that must have an entire analysis shed or more arriving into the current analysis shed---the output of one whole drainage analysis arrives at an upper edge of the current analysis shed. Select the one or very few HE lines that convey this flow through the current analysis shed. The downstream end of this line will likely be the single outflow from the current shed.

    In an Edit session, merge the features in the new through-going HE feature class into a single feature, multi-part if need be (this is not likely).

    Use ET GeoWizards > Polyline > Densify as before to create vertices every 60cm along the length of the through-going line.

    Then use ET GeoWizards > Convert > Polyline to Point to convert these vertices into a point feature class along the through-going path. If properly merged into a single feature, there should be an attribute named “ET_ID” that contains the value 1 for all points.

    Convert the points to a raster that should match, pixel for pixel, the current analysis shed’s through-going HE path. Use ArcToolbox > Conversion Tools > To Raster > Point to Raster with Value field set to ET_ID or other field uniformly filled with “1” values.
    Call the output “xx_thru_ind.img”. Also, use these Environment Settings:

    - Processing Extent > Extent should match your current analysis_shed
    - Processing Extent > Snap Raster should also match current analysis_shed
    - Raster Analysis > Cell Size should match current analysis_shed
    - Raster Analysis > Mask set to the appropriate clipping polygon

    If necessary, clean up no data values around the stream area. To do this run Spatial Analyst Tools > Map Algebra > Raster Calculator to an output grid. Call the output xx_thru_ind2.img.

    SetNull(("xx_thru_ind.img" == 0),"xx_thru_ind.img")

    Note that this approach must not add extraneous width or corner pixels in the main flow accumulation path. After identifying the putative path in raster form, consider cleaning up by multiplying with an indicator such as (xx_facc > 2000), so that only those path pixels that are along the larger flow accumulation paths will remain. The value of 2000 may seem small, but in many cases the largest through-going streams will appear with rather small flow accumulation values at their upstream end, and this value works fairly well.
    Neglecting this step will produce fuzz-like burrs of short flow lines around the main stream that do not accurately reflect flow into the main channel, and also create extraneous large-extent catchment polygons along the length of the main, through-going stream.
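The suggested cleanup, multiplying the through-path indicator by a flow-accumulation test, can be sketched in plain Python (grids are hypothetical and None stands in for NoData; the 2000 threshold is the value quoted above):

```python
# Plain-Python sketch of the cleanup: a through-path pixel survives only
# where flow accumulation exceeds the threshold, so corner and sidewall
# pixels that would spawn burr flow lines are dropped.
def mask_thru_path(thru_ind, facc, threshold=2000):
    return [[t if (t is not None and f > threshold) else None
             for t, f in zip(trow, frow)]
            for trow, frow in zip(thru_ind, facc)]

thru = [[1,    1,    None],
        [None, 1,    None]]
facc = [[5000, 150,  9000],
        [100,  3000, 50]]
# only the two path pixels on high-accumulation cells remain
```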

    To augment the current flow accumulation with the through-going flow accumulation values, run Spatial Analyst Tools > Map Algebra > Raster Calculator to an output grid in ArcGIS GRID format in the Layers folder, called xx_faccp to imply “flow accumulation plus”.

    Con(IsNull("xx_thru_ind2.img"), "xx_facc", (inflow + "xx_facc"))

    Double-check to be certain that the Map Algebra expression has inflow accumulation added to the flow accumulation raster from the previous step.
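The patching expression's behavior can be sketched in plain Python (hypothetical small grids; None stands in for NoData; `inflow` is the value recorded earlier in this step):

```python
# Plain-Python sketch of Con(IsNull(ind), facc, inflow + facc): cells on the
# through-going path get the upstream shed's accumulation added; all other
# cells pass through unchanged.
def patch_accumulation(thru_ind, facc, inflow):
    return [[f if t is None else f + inflow
             for t, f in zip(trow, frow)]
            for trow, frow in zip(thru_ind, facc)]

thru = [[None, 1],
        [None, None]]
facc = [[10, 240],
        [5,  30]]
patched = patch_accumulation(thru, facc, 189271760)  # the recorded inflow
# patched == [[10, 189272000], [5, 30]]
```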

    If this step has been followed, then make certain to always use “xx_faccp” rather than “xx_facc” in subsequent steps. It might be prudent to rename the unmodified “xx_facc” to “xx_xfacx” to avoid any confusion.


  15. ArcHydro (4 of 8) Stream Definition This creates an indicator grid for lines of cells that describe flow up to a designated catchment threshold. When using a 1-meter HES grid, there is an identity relation between integer cell count (of cells draining through each cell in the analysis_shed), which the Flow Accumulation contains, and catchment area in square meters---they are the same value. Stream Definition is the point in the analysis where the ultimate density of analyzed ephemeral flow lines is set. Attributes can be set to dynamically reduce the density of ephemerals in the resulting flow line features (and only display larger streams), but no more detail can be displayed than is set here at the Stream Definition stage.

    For practical purposes, it has been beneficial to generate both coarse (large catchment threshold) streams to get larger catchment polygons for clipping, and also fine (small catchment threshold) streams to generate flow line geometry. Both will use the same flow_accumulation grid and be co-registered, but the small catchment streams will be vastly more numerous.

    For Marin NHD, the coarse size is typically 100,000 square meters (0.1 sq km), to produce 10 catchment polygons per square km. The fine size is 2000 square meters (0.002 sq km) to define stream lines through ephemerals with nominal half-acre catchments.

    For generation of HUC-16 catchments, it may also be desirable to generate a third stream definition, with 1,000,000 square meters (1 sq. km / 247 acres) area to simplify the aggregation steps.
    NOTE: this 1 sq. km catchment grid also provides the most rapid way to produce a precise 1m-grid analysis_shed polygon; this polygon can be important for clipping of the 100k and 2k streams for flow network generation.

    For this, create the coarse definition for catchment use with ArcHydro Tools > Terrain Preprocessing > Stream Definition. Use the he1_facc flow accumulation grid from the previous step, and define the Number of Cells as 100000 (no comma) so that the catchment Area is 0.1 sq km. For analyses with 50cm cells, use 400,000 to obtain 0.1 sq km, and consider an output name like he1_str100k for a 100,000 square-meter stream definition. Stream definition grids are fast to create.

    Then create the fine definition by launching Stream Definition again and setting Number of Cells to 2000. For analyses with 50cm cells, use 8000 to obtain 0.002 sq km. For analyses in heavily forested areas with more sparse LiDAR ground point spacing and 1m gridding, consider setting Number of Cells to 4000 to run 1-acre catchments. Consider an output name like he1_str2k for a 2000 sq m definition.
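The relationship between the area threshold and the Number of Cells setting is simple division by the cell area; this small helper (illustrative plain Python) reproduces the values quoted above:

```python
# Convert a catchment-area threshold (square meters) to the Stream Definition
# "Number of Cells" value for a given cell size (meters).
def cells_for_area(area_sq_m, cell_size_m):
    return round(area_sq_m / (cell_size_m ** 2))

coarse_1m = cells_for_area(100000, 1.0)   # 100000 (coarse, 1 m cells)
coarse_50 = cells_for_area(100000, 0.5)   # 400000 (coarse, 50 cm cells)
fine_50 = cells_for_area(2000, 0.5)       # 8000   (fine, 50 cm cells)
```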

    For the next three steps, both (or all three) of these grids will be run through the analysis flow.

    (If iterating and a 1m-grid definition of the actual analysis_shed exists, its polygon can now be used to clip the stream definition grid)

    QA: ArcHydro should announce successful completions of Stream Definition.

  16. ArcHydro (5 of 8) Stream Segmentation In this step, the two stream indicator grids from the previous step are analyzed into reaches among sources, confluences, and sinks. The result is an integer grid that contains a unique identifier for each reach in the analyzed area.

    First run ArcHydro Tools > Terrain Preprocessing > Stream Segmentation using the flow_direction grid created earlier, the 100k stream grid he1_str100k and create an output Stream Link Grid. Consider an output name like he1_slk100k . The coarse grid will complete quite rapidly, and by expanding its symbology in Table of Contents one can quickly see how few stream segments result with this catchment.

    Next run Stream Segmentation using the same flow_direction grid, the 2k stream grid he1_str2k and consider an output name like he1_slk2k . Verify that the Flow Direction grid is the current _fdir grid, otherwise a separate segment could be generated for each grid cell. This should also complete rapidly, and may show about 10X the number of stream segments as the coarse grid.
    For a larger 4xHUC-12, the 1M gave 150 segments, the 100k gave 1500 segments, and the 2k gave 65,000 segments.

    (If iterating and a 1m-grid definition of the actual analysis_shed exists, its polygon can now be used to clip the stream link grid)

    QA: ArcHydro should announce successful completions of Stream Segmentation.

  17. ArcHydro (6 of 8) Catchment Grid Delineation Here, the area extent of the catchment for each stream segment is defined, as an integer grid that should have as many catchments as there are stream segments---at the chosen catchment definition.

    First run ArcHydro Tools > Terrain Preprocessing > Catchment Grid Delineation using the flow_direction as before, and the 100k stream link grid he1_slk100k, and consider an output name like he1_cat100k for the grid. The output should be grayscale patches of irregular shape with values between 1 and the number of 100k stream segments.

    Next run Catchment Grid Delineation as before, but use stream link grid he1_slk2k and consider an output name like he1_cat2k for the grid.

    If running three grid sizes, the 1 sq. km. grid may take 5X as long as the 100k catchment delineation to complete.

    (If iterating and a 1m-grid definition of the actual analysis_shed exists, its polygon can now be used to clip the catchment grid)

    QA: ArcHydro should announce successful completions of Catchment Grid Delineation.

  18. ArcHydro (7 of 8) Catchment Polygon Processing This is the raster-to-vector analysis that produces catchment polygon features for each of the stream segments that will soon be generated. It’s also the last step where we’ll try to run both coarse and fine stream resolutions.

    First run ArcHydro Tools > Terrain Preprocessing > Catchment Polygon Processing using the he1_cat100k grid definition. Here, for the first step in this process, the output will be directed to the local File Geodatabase that should have been set during ArcHydro step 1 above. Just provide an output name that does not start with a number; the 13-character limit does not apply, although be cautious with special characters. Consider a name that includes an abbreviation of the analysis_shed, so for area “xx”, one might use xxhe1_cat100k_pg to include the hydro-enforcement version as well. In some cases, HE features near the edge of the analysis_shed will modify the outer catchments.

    NOTE: If the 1 sq. km catchment aggregation extends to the edge of the analysis_shed in any upstream extent, then the divide was not included in the analysis_shed and appropriate adjustments should be made to the analysis extent polygon. Typically it will be possible to return to Step 10 and extend the _hes1m.img grid with terrain from the countywide grid.

    Next run Catchment Polygon Processing using the more intensive he1_cat2k grid definition.

    Consider an output name like xxhe1_cat2k_pg and expect the process to take a bit longer than with the less numerous 100k catchments.

    QA: ArcHydro should announce successful completions of Catchment Polygon Processing.
    Verify that there are no upstream areas (not touched by an inflow stream) where the 1 sq. km catchment polygons extend to the edge of the analysis_shed. If they do, carefully identify how far to extend the analysis, augment the hydro-enforced surface, and return to Step 10 to continue the workflow.

  19. ArcHydro (8 of 8) Drainage Line Processing This is where the fine grid is used to generate the base flow line features that will continue in the work flow. Only the fine grid is analyzed, since it should contain the same geometry that would be created if the coarse grid were analyzed, and also more detailed features.

    From ArcHydro Tools > Terrain Preprocessing > Drainage Line Processing use the 2k stream link grid he1_slk2k and the flow_direction grid he1_fdir that have been used in previous steps.

    For the output feature class, consider a name that indicates the analysis_shed being analyzed, on the model of xxhe1_str2k_li for stream flow lines starting below 2k square meters of catchment.

    For convenience when aggregating WBD HUC-16 areas, it is helpful to continue with 100k flow line generation, so that a reasonably-sized set of default pour points can be generated.

    QA: ArcHydro should announce a successful completion of Drainage Line Processing.

  20. Flow Line Analysis_Shed clipping Since a buffer was applied to the analysis drainage area, there will likely be many extraneous flow lines around the outside of the rectangular analysis grid. With them, your newly-generated flow lines may have an injection-molded look, and here we will cut away the artifacts.

    (If the previous three grids have been clipped by a pre-existing 1m-grid catchment polygon, then this step is not necessary)

    - Open an ArcGIS Desktop edit session where you can modify the coarse catchment polygon Feature Class. In the target File Geodatabase, you may have a Feature Dataset named Layers that contains the catchment and drainage line features created in earlier steps. When starting the edit session, do not select for editing either of the ArcInfo Workspaces that may have been automatically created by ArcHydro. It might be helpful to right-click the 100k catchment polygons in Table of Contents and use Selection > Make This The Only Selectable Layer so that you can leave the drainage lines visible. Examine the outer boundary of the analysis shed, select any 100k catchment polygons that are draining outward, and delete them. Unless the analyzed drainages are along a shore or coastline, there should only be one outlet (pour point) where the flow line network exits the analysis shed. Close the edit session.

    - Open Attribute Table on the xxhe1_cat100k_pg features. Add a field and give it a name like “diss” (for dissolve); use the Field Calculator to set its value to 1 for all features. (Alternately, since the HydroID and GridID attributes should be identical once the edit session is closed, the Field Calculator could instead set all HydroID values to the same constant, such as ‘1’, and HydroID could serve as the dissolve field.) With a constant-value attribute available, from ArcToolbox select Data Management Tools > Generalization > Dissolve using the modified xxhe1_cat100k_pg as input, and dissolve on “diss”. Consider an output target in the same Feature Dataset named Layers that contains the other catchment polygons, and call it xxhe1_cat_pg, since it is a single polygon for the entire analysis_shed catchment, calculated using HE features version 1. Uncheck the option Create multipart features, and click OK to dissolve.

    Alternately, make a copy of the 1 sq. km catchment polygons [call it something like xxhe1_shed1m_pg ], open an edit session, select all polygons in this Feature Class and then use Editor > Merge. Save edits, close edit session.

    To trim more carefully, use ArcTools > Cartography Tools > Generalization > Smooth Polygon with the PAEK algorithm and a smoothing tolerance of 5m on the _cat100_clip_pg, to an output with a name like _shed1m_paek5m_pg, and use this for clipping. In the rare case of a tributary running parallel to the catchment boundary within 1 analysis grid cell distance, this can avoid certain problems with clipping. This circumstance can occur when fire roads have been cut along ridge lines that divide the edge of the analysis shed.
    For better clipping efficiency on large areas, consider simplifying the smoothed polygon with ArcTools > Cartography Tools > Generalization > Simplify Polygon using a Maximum Allowable Offset of 30cm / 0.3m, to greatly reduce the number of vertices for the shape.

    As a simpler alternative, use Analysis Tools > Proximity > Buffer with input of xxhe1_cat_pg and set the buffer distance to “-1” meters. Consider an output name like xxhe1_cat_m1m_pg to suggest “minus 1 meter”, and put it in the same Feature Dataset named Layers that contains the catchment and drainage line features created in earlier steps.
    Alternately, use Analysis Tools > Proximity > Buffer with input of xxhe1_cat_pg and set buffer distance to “-0.5” meters. Consider an output name like xxhe1_cat_m50cm_pg to suggest “minus half meter”.

  21. Catchment Extent Trimming to within actual analyzed drainage shed only
    Depending on the distance buffered to produce the analysis_shed grid, there will likely be an annulus of small 2k or 4k catchments around the outside of the analysis area, and once the detailed flow lines within the actual catchment have been identified, annulus catchments not covering a flow line within the actual catchment should also be removed.

    (If the previous three grids have been clipped by a pre-existing 1m-grid catchment polygon, then this step is not necessary)

    Using the trimmed “xxx_str2k_li”, use ArcMap > Selection > Select By Location on the “xxx_cat2k_pg” features, with “xxx_str2k_li” as Source, selecting features that are within a distance of the source layer feature of about 2 grid cell sides. Visually confirm that the proper extent has been approximated, then Open Attribute Table on the “xxx_cat2k_pg” features and Switch the Selection. Typically, the 2k catchments are too complicated for this method to work in one step. Use the Selection tool, set Selection > Interactive Selection Method > Remove From Current Selection, and ensure that no headwater catchments remain selected within the actual drainage shed. Visually, it might be fastest and easiest to perform the selection with the _cat2k_pg features rendered in solid green with no outline; displaying the _str2k_li overlay in dark blue makes it easier to see wayward selections within the actual drainage area. When only the annular catchments not participating in the actual drainage remain selected, use the _cat2k_pg context menu > Edit Features > Start Editing, click the blue-haloed “X” in the opened attribute table to delete the selected features, then Save Edits and Stop Editing.

    If needed, repeat this process for the 4k catchment polygons as well.

  22. Flow Line De-burring (for areas with well-developed channel cross-sections)
    Apparently, the 1m and finer gridded surfaces, when analyzed with ArcHydro beta for ArcGIS 10.1, are producing numerous short (1- or 2-cell long) segments for centerlines within well-defined banks. Most of this effect can be avoided by carefully removing excess through-flow pixels as described in the Note for Step 14.

    Also, the method of using larger analysis areas spanning multiple HUC-12 extents, which minimizes any need for flow accumulation patching, will eliminate the need for this step.

    - Generate end points for the various FCoded segments of the HE features using
    Data Management Tools > Features > Feature Vertices To Points with Point Type DANGLE. Select line segments that cover the dangle points, then select, from the currently selected features, lines with length < 8m (or a similar filter); open an edit session and delete the selected burrs, then Save Edits and Stop Editing.

  23. ArcHydro Adjoint Catchment Processing This step identifies all upstream catchments for every generated catchment and is necessary to complete the flow line network definition. Because the count is so much smaller, this step is much faster to run on the 100k catchments than on the 2k catchments. It can take many hundreds of Passes to complete the analysis for smaller catchment areas---note the progress text in the lower-left corner of the ArcMap window. It is invoked with ArcHydro Tools > Terrain Preprocessing > Adjoint Catchment Processing. Input your clipped 100k or 2k flow lines and their associated catchment polygons, then specify the name of an output polygon feature class that will contain the adjoints. This feature class will have a great many overlapping polygons, and it may be difficult to visualize.

    Be very patient with this step as it runs quite slowly. Although for a 4xHUC-12 extent the 100k adjoint catchments may take 1 minute, the step on 2k catchments might take 40x longer.

  24. ArcHydro Drainage Point Processing This identifies pour points for each stream segment. The step can complete quickly, but will not succeed if there are many burrs created by the patch of through-going flow accumulation. It is invoked with ArcHydro Tools > Terrain Preprocessing > Drainage Point Processing. In the dialog box, carefully select the latest flow accumulation grid, choosing the through-going patched accumulation if you have created it. For a given stream definition (e.g., 100k or 2k), choose its associated catchment grid (whose cells hold the ID of the catchment), the catchment polygons (not the adjoints), and the desired output drainage point Feature Class name.

    If substantial edits have taken place for outer-area catchments not in the analysis_shed proper, it may be helpful to use a grid-derived polygon of the analysis shed as clipping on the _cat2k catchment grid. When this is trimmed, the grid will more properly align with the remaining _cat2k catchment polygons to produce an accurate set of drainage points.

  25. ArcHydro Catchment Cleanup Experience in San Geronimo-Lagunitas HUC-12 has shown that minor inflow burr features along the main creek channel lead to splits in the main flow line at every junction and a collection of many more catchments along the main channel than are needed for an accurate network of 100k square meter catchment streams.

    Ironically, it may be reasonable to expect that much less interactive editing will be needed for the far more complex 2k flow lines---only a patch to set NextDownID = -1 at the outflow into now-removed catchments generated in the buffered area. The extra effort merging the 100k catchments will pay off when aggregating new WBD subdivisions.

    It appears to be necessary to clean up these minor-catchment artifacts to have an accurate set of 100k catchment polygons---and their associated drainage points---to assemble HUC-14 or HUC-16 catchment areas. Inconveniently, every drainage point removed, and every catchment and flow line segment merged requires an adjustment to the associated network attributes.

    The Drainage Point’s DrainID at the outflow of a merged catchment must be accurately reflected in the merged flow line segment’s DrainID field. Likewise, the HydroID of the flow line segment that the drainage point pours into must be accurately reflected in the NextDownID of the upstream flow line segment. Also, the HydroID of the downstream catchment must be accurately reflected by the NextDownID of the upstream catchment.

    As these features are merged, it appears to be necessary to work upstream from the pour point of the analysis shed. There, no further downstream flow line or catchment exists (even if one was generated but was later removed by catchment trimming.) This pour point of the entire analysis shed has NextDownID = -1, to indicate flow is leaving the shed. For inland sheds or any with only convergent flow there should only be one pour point, with one flow line and one catchment feature each having a “-1” value as NextDownID. For coastal or bay fronting sheds without integrated topo-bathy modeling there may be a small number of separate pour points, but only one per watershed.

    One way to make this merging work and still be able to build the network is to edit together an association of drain points, flow lines, and catchment areas from a particular analysis run, such as the 100k stream definition. Starting from the pour point, the next upstream junction to be preserved is identified, along with the two drain points associated with the upstream branches from that junction. All of the flow line segments are graphically selected and merged, preserving the attributes of the downstream-most merged segment. Then, that HydroID is placed in the NextDownID attribute of the flow line segments of the two upstream branches that intersect the drain points associated with that junction. Similarly (and just afterward), all of the catchment areas traversed by the newly-merged flow line segment are selected and merged into the downstream-most catchment. Then its HydroID is copied into the NextDownID of both upstream branches above the junction being preserved.

    For the editor, this post-merge experience will likely involve viewing the junction at large enough scale to see the two drain points, and graphically selecting three flow line segments and three catchment polygons. The upstream features can be identified as having matching (though incorrect) NextDownID, and the downstream feature’s HydroID can be selected and copied fairly easily. Save your edits...
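Before running network generation, the hand-edited attributes can be screened with a simple check like this plain-Python sketch (field names HydroID and NextDownID as used above; the feature list is hypothetical): every NextDownID must be -1 or match another feature's HydroID, or Hydro Network Generation will report a mismatch.

```python
# Plain-Python sketch of a pre-flight consistency check on the hand-edited
# network attributes: report every feature whose NextDownID is neither -1
# (the shed's pour point) nor the HydroID of some feature in the set.
def find_mismatches(features):
    """features: list of dicts carrying 'HydroID' and 'NextDownID'."""
    ids = {f["HydroID"] for f in features}
    return [f["HydroID"] for f in features
            if f["NextDownID"] != -1 and f["NextDownID"] not in ids]

net = [{"HydroID": 1, "NextDownID": -1},   # pour point of the shed
       {"HydroID": 2, "NextDownID": 1},
       {"HydroID": 3, "NextDownID": 1}]
# a clean network yields no mismatches; a dangling NextDownID is reported
```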

  26. ArcHydro Drainage Network Building This builds a network based on coherent sets of NextDownID attributes, and can be built for flow lines as well as catchment features. The network is built by ArcHydro within the Feature Dataset that contains the flow lines; it apparently derives from Utility Network Analyst functions already in ArcGIS Desktop and does not appear to require the Network Analyst extension.

    If the preceding step has been completed flawlessly, then the function at ArcHydro Tools > Network Tools > Hydro Network Generation will just work, although typically the tool will run and at some point complain about “Mismatch in NextDownID”. Don’t fret, for if the errors are sparse it is fairly straightforward to get them resolved.

    In the Hydro Network Generation dialog, choose the drainage line features (say 100k) that were cleaned up in the previous step for Drainage Line. Then choose the catchment polygons that were built for those flow lines (for example, also 100k) as the input for Catchment. Choose the drainage point features generated and cleaned up in the previous steps for Drainage Point. Outputs will be generated in the same Feature Dataset used by other ArcHydro tools, so provide distinct names that include the stream definition area for the outputs: Hydro Edge like xxx_edg100k_li (networked flow lines), Hydro Junction like xxx_jct100k_pt (network node points), and Network Name like xxx_hyd100k_net (network feature). The network feature is a set of relationship tables and will show up in the Feature Dataset with a distinct network icon.

    One approach that has worked is to retain 100% of the drainage points, and work exclusively with flow lines and catchments derived from a precisely clipped raster grid. The clipping rectangle is derived from a dissolve of 1 sq. km catchments, and then applied to the stream grid, the stream link grid, and the catchment grid for the smaller shed areas---prior to generating catchment polygons, flow lines, and drainage points. It can take a long time to build, like about two hours per aggregated HUC-12 extent, but it can build successfully.

    Sometimes it may be necessary to iterate toward a successful network build, cleaning up mismatched NextDownID flow line and catchment features along the way; quite a few versions of this network may get built. Be aware that while the Network feature exists in the Feature Dataset, the Feature Classes that participate in it (junctions and edges) will remain locked and not delete-able until after the network has been removed from the Feature Dataset.

    Another option is to leverage the topology that is built into a successful network, and run Hydro Network Generation on the full extent 100k catchment, 100k flow lines, and 100k drain points, then edit them down to the actual analysis extent. Network elements selected and deleted during an edit session will be kept in sync by the network topology. For example, splitting a flow line just below the outflow drain point will immediately generate a new network junction.

    When the network build fails, open the junctions Feature Class attributes and select the last one built, then zoom to it. Following a procedure similar to the cleanup step above, select the three flow line segments and three catchment polygons around the junction, and examine their attributes to ensure that the HydroID of the downstream feature is accurately reflected in the NextDownID of both upstream branches---for both the flow line and catchment triads.

  27. Drainage Network rationalization After a successful network build, wherever there are transfers of flow between adjacent analysis_shed areas it will likely be necessary to manually fix the boundary for the transfer. Since this will never be at a divide, a choice must be made as to exactly where the upstream shed will have its sink, and the downstream shed have its corresponding source in their respective flow networks. In some cases, the WBD criteria can be helpful, requiring that the split take place across rather than along a flow line, and suggesting that flow line confluences, even if inundated by reservoir or subtidal, are also candidate locations.

    There appear to be four adjustments necessary. First, the analysis_sheds as defined with (1m) gridding should be adjusted so as to not overlap or have large gaps. If necessary, make reference to smaller catchment areas such as those from _cat3k_pg or _cat100k_pg to obtain grid-edge geometry to merge. In some cases it may be beneficial to Union adjacent analysis_shed areas (with Gaps Allowed) and save only the remainder to resolve complex overlap.

    Second, the flow lines _str3k_li from flow-exchanging sheds should all fit within their corresponding analysis_shed polygon. During editing in ArcGIS 10.2, the snapping functions appear to work much more effectively with these flow lines than with their corresponding network edges, so it may be helpful to keep the flow lines visible with subsequent edits to network edges and network junctions.

    Third, the _edg3k_li network edges should correspond with any edits made to the flow lines, both by cutting or snapping an endpoint at the transition and to remove any extraneous flow lines after any final adjustments to the analysis_shed polygons. Snapping performance may be profoundly improved by using flow lines as a snapping feature rather than any network edges or junctions.

    Fourth, the _jct3k_pt network junctions at the transition should be snapped to the endpoints shared by the flow line and network edges at the transfer. Be careful not to delete any inlet or outlet junction that will be moved to the end of a shortened edge. The AncillaryRole attribute should be set to Sink for the downstream outlet of an upland analysis_shed, and to Source for the inlet at the upstream junction of the downstream analysis_shed. The FType text field for a Source can be set to “Drainage Inlet” and for a Sink to “Drainage Outlet”. Likely one of the junctions can be left at a generated location and its JUNCTION_PLACEMENT_DESC set to ARCHYDRO, while any junction snapped to the chosen transfer location can have JUNCTION_PLACEMENT_DESC set to MANUAL.


  28. Flow Line Conflation with NHD High Resolution and County Streams Features
    This step involves fourteen sub-steps to transcribe the NHD High identifiers and feature classifications from conflated hydro-enforcement line features over to our new model-generated NHD Local flowline features. Certain steps appear to require ArcInfo licensing to run topology-related functions.

    Preliminary: Do yourself a favor and create a new map document just for the conflation steps here; Open a blank and save it into the same working directory as you’ve been using, so that the Home button will take you to the active File Geodatabases.

    28a) Generate end points for the various FCoded or ReachCoded segments of the HE features by first selecting those where “( "ReachCode" IS NOT NULL ) OR ( "FCode" IS NOT NULL)” then export selected features to a new feature class. Run ET GeoWizards > Convert > Polyline to Point with the new HE subset as input, output at the File Geodatabase root level above the Feature Dataset, conversion option Nodes, and Remove Duplicate Points checked; save output as xxx_HE_conf_pt

    28b) generate end points from the str2k flow line features with ET GeoWizards > Convert > Polyline to Point with xxx_str2k_li as input, output at the File Geodatabase root level above the Feature Dataset, conversion option Nodes, and Remove Duplicate Points checked; save output as xxx_str2k_end_pt.
    Alternately, if you have had a successful network build, simply use the existing _jct2k_pt junction points for this purpose.

    28c) In a few cases, HE features might be drawn farther upstream than the str2k flow lines. To identify these, use ArcMap Selection > Select by Location and identify HE endpoints within a distance 3m of the str2k flow lines themselves (this seems faster than checking against edges). This should select all but just a few uppermost end points.
    On those selected HE end points, again use Selection > Select by Location with Selection method “remove from the currently selected features in” to remove those HE end points less than ~3m from the flow line end points (this should leave selected only the pseudo-node endpoints, where FCode changes within a reach, as distinct from existing network junction points); save output as xxx_HE_onlyconf_pnode_pt

    28d) use ET GeoWizards > Point > Perpendiculars to Polylines to create slice lines from the (pseudo-node) points just identified to the str2k flow lines with a tolerance of ~3m; save output as xxx_HE_pnode_slice_li.

    28e) use ET GeoWizards > Convert > Polyline to Point with these slice lines to create split points where one of each end will be on the str2k flow lines, use the Nodes conversion option, save output as xxx_HE_pnode_slice0_pt if desired. Next, delete those slice points that are not covered by the pnode points (select NOT identical to, or perhaps within a distance of 0.005m of, the _onlyconf_pnode_pt points from step 28c).
    Inspect the selection to ensure that the shortest slice lines (less than 0.002m length) have at least one end point selected, so that exactly half of the end points are selected. Save output as xxx_HE_pnode_slice_pt in the Feature Dataset that contains the network edges.

    28f) use ArcTools > Data Management Tools > Features > Split Line at Point with the slice points and a search tolerance of 0.1 meter to split _edg2k_li network edges derived from the str2k flow line features; save output as xxx_edg2k_pnsplit_li

    28g) use ET GeoWizards > Convert > Polyline to Point on the _pnsplit_li to create midpoints for each of the str2k segments with unique FCodes. Use input from the previous step, select Middle Points conversion option, create output xxx_str2k_pnode_split_mid_pt

    28h) Use the context menu Joins and Relates > Join… to spatially join these midpoints to the NHD-conflated HE line features (the subset exported in Step 28a) to obtain all relevant attributes, giving the joined points all attributes of the closest line. This should establish a set of features with an NHD Flowline-compliant schema.
    Select only those conflation-joined midpoints with short Distance to the joined feature, perhaps < 1m, and export selected points to xxx_str2k_conflate_mid_pt.
    Table join these conflate_mid_pt features to the full set of split_mid_pt features using the ET_ID attribute that they share, with Keep All Records.
    Turn off redundant fields such as [ET_X, ET_Y, ET_IDP, ET_M, secondary OBJECTID, secondary ET_ID, secondary HydroID and DrainID]. Save with HUC-themed name like b101_hydEdg2k_conf2_mid_pt

    28i) To include a vegetation flag in this midpoints Feature Class, add a short int field called “veg_gt_60” with a coded value YesNo domain; calc default values for all veg_gt_60 to No.
    Add three new Double fields, named “veg_100_Min”, “veg_100_Max”, “veg_100_Mean”, that will be populated in Step 29.

    Use the NDVI_gt_60_pg to select midpoints that intersect, and set those “veg_gt_60” to Yes. This approximates the status of the reach as being vegetated or not, to allow subsequent filtering based on Shape_Length as a preliminary estimate of vegetated status for ephemeral reach protected status.

    28J) Turn off redundant fields in the xxx_str2k_conf_join_mid_pt Feature Class, saving HydroID, DrainID, ReachCode, FCode, FType_1, GNIS_ID, Name, GNIS_Name, GNIS_IS, ReachCode_14, Veg_GT_60, veg_100_Min, veg_100_Max, veg_100_Mean attributes; export to new Feature Class with only these attributes. Connect to xxx_str2k_pnode_split_li Feature Class with a spatial join including all attributes. Export the joined Feature Class to _str2k_conf0_li

    28k) Copy the Feature Datasets “Hydrography” and “WBD” from the download geodatabase into the working File Geodatabase. This will bring over a number of coded value domains.
    Create a template for USGS NHD Local schema, using a fresh download of nearby areas, exporting a single NHD flowline feature, then deleting values in most of the fields, setting others to useful defaults.
    To avoid vertical reference conflicts, Import the template flow lines into the working Feature Dataset, then ensure that its FCode field displays as a description and not as an integer. If not, use ArcTools > Data Management Tools > Domains > Assign Domain to Field to set FCode under “Marin_NHD_FCode_labels” domain. The Resolution attribute can be set = 1, for Local resolution. Save this as NHDH_template_flow_li Feature Class.

    28L) It is typically desirable to tune the attributes of this template before joining.
    (Urbanized-area note, including San Geronimo Creek:) where there may be str2k flow lines more than 2.5 meters from an HE feature, default the FCode to storm ditch (33603) if within a Marin Adjusted Urban Area polygon, and to ephemeral stream/river (46007) if outside the urban area. Choose the most likely default value and set that in the template before joining.
    It may be desirable to set the FDate to now().
    When the template is set to desired default values, extend the schema of the networked flow lines xxx_edg2k_pnsplit_li from Step 28f by table joining the NHDH_template_flow_li feature on the common field “Enabled”, which is everywhere True.

    28m) After joining, turn off fields not necessary in the final flow lines such as [the joined OBJECTID, the second Enabled, any ET_ID, both Shape_Length], then export a copy of all features to a name that includes some HUC identifier, such as b104_edg3k_pnsplit_conf2_li

    28n) Table join the conflated _mid_pt features to these _psplit_li features on HydroID.
    Field calc in values to GNIS_ID from joined GNIS_ID_NHDH; calc in values to GNIS_Name from joined GNIS_Name_NHDH; calc in values to ReachCode from joined ReachCode_14.

    Next, create a selection WHERE FCode_NHDH IS NOT NULL, and within this selection calc in values to FCode from joined FCode_NHDH; also calc in values to FType from joined (FCode_NHDH / 100). Remove All Joins from _psplit_li.
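    The (FCode_NHDH / 100) calculation above relies on the NHD convention that the FType is the leading three digits of the five-digit FCode. A hypothetical one-line sketch, outside the Field Calculator:

```python
# Sketch: NHD FType is the FCode with its last two digits (the
# attribute modifier) dropped; integer division by 100 does this.

def ftype_from_fcode(fcode):
    return fcode // 100
```

    For example, ephemeral stream/river 46007 and intermittent 46003 both map to FType 460 (Stream/River).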

    Assign the coded-value domain to the FCode attribute in line with other NHD schema fields, calc in the ReachCode and FCode values that were conflated after pseudo-node splitting. Next, clean up redundant fields such as duplicated HydroID (after verifying they match), ReachCode, Shape_Length, and inactive ObjectID fields. Be certain to preserve Veg_GT_60, arcid, from_node, to_node, HydroID, GridID, NextDownID, DrainID, and the NHD Local schema attributes. The fastest way to remove multiple fields might be to view the Feature Class properties, Fields tab, and un-check the unwanted fields, then export to yet another Feature Class.

    The output of this sequence of steps should be a version of the _str2k_li modeled flow lines, with correct ArcHydro flow network attributes, conflated USGS NHD High resolution ReachCode and FCode attributes, and enough additional fields to contain the full NHD Local resolution flowline schema.

  29. Assign Vegetation Index values to features For this computation, temporarily turn off all attributes of the _str2k_li Feature Class except OBJECTID and HydroID, then export with a name like _str2k_veg0_li in the same Feature Dataset. Locate and add to Table of Contents the vegetation index raster, where Normalized Difference Vegetation Index (NDVI) floating-point values have been scaled to a range [0--100] for cartographic and other uses. One such image for Marin County is called U84_veg100_20130711f.img and is derived from a 1-meter NAIP 4-band uncompressed image of 2007.

    Use ArcTools > 3D Analyst Tools > Functional Surface > Interpolate Shape to drape the _veg0_li features over this vegetation intensity surface. Bilinear sampling Method is the only option, and because these edges have not been vertex-densified, do not check Interpolate Vertices Only. Save output with a name like _veg1_liz, and expect this step to be rapid for a single HUC-12 extent.

    Next use ArcTools > 3D Analyst Tools > Functional Surface > Add Surface Information on the _veg1_liz features, with the same “veg_100” raster as the surface.
    Save with Output Property including [Z_MIN, Z_MAX, Z_MEAN] to retain options when identifying vegetation state along ephemeral reaches. The information will be added as attributes to your _veg1_li features.

    In the _str2k_li features output from Step 28 (where three Double attributes were added in Step 28i), table join the _veg1_li features using OBJECTID. Next, calc values from the joined Z_Min into veg_100_Min; calc values from joined Z_Max into veg_100_Max; calc values from joined Z_Mean into veg_100_Mean. Remove all joins from the _str2k_li features.

    The vegetation index can be used to inform later flagging of reaches as protected based on 30-meter runs of riparian vegetation. At this point in the workflow it may be best to simply carry these values forward and defer any protected-reach designation to a later step.

    Drag these _str2k_li features as a Layer into the map document used before Step 28 and continue the workflow in that map document. Save and close the conflation map document.

    Along main streams, some very short flow lines might be generated, and in some cases the geometry is close enough to the main creek that the conflation process incorrectly assigns perennial flow to the very short tributary. As a cleanup, for flow lines generated from 1m gridding, select from the _str2k_li features those with lengths specific to burrs, such as
    Shape_Length IN ( 1.5, 2.123 )
    then open an edit session and delete them.

  30. Flow Line Refinement (1 of 3) Flowline Split This step splits the flow lines into nominal 10-meter segments to allow for detailed transcription of the catchment area into each of the split segments. This information is used to help automate the flow regime classification, and much of the complexity of these splits can be dissolved on stream segment and flow class out after an acceptable flow regime has been defined.

    Using ET GeoWizards Polyline > Split Polyline provide the Step 28 _str2k_li features as input, and consider an output name at the File GDB root level of xxhe1_str2k_10m_li to suggest the 10-meter split. In the Split Polyline Wizard dialog, click Next. Select split method radio button for Segments length, enter 10 in the box next to METERS, and use Equal length radio button for Distribution of the length. Click Next and Finish.

    QA: Depending on the mean length of the flow line segments prior to splitting, expect more than 5X the number of segments to result after splitting. Verify this by comparing segment count between the input flow lines and these split ones.

  31. Flow Line Refinement (2 of 3) Catchment Attribution This step spatially extracts key statistics from the flow_accumulation grid he1_facc generated in ArcHydro step 3 into the split flowline segments. Close examination of the flow line features should reveal that their vertices are cell-centered, so no dilation of the flow_accumulation grid should be necessary.

    If, however, the vertices are not cell-centered, it might be necessary to dilate the flow accumulation grid to ensure that all vertices of flow line features receive accurate accumulation values. If so, use Spatial Analyst Tools > Neighborhood > Focal Statistics with he1_facc as input, MAXIMUM as the statistics type, and consider a name like he1_facc3x3 for the output. Be sure to check the Ignore NoData box to support the dilation of flow accumulation line widths.

    Alternately, ERDAS Imagine can be used from Main menu > Image Interpreter > Spatial Enhancement... > Focal Analysis using a Focal Definition kernel of Size 3x3, and unchecking the NEly, SEly, SWly, and NWly corners to leave a ‘+’-shaped convolution kernel. Use the Max function to dilate maximum flow accumulation values along each path.

    Using 3D Analyst Tools > Functional Surface > Interpolate Shape, drape the split flow lines over the flow accumulation surface, with output into the Feature Dataset and a name like _10m_liz.
    Consider the option of Interpolate Vertices Only to keep all segments as accurate as possible when transcribing flow accumulation.
    Follow with 3D Analyst Tools > Functional Surface > Add Surface Information on the Input Feature Class of xxhe1_str2k_10m_liz prepared above, carefully select the current he1_facc flow_accumulation grid as the Input Surface. Check several Output Property boxes to help allow for more flexibility when automating flow regime classification. At various times, Marin NHD has found uses for Z_MIN (the upstream catchment limit), Z_MAX (downstream catchment), and Z_MEAN.

    (updated 2013 05 07) For surface information Method, rather than LINEAR (which will attenuate some flow accumulation values) choose CONFLATE_ZMAX to save the largest value---which is most important for this attribution. (10.1 Only)

    Use default sampling distance, and NO_FILTER for Noise Filtering (since that only affects slope calculations). No output Feature Class is required, since the surface information attributes will be appended to the input Feature Class.

    It is worth noting that at confluences with much larger streams, the bottom-most flow accumulation cell will have a much larger catchment (flow accumulation area) value, and as such it is typically best to perform flow regime classification on the upstream catchment area (Z_MIN values) of each 10-meter flow line segment.

    QA: At this point, verify that none of the newly-assigned Z_Min values are null. More detailed review will follow in a later step.

  32. Flow Line Refinement (3 of 3) Catchment Classifications For display purposes when working with the detailed flow lines, it is sometimes desirable to reduce density with a definition query---only using flow lines with catchments greater than a certain amount. Also, the display of catchment-attributed flow lines can be graphically enhanced with use of wider lines for larger catchment-draining reaches. Again, it is valuable to have a terse integer attribute to summarize the precise floating-point Z_MIN values. The graphic display works best if this integer field has been indexed in the File GDB.

    So, outside of an edit session, Open Attribute Table of xxhe1_str2k_10m_li and Add Field for:
    - a double-precision field named Acres, and
    - a Long Integer field named acre_class.

    Into the Acres field, calculate:

    Acres = [Z_Min] / 4046.86

    The Z_min choice will correctly set catchment for minor tributaries draining into much larger streams. Otherwise, the downstream end of the trib will show the flow accumulation at its outlet, so that neither Z_Max nor Z_Mean would be an accurate value.

    For the larger flow lines, there might be issues with both Z_Min and Z_Mean where a diagonal step has been sampled, and the Z_Min can have barely half of the true flow accumulation value. So after setting Acres from Z_Min, which is by far the most common correct value, seek out the main channels with a selection where

    SELECT WHERE (Z_Max / Z_Min < 2.2) AND ( Z_Max - Z_Min > 10000 )

    And for these segments, calc in the value Acres = [Z_Max] / 4046.86
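    The Acres attribution above can be sketched as a single function; this is a hypothetical restatement, not part of the ArcMap workflow. Z_Min and Z_Max are the draped flow accumulation values in square meters, and 4046.86 is square meters per acre.

```python
# Sketch: Acres from draped flow accumulation. Default to the upstream
# (Z_Min) catchment, but on main channels (ratio < 2.2 and difference
# > 10000 sq m) use Z_Max, since a sampled diagonal step can leave
# Z_Min with barely half the true flow accumulation.

SQ_M_PER_ACRE = 4046.86

def acres_for_segment(z_min, z_max):
    if z_min > 0 and z_max / z_min < 2.2 and z_max - z_min > 10000:
        return z_max / SQ_M_PER_ACRE
    return z_min / SQ_M_PER_ACRE
```
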

    Once the floating-point Acres have been populated, it is time to calculate the integer index that will support more rapid symbology. With no selection set, into all acre_class, first calculate in ‘0’.

    For the following set of values, open Select by Attributes; for the first selection only, choose Create a new selection, and keep the Select by Attributes dialog open (use Apply). For every selection after the first, use Select from current selection. After each new selection, use the Field Calculator to set the acre_class value on the selection set.

    SELECT WHERE          calc acre_class
    “Acres” >= 1 1
    “Acres” >= 2 2
    “Acres” >= 3 3
    “Acres” >= 4 4
    “Acres” >= 5 5
    “Acres” >= 7.5 7
    “Acres” >= 10 10
    “Acres” >= 15 15
    “Acres” >= 20 20
    “Acres” >= 25 25
    “Acres” >= 30 30
    “Acres” >= 40 40
    “Acres” >= 50 50
    “Acres” >= 75 75
    “Acres” >= 100 100
    “Acres” >= 150 150
    “Acres” >= 200 200
    “Acres” >= 300 300
    “Acres” >= 400 400
    “Acres” >= 500 500
    “Acres” >= 750 750
    “Acres” >= 1000 1000
    “Acres” >= 2000 2000
    “Acres” >= 3000 3000
    “Acres” >= 4000 4000
    “Acres” >= 5000 5000
    “Acres” >= 7500 7500
    “Acres” >= 10000 10000
    “Acres” >= 20000 20000
    “Acres” >= 30000 30000
    “Acres” >= 40000 40000
    “Acres” >= 50000 50000
    “Acres” >= 75000 75000
    “Acres” >= 100000 100000
    “Acres” >= 150000 150000
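    Because each pass narrows the prior selection and recalculates, the cascade above is equivalent to taking the largest threshold that the Acres value meets. A hypothetical sketch of that equivalence, using the same threshold table:

```python
# Sketch: acre_class assignment as a single pass. Each (threshold,
# class) pair mirrors one "Select from current selection" + calc step.

ACRE_CLASS_THRESHOLDS = [
    (1, 1), (2, 2), (3, 3), (4, 4), (5, 5), (7.5, 7), (10, 10),
    (15, 15), (20, 20), (25, 25), (30, 30), (40, 40), (50, 50),
    (75, 75), (100, 100), (150, 150), (200, 200), (300, 300),
    (400, 400), (500, 500), (750, 750), (1000, 1000), (2000, 2000),
    (3000, 3000), (4000, 4000), (5000, 5000), (7500, 7500),
    (10000, 10000), (20000, 20000), (30000, 30000), (40000, 40000),
    (50000, 50000), (75000, 75000), (100000, 100000), (150000, 150000),
]

def acre_class(acres):
    """Integer display class for a floating-point Acres value;
    0 for catchments under 1 acre."""
    cls = 0
    for threshold, value in ACRE_CLASS_THRESHOLDS:
        if acres >= threshold:
            cls = value
    return cls
```
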

    QA: verify that none of the newly-assigned Acres or acre_class values are null.
    With color and feature width symbology assigned to sketch flow regime, it becomes easier to verify that reasonable values have been assigned. When the symbology has been set to a useful appearance, it is worthwhile saving that as a Layer file that can be applied to other versions of the flow lines under review.

    In the accompanying screenshot (not reproduced here), perennial-like catchments are blue, intermittent-like are green, ephemeral-like are brown, and those less than 1 acre are gray. Increasing values of acre_class are used to vary the width of the features across all colors.

  33. FCode in-fill from catchment-derived estimates The largest streams in the analysis shed will have FCodes brought over by conflation with NHD High Res features, and this is likely to include most perennial streams. The remainder of model-derived streams will need an estimated value. A consistent estimate of flow regime can be calculated from flow accumulation to provide a reasonable initial value for streams in the analysis; always adjustable against evidence from field observation, a catchment-based estimate provides the most even-handed start.

    Ensure that no Definition Query remains to hide any features from the Feature Class just populated with acre_class. Select all features where “FCode IS NULL”. This should select mostly ephemerals generated above any hydro-enforcement features that carried conflated FCodes. If a default ephemeral ‘46007’ was already calc’ed, use the more robust selection ( ReachCode IS NULL OR ReachCode = '' ) to find segments without conflated ReachCode values. Into these selected features, calc a value in to FCode of “46007” to default them to “stream/river ephemeral flow”.

    Next, ensure that Select by Attributes is set to use Method “Select from current selection”, and select features where acre_class is above the analysis shed’s threshold for the ephemeral to intermittent transition. In San Geronimo Creek vicinity, that threshold is around 30 acres, so Select from current selection, “acre_class >= 30” is used. More precise definitions of default values can be selected such as “Acres >= 25.5” if needed.

    Examine any existing FCode for the selected features, and consider removing anything already classified as pipe, ditch, artificial path, or already intermittent flow; also consider removing any features already classified as perennial flow unless trimming is a goal. Carefully QA the selected features to deselect outliers; calc in the value ‘46003’ for intermittent flow.

    For cleanest results, take time to graphically remove from the selection any outlier features that may have scavenged higher catchment attributes. Depending on the texture of the flow accumulation surface, an extra 15% of features may be scattered around watersheds that are not part of a coherent downstream run. Use Selection > Interactive Selection Method > Remove From Current Selection and a rectangle selection tool to inspect the features. To improve drawing speed, consider displaying a copy of the _10m_li layer that has all features drawn with a simple 1-point line rather than the tapering display.

    Repeat the Select from Current Selection with an appropriate threshold from intermittent to perennial flow frequency. In San Geronimo Creek vicinity, that default threshold is about 150 acres; several spring-fed creeks have been observed with perennial flow at lower thresholds. Graphically verify that outliers within intermittent flow regime or minor spurs have been selected; calc in the value ‘46006’ for perennial flow.

    Graphically display the features with FCode then select spurs along perennial and intermittent features that were mis-classified at higher flow frequency. Typically all these minor features should have ephemeral flow, so calc in “46007” to the features. It might be efficient to build up a large selection set with ArcMap > Selection > Interactive Selection Method > Add to Current Selection and the rectangle selection tool, then calc many features at once to ephemeral flow.
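    The catchment-based defaults in this step can be summarized as a threshold rule. The sketch below is hypothetical and uses the San Geronimo-area thresholds named above (~30 acres for ephemeral-to-intermittent, ~150 acres for intermittent-to-perennial); the manual outlier deselection described above still applies.

```python
# Sketch: initial flow-regime FCode from the acre_class attribute.
# NHD FCodes: 46007 ephemeral, 46003 intermittent, 46006 perennial.

EPHEMERAL, INTERMITTENT, PERENNIAL = 46007, 46003, 46006

def default_fcode(acre_cls, intermittent_at=30, perennial_at=150):
    """Catchment-derived default; thresholds vary by analysis shed."""
    if acre_cls >= perennial_at:
        return PERENNIAL
    if acre_cls >= intermittent_at:
        return INTERMITTENT
    return EPHEMERAL
```
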

  34. Add Extended ReachCode field and populate For performance’s sake, pause the display. At this point, certain flow lines will have ReachCode conflated across from NHD High, and most will not have any ReachCode yet assigned. It is suggested to create a specific new field for NHD Local ReachCode that adds 4 text characters to support a leading HUC-12 code followed by six digits, rather than the typical HUC-8 plus six digits used through NHD High Resolution.

    So Add Field with a name like “ReachCode18” of type text(18), and leave all values NULL for the moment. We won’t be able to fit existing ReachCode attributes into this field, so they will be maintained as a separate attribute.

    Create a new selection where ( ReachCode IS NOT NULL AND ReachCode <> '' ) and into the selected features’ new ReachCode18, calc in the value using your HUC-12 followed by the rightmost 6 digits of each existing ReachCode (for San Geronimo-Lagunitas, the HUC-12 prefix is 180500050103).

    Using the 1m-grid-precise HUC12 features, select a single HUC12 and use Selection > Select by Location with Use Selected Features checked to identify those _10m_liz segments within a single HUC12 extent. Substitute the correct Hydrologic Unit 12-digit identifier for the “180500050103” below, sort to identify the lowest used OBJECTID, and subtract one less than that lowest OBJECTID so that numbering starts at 1. Using the Python parser, calc in

    '180500050103' + (str( !OBJECTID! - 231260 ).zfill(6))
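    Restated outside the Field Calculator, the expression is a 12-character HUC prefix plus a zero-padded 6-digit sequence number; this sketch is hypothetical, with offset standing for one less than the lowest OBJECTID in the selection (231260 in the San Geronimo-Lagunitas example).

```python
# Sketch: NHD Local extended ReachCode = HUC-12 + zero-padded sequence.

def reach_code_18(huc12, objectid, offset):
    return huc12 + str(objectid - offset).zfill(6)
```
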

  35. Catchment Refinement - 3k m2 Catchments Because buffering around the analysis area yields catchments draining outward from the analyzed area around the perimeter, those extraneous perimeter catchments need to be identified and removed.
    << this step may not be necessary if the analysis_shed was very closely clipped >>

    First, outside of an edit session use the xxhe1_cat_m50cm_pg to select features in xxhe1_cat3k_pg that intersect the 50 centimeter reduced catchment.

    In ArcMap’s top Selection Menu > Select By Location, use Selection method of “select features from.” For the Target layer, check the checkbox next to the 3k catchment polygons; for the Source Layer, use the polygon for the analyzed area that was aggregated from the 100k catchment polygons and buffered by -50cm. For the method, use “intersect the source layer feature” for the fastest selection. Do this outside of an edit session to run faster.

    Switch the selection to identify the extraneous 3k catchments.

    In several instances, the complexity of the spatial selection leads ArcMap to make errors in the selection, so before the edit session is opened it is worth visually reviewing the selection to make certain that catchments within the analysis area are not still selected. Setting ArcMap’s Selection > Interactive Selection Method > Remove From Current Selection can prove very helpful in correcting these within the analysis area.

    Open an edit session, delete the outside catchments, and then clean up any external ones that remain before closing the edit session.


  36. WBD HUC-16 Generation from 1 km2 and 100k m2 Catchments Generate some 1,000,000 square meter catchments using ArcHydro steps 4--7 with a 1.0 square kilometer stream definition, segmentation, catchment grid, and catchment polygons. Use these coarser areas to help identify natural drainage preferences for HUC-14 splitting of the HUC-12 and make it easier to select smaller catchments efficiently for merging.

    If using the WBD procedures’ Standard and Remnant Area option, consider using existing artifacts such as the County Stream layer to identify mapped tributaries that might meet HUC-16 criteria as standard hydrologic units bounding HUC-14 streams.

    Before aggregating to HUC-14, it is necessary to merge the remnant 1-square-meter areas that might be connected only diagonally. Consider, in an edit session, selecting all identified HUC-16 and then using the Explode Multipart Feature tool on the Advanced Editing toolbar. This may create hundreds of 1-square-meter polygons that can then be merged with adjacent HUC-16 areas.

  37. Aggregate WBD HUC-16 Generation from 100k m2 Catchments - With 100k catchments cleaned up (near Step 25) to only represent the analysis_shed area, use them with their associated Drain Point features to build up HUC-14 areas that divide the HUC-12 into between five and 15 sub-areas.

    Efficiency will be greatly served by editing a copy of the 100k catchments and labeling them in thousands of square meters; targeting a display value of 1100 will get close to 300 acres. The display can look something like the following. Additional efficiency can be gained by grouping each processed area, then unioning all available prototype HU16 areas with a No Gaps Allowed option. Select all resulting features in an edit session and explode them to ensure that no multipart features have been included. The labeling shown below can highlight fragment areas, usually displayed with a ‘0’ value. Boundaries between adjacent processing extents will likely have thousands of fragments along their common divide. These can be very rapidly selected and aggregated by rectangle-selecting the longest common border, shift-deselecting all but the catchment with that border, and merging all (sometimes hundreds of) fragment areas into that catchment. It is possible to simplify 9,000 such fragments in 90 minutes.

    The size distribution should probably look somewhat like this. Note that the mean is about 1,140,000 square meters, and the minimum 220,000 square meters.

    Similar to the cleanup procedure near Step 25, working with a copy of the 100k catchments, select a number of contiguous 100k catchment features that define a desired sub-area, identify the downstream-most selected segment, and merge preserving the downstream-most attributes. It can sometimes be helpful to have both the 100k and 2k flow lines available to decode how adjacent segments are draining. If available, use legacy county creek mapping to inform the level of detail, or granularity, of the splits. Most of the Standard areas should have a natural shape around the head and a narrow outlet, while Remnant areas will tend to have a more spiky appearance as they fill in between Standard areas.

    If any pre-existing catchment areas have been defined (for example, by a watershed conservation or management group, or areas for which studies have been contracted), make every effort to identify those areas as a set of 100k catchments and preserve them as distinct HUC-16 areas, or HUC-14 areas if their larger size fits that better. The collection of new HUC-16 areas must completely fill the HUC-12 area, forming a set that could be aggregated to produce the new HUC-12 sub-watershed defined by the present analysis.

    When aggregating, freely cut the upstream end to add a small notch below confluences. Often the best choice for a HUC-16 upstream limit will be at a confluence, and there should only be a single outlet from each HU; make it so at the scale of < 5m.

    Once the complete set of prototype HUC-16 have been unioned and cleaned up, convert the result to boundary lines and proceed with smoothing.

    Attaining a balance between splitting and combining when aggregating 100k catchments is likely to seem daunting at first. Focusing only on stream detail will lead to excessive splits, while only thinking about finishing the work can produce too much aggregation for HUC-16 splits.

    One strategy for becoming productive is to transcribe each legacy stream, drawn with a nominal 100-acre catchment, into an associated HUC-16 for every distinct strand. This can quickly guide one toward rapid selection of the 100k m2 catchments from where the tributary branches off the main stem, up to and around its head.

    In an edit session open on the polygon Feature Class into which the HUC-16 will be formed, select one Standard area’s worth of 100k m2 catchments. Then Copy and Paste these polygons into the new HUC-16 Feature Class, and while they remain selected after the Paste, immediately use Editor > Merge… and accept the default feature to merge into. Ideally, by selecting the downstream-most of the 100k catchments first, one can save the GridID of the lowest and later use its outlet point---but to get productive, it’s best to ignore the attributes at first and instead focus on the aggregated geometry. Be certain to Clear Selected Features after the merge, so that subsequent Copy and Paste operations work cleanly from the 100k catchments only and do not produce failure dialogs.

    After completing the analysis_shed’s worth of HUC-16, celebrate briefly then return at once to evaluate the effort. Symbolize the areas with some nearly opaque fill and look closely for holes in your HUC-16, as situations exist at stream confluences where 100k catchments can be generated with areas as small as 3 m2, and frequently less than 100 m2. Just select the missing 100k’s, Copy, Paste, Clear Selected Features, turn off the 100k catchment layer, select the paste-in and surrounding (or adjacent) HUC-16, then Merge. Systematically review the entire analysis_shed for gaps and repair them in this way.

    Then, after a longer break, preferably overnight, review the HUC-16 for desired level of detail with a fresh eye. If you are transcribing legacy streams, verify that all are represented with HUC-16. Consider whether similarly detailed tributaries were missing in legacy streams (this can be common) and split out new HUC-16 as appropriate. At this point, you’ll find that it takes just a few seconds to discard one HUC-16 and re-compose it from the 100k m2 catchments, so try out some different scenarios if you have doubts about a given HUC-16. The Standard areas should probably have a natural shape to them, although in flatter areas either graded highways or ranch perimeter roads may yield some very linear divides.

    After verifying that no holes remain, rationalize grid-edge HUC-16 boundary lines in an Edit session, first noting in the attribute table how many polygons have been created. Select all polygons, then use the Advanced Editing toolbar’s Explode Multipart Features button or ArcTools > Data Management Tools > Features > Multipart to Singlepart to convert them to simple polygons. Frequently the 100k m2 catchments express outlets as a diagonal train of linked pixels; when converted to polygons, these pixels are outlined and merged into an aggregate multipart polygon.

    QA for duplicated large polygons by summarizing Shape_Area and examining the summary table for any polygons with large area and a count of 2 or higher; delete identical HUC-16 duplicates. Next, critically examine the summary table and note that there is likely a jump in areas from 100,000 m2 down to 200 m2 or less, and a fairly large number of 1 m2 or 2 m2 polygons that need to be merged. One strategy for this task is to open the attribute table and use Advanced Sorting to sort ascending on Shape_Area and then on GridID, to deal with all of a given shape’s 1 m2 outliers at once. It can be informative to keep the str100k_li features visible to decode why the diagonal pixels were built into the 100k m2 catchment. In a few cases of excruciating detail, it may be necessary to cut polygons diagonally at corners of 1 m2 blocks to merge them cleanly. Single-pixel outliers at the boundary of the analysis_shed can either be checked against the adjacent analysis_shed, if one already exists, or simply deleted.
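    The duplicate check amounts to counting repeated Shape_Area values among large polygons. A plain-Python sketch, with illustrative area values and an assumed 100,000 m2 threshold for "large":

```python
from collections import Counter

def duplicate_large_polygons(areas, min_area=100000):
    """Flag Shape_Area values occurring more than once among large
    polygons -- likely identical duplicate HUC-16 created during editing."""
    counts = Counter(a for a in areas if a >= min_area)
    return sorted(a for a, n in counts.items() if n >= 2)

# Illustrative areas: one duplicated HUC-16 plus small fragment polygons.
areas = [1140000.0, 1140000.0, 220000.0, 1.0, 2.0, 1.0]
print(duplicate_large_polygons(areas))  # [1140000.0]
```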

    As a practical editing note, when working through ~200 single-m2 polygons in need of merging, your editing scale may be near 1:50, and it is more efficient to select a row with Shape_Area 1 in the table, then right-click and Pan To the next offending polygon, which saves the zoom-out otherwise needed with Zoom To Selected Feature. Small, but important in a repetitive situation.

    Also, at stream confluences, be very clear on which polygons are in need of cutting, as it’s usually the receiving catchment that should be cut and aggregated with the narrow outlet of the tributary. So select the receiving catchment, and use Cut Polygons Tool being certain to start outside the selected polygon, traverse the cut, and finish outside the polygon.

    Because diagonal trains of 1m2 polygons often are built together, manual repair of catchments at a junction can resolve between 1 and 50 such polygons at once, so even if there are 300 micro-polygons created after the Explode Multipart action, rationalizing them all may take about 90 minutes of editing effort.


  38. Prepare smoothed boundary HUC-16 polygons - Convert the boundaries of the HUC-16 to lines so that they can be smoothed to conform with the WBD Standards section 4.3.4 “Delineation Using High-Resolution Base Products”. Use ArcGIS Tools > Data Management Tools > Features > Feature to Line (not Polygon to Line, which will duplicate common boundaries for adjacent polygons). There is no need to check Preserve Attributes.

    Perform the requisite smoothing using ArcGIS Tools > Cartography Tools > Generalization > Smooth Line on these boundary lines. It appears fine to use PAEK smoothing, and for a 1-meter grid, consider using a Smoothing Tolerance of 5 meters to mask most grid artifacts remaining from terrain modeling. Do Preserve endpoint for closed lines. Flag topology errors on the (unlikely) chance that lines cross.

    After smoothing, reduce the number of boundary vertices using ArcGIS Tools > Cartography Tools > Simplify Line with a Simplification Algorithm of POINT_REMOVE and a Maximum Allowable Offset of 0.35 meters for drainages derived from a 1-meter grid. Check and resolve topological errors.

    QA for tiny loops generated during boundary smoothing and simplification by opening attribute table and critically examining line segments with Shape_Length near one circumference of a grid cell (4m for 1m gridding). Most of these can be deleted.
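    This QA test reduces to a simple filter on Shape_Length. A plain-Python sketch, assuming a flagging tolerance of ±0.5 m around the 4 × cell-size circumference (the tolerance is illustrative, not prescribed by the workflow):

```python
def tiny_loops(shape_lengths, cell_size=1.0, tol=0.5):
    """Flag boundary lines whose length is near one grid-cell
    circumference (4 * cell_size) -- the signature of a smoothing loop."""
    target = 4.0 * cell_size
    return [length for length in shape_lengths if abs(length - target) <= tol]

# Two candidate loops among ordinary boundary lines (lengths in meters).
print(tiny_loops([3.8, 4.1, 57.0, 210.4]))  # [3.8, 4.1]
```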

    Each HUC-16 should be provisioned with one label point near its outlet. Ideally, the point will not be so close to the outlet (say, less than 1 meter) that it cannot be recognized at smaller scales. The exercise of clarifying the label points serves as an opportunity for QC on the HUC-16 themselves; for example, there must be only one outlet, never more. In many cases it may be necessary to transcribe a divide from the 100k m2 catchments to partition an over-aggregated HUC-16. As a practical matter, it is helpful to maintain the PAEK-smoothed and simplified boundary lines, add edits only to those, then build new HUC-16 as corrections are noted.


  39. Resolve boundaries with adjacent Local Resolution Hydrologic Units - There should be no gaps between HU generated by this workflow, and yet after completely separate runs of hydrologic analysis, smoothing, and simplification, it is common to find discrepancies on the order of grid cell size (1m--2m) along common divides. Larger gaps of tens of meters might appear on smooth, near-planar slopes. If a set of adjacent HU has been generated, consider working around the current analysis_shed and pasting a (smoothed and simplified) Local Resolution HUC-10 polygon in with the smoothed-and-simplified HUC-16 lines just generated. Then, in an edit session, use appropriate portions of that geometry to conflate the boundaries. If the adjacent analysis_shed has been generated to a comparable level of confidence, consider replacing redundant boundary lines from the current analysis_shed with the previously generated HUC-10 boundary. Identify and resolve any discrepancies that are much larger than one or two grid cell sides.

    When adjacent boundaries have been conflated, the line segments shorter than ~10m have all been examined, and catchment details at confluences have been resolved, build the HUC-16 using ArcTools > Data Management Tools > Features > Feature to Polygon, without label points at this time (so there is no need to Preserve attributes). Include something like “HUC16” in the name of the output Feature Class.


  40. Aggregate HUC-16 to HUC-14 extent - After the smoothed HUC-16 have been QC’ed and provisioned with label points, make a copy of these new polygons and include “HUC14” in its name. These will be merged to produce coincident HUC-14.

    Following guidelines from Federal Standards and Procedures for the National Watershed Boundary Dataset (WBD) Fourth Edition, 2013, number the new HUC-14 areas starting upstream, numbering sequentially downstream. Do not skip numbers. Keep the Drain Point features corresponding to the outlet of each HUC-14 area and discard others when aggregating the 100k catchments. Keep the HydroID associated with the downstream-most 100k catchment, and when a set of between 5 and 15 HUC-14 have been defined, rationalize the NextDownID based on flow between the HUC-14. This might best be done outside of an edit session, copying and calc’ing in the values to NextDownID of selected HUC-14, to avoid any risk of moving the polygons.

    Just as it can be helpful to inform HUC-16 creation with reference to str_100k_li flowlines, for the HUC-14 it may prove useful to reference str_1MM_li 1-km2 catchment-initiation flowlines. If this workflow was followed precisely, it may be necessary to generate these flowlines at this step; large-catchment stream definitions are very quick to generate.

    Preserve the attributes of the new HUC-14 with ArcGIS Tools > Data Management Tools > Features > Feature to Point with the HUC-14 as input, and check the Inside box.

    QA each of the drain points of the new HUC-14 to ensure that it remains within the HUC-14, and snapped to this newly-smoothed geometry for str2k_paek3m_li flow lines. The Drain Point feature should be snapped to the line within a couple of meters of the downstream outlet of each HUC-14.

    Build the HUC-14 from the smoothed boundary lines and associated label points with ArcGIS Tools > Data Management Tools > Features > Feature To Polygon.

    Ensure that the AreaSqKm is populated with

    AreaSqKm = [Shape_Area] / 1000000

    and that AreaAcres is populated with

    AreaAcres = INT( [Shape_Area] / 40.4686 ) / 100
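    The two formulas can be verified outside ArcGIS; `area_fields` is a hypothetical helper reproducing the expressions above (1 acre = 4046.86 m2, so dividing by 40.4686 and truncating yields acres with two decimal places):

```python
def area_fields(shape_area_m2):
    """Populate AreaSqKm and AreaAcres from Shape_Area (square meters),
    using the document's formulas; acres are truncated to two decimals."""
    area_sqkm = shape_area_m2 / 1000000.0
    area_acres = int(shape_area_m2 / 40.4686) / 100.0
    return area_sqkm, area_acres

# A 1 km2 polygon is about 247.10 acres.
print(area_fields(1000000.0))  # (1.0, 247.1)
```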

  41. Build ArcHydro Network from PAEK-smoothed str2k lines - Take a version of str2k_li that has successfully built a network, and smooth it with PAEK at a Smoothing Tolerance of 4 meters to reduce grid artifacts, using ArcGIS Tools > Cartography Tools > Generalization > Smooth Line. Next, simplify using ArcGIS Tools > Cartography Tools > Generalization > Simplify Line. Build the network from the PAEK-smoothed and simplified lines, the original str2k Drain Point features, and the un-smoothed cat2k catchment polygons. Be patient: with stream segments in excess of 10,000, this can take around an hour to build.

    These features will have the key ArcHydro attributes, plus FType, FlowDir, EdgeType, and Enabled; consider outputs such as xxx_edg2k_li and xxx_jct2k_li.


  42. Extend Schema of Networked str2k flow lines - Identify a very recent download of NHD High Resolution for the study area. Select a single flow segment near the drain of the analysis_shed that is within the shed, and export it to the “Hydrography” Feature Dataset that contains the downloaded flow lines; save this feature as NHDH_template_flow_li.

    To avoid conflicts with vertical resolution sometimes experienced with Exporting, instead Import this single-feature Feature Class into the Feature Dataset that contains the smoothed str2k flow line network just built.
    Open the attribute table of this feature. Prepare it for cloning by setting
    Permanent_Identifier to “” (the empty string),
    FDate to now() to calc in today’s date and time,
    Resolution to ‘1’, the coded value for “Local” resolution,
    GNIS_ID to NULL, note this is string(10) type rather than Long Int as in GNIS itself,
    GNIS_Name to NULL,
    LengthKM to 0
    FlowDir can remain “WithDigitized”
    WBArea_Permanent_Identifier can remain NULL
    FType to 0
    FCode to NULL
    and Delete Field on the Enabled field; this already exists in the network features

    When prepared, clone the schema by joining it to the network str2k edges: simply exploit the integer 1 in the Enabled coded-value domain and Join attributes from a table using the OBJECTID of the single-feature NHD High flow line template Feature Class; Keep all records.
    Turn off fields for the joined OBJECTID (where all values = 1) and the trailing Shape_Length where all values are the same. Export the joined Feature Class to a new extended one; consider a name like xxx_edg2k2_li

    Prepare the network edges to be segmented by placing an identifier before they are split.
    UUID can be generated with a brief Python script. In the context menu on the Permanent_Identifier attribute column heading, invoke Field Calculator with the Parser for Python, and check Show Codeblock. In the Pre-Logic Script Code paste

    import uuid
    def ID():
        return '{' + str(uuid.uuid4()) + '}'

    and in the Permanent_Identifier = expression box below, paste

    ID()

    This should fill the fields with conforming unique identifiers. Update FDate with a new value if desired.
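    As a sanity check on the result, the brace-wrapped GUID form produced by the codeblock above can be validated with a short script. The regular expression below simply assumes that {8-4-4-4-12} hex form; it is not an official NHD validator:

```python
import re
import uuid

def make_permanent_identifier():
    """Generate a brace-wrapped UUID, as the Field Calculator codeblock does."""
    return '{' + str(uuid.uuid4()) + '}'

def is_valid_permanent_identifier(pid):
    """Check the assumed {8-4-4-4-12} hexadecimal pattern."""
    pattern = r'\{[0-9a-fA-F]{8}(-[0-9a-fA-F]{4}){3}-[0-9a-fA-F]{12}\}'
    return re.fullmatch(pattern, pid) is not None

pid = make_permanent_identifier()
print(is_valid_permanent_identifier(pid))  # True
```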

    Carefully compare against the analysis_shed’s extract of features from USGS NHD High Resolution flow lines; some of these features may have non-NULL GNIS_ID and GNIS_Name values.
    Examine USGS GNIS Points with hydrographic feature TYPE [Bay, Dam, Falls, Lake, Spring, Stream]. Some of these may identify GNIS names for features that have not yet been represented as their associated NHD Flowlines, even if those flow lines exist in NHD High Resolution.
    If your work is being done in US National Grid / UTM meters, update the LengthKM field by calc’ing in (Shape_Length / 1000).

    While spatial join might be an option, it could be more efficient to use the new HUC-14 catchments for the analysis shed, and step through each of them performing Select by Location in turn, then copying the HUC-14’s TMNID (The National Map unique ID), and calc’ing that value into the WBArea_Permanent_Identifier for all selected network edges within that HUC-14 extent.

  43. First-stage generalization - dissolve flowlines to simplify but preserve acre_class - A dissolve is important to reduce the total number of pseudo-node splits imposed on the network edges, while preserving distinct acre_class segments within each unique HydroID reach.

    Verify that all selections from QA review have been cleared.

    With ArcGIS Tools > Data Management Tools > Generalization > Dissolve, identify your xxhe1_str2k_10m_li features that have completed the ReachCode and FCode conflation process as Input, select an output location within the Feature Dataset used for non-ET GeoWizards features, and consider an output name like xx1_str2k_pdp_diss0_li.

    Manually check the Dissolve_Field(s) for HydroID and acre_class. Also add to the Dissolve_Field(s):

    FDate (because it doesn’t seem to show up as a selectable field for Statistics)

    For Statistics Field(s) use:

    NextDownID FIRST
    DrainID FIRST
    Permanent_Identifier FIRST
    Resolution FIRST
    LengthKM FIRST (to avoid having field named SUM_LengthKM, we’ll calc it later)
    WBArea_Permanent_Identifier FIRST
    FType FIRST
    Acres FIRST
    HUC12_ReachCode FIRST

    Uncheck the option Create multipart features.
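    The Dissolve configured above can be sketched in plain Python: group records on the dissolve fields and keep the FIRST value of every other attribute. The records, field subset, and FCode values here are illustrative only:

```python
def dissolve_first(records, keys=('HydroID', 'acre_class')):
    """Group records on the dissolve fields; for each group, keep the
    FIRST record's remaining attributes (the FIRST statistic above)."""
    out = {}
    for rec in records:
        k = tuple(rec[f] for f in keys)
        out.setdefault(k, rec)  # first record encountered wins
    return list(out.values())

# Three 10m split segments: two share HydroID and acre_class, so they
# dissolve into one; the third differs in acre_class and stays separate.
segs = [
    {'HydroID': 7, 'acre_class': 3, 'FCode': 46006},
    {'HydroID': 7, 'acre_class': 3, 'FCode': 46006},
    {'HydroID': 7, 'acre_class': 4, 'FCode': 46003},
]
print(len(dissolve_first(segs)))  # 2
```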

    Depending on the complexity of the features, this should take less than a minute. It may reduce the count of segments to about 30% of the 10-meter split version, and draw about 3X faster on the reviewer’s screen.

    43a) QA 1 of 2: First, symbolize with a taper on Acre_class, then graphically identify and dissolve across anomalous doublings of flow accumulation (Acres). The anomaly appears as a step up to a higher catchment value, followed by a drop back to a more consistent value; in detail, the Acres attribute will be about double the expected value in these anomalous reaches. How this happens is not clear at the moment, but there may be some issue with the algorithm used by Add Surface Information.

    43b) QA 2 of 2: Second, symbolize on FCode and visually inspect for transitions caused by this same problem. Some isolated reaches may be assigned intermittent flow with Ephemeral above and below; also, in some places, the perennial attribute is conflated onto short ephemerals that run parallel to a (much) larger channel. Edit fixes into these.

    43c) For geometric compatibility, smooth the dissolved features with ArcTools > Cartography Tools > Generalization > Smooth Line with PAEK Smoothing Algorithm and 3m of Smoothing Tolerance; call this xxx_str2k_psp_diss1_li

    43d) Run ET GeoWizards > Convert > Polyline to Point on xxx_str2k_psp_diss1_li with Nodes as the option and Remove Duplicate Points checked. This will identify junctions and pseudo nodes where the dissolved attributes change; call this output xxx_str2k_pspd1_node_pt

    43e) To eliminate node points at junctions and uphill end, Select by Location with Target _node_pt layer from the previous step, and Source layer from the hydro net junctions, with Spatial selection “within a distance of the source layer feature”, and 0.01 meters distance. These are existing endpoints within the flow network. Switch Selection to identify only the pseudo nodes, and export selected features to xxx_str2k_pspd1_pnode_pt
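    The two selection passes amount to a distance test: a node point is kept as a pseudo node only if it lies farther than the tolerance from every network junction. A plain-Python sketch with hypothetical coordinates (meters):

```python
import math

def pseudo_nodes(node_pts, junction_pts, tol=0.01):
    """Keep only node points farther than tol meters from every network
    junction -- the pseudo nodes where dissolved attributes change."""
    def near(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1]) <= tol
    return [p for p in node_pts if not any(near(p, j) for j in junction_pts)]

# The first node coincides with a junction (within 0.01 m) and is dropped.
nodes = [(0.0, 0.0), (10.0, 5.0)]
junctions = [(0.0, 0.005)]
print(pseudo_nodes(nodes, junctions))  # [(10.0, 5.0)]
```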

    43f) Globally reduce segment fragmentation with Select by Location with Target _pnode_pt from the previous step, Source layer from the hydro net junctions, within a distance of 2.5 meters from net junctions. Switch Selection to identify the refined selection of pseudo nodes.

    43g) Use ET GeoWizards > Point > Global Snap Points to move the pnodes across to the very similar, but usually different, smoothing in the edges. Take the pseudo-node points from the previous step as the layer to snap and the networked smooth str2k edges as the layer to snap to; consider an output like xxx_str2k_pnode_split_pt and a snap tolerance of ~2.5m. In Snap options, select Nearest edge. With ~10,000 points to snap, this might take a couple of minutes.
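    Nearest-edge snapping is, in essence, projecting each point onto its closest line segment within the tolerance. A minimal plain-Python sketch of that geometry (coordinates and tolerance illustrative; not ET GeoWizards' actual implementation):

```python
import math

def snap_to_segment(p, a, b):
    """Return the point on segment a-b nearest to p (clamped projection)."""
    ax, ay = a
    bx, by = b
    dx, dy = bx - ax, by - ay
    length_sq = dx * dx + dy * dy
    if length_sq == 0.0:
        return a  # degenerate segment
    t = max(0.0, min(1.0, ((p[0] - ax) * dx + (p[1] - ay) * dy) / length_sq))
    return (ax + t * dx, ay + t * dy)

def snap_point(p, segments, tol=2.5):
    """Snap p to the nearest segment if within tol meters; else leave it."""
    best, best_d = p, tol
    for a, b in segments:
        q = snap_to_segment(p, a, b)
        d = math.hypot(q[0] - p[0], q[1] - p[1])
        if d <= best_d:
            best, best_d = q, d
    return best

# A pnode 2 m off a straight edge snaps onto the edge.
print(snap_point((1.0, 2.0), [((0.0, 0.0), (10.0, 0.0))]))  # (1.0, 0.0)
```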

    43h) Use ET GeoWizards > Polyline > Split Polyline with Layer, using the split points and a search tolerance of 0.1 meter; save output as xxx_edg2k_pnode_split_li

    43i) Use ET GeoWizards > Convert > Polyline to Point on the pnode_split_li to create midpoints for each of the str2k segments with unique FCodes. Use input from the previous step, select the Middle Points conversion option, and create output xxx_edg2k_pnode_split_mid_pt.

    43j) Spatially join these midpoints to the NHD-conflated HE features to obtain all relevant attributes; export the joined points to xxx_edg2k_conf_join_mid_pt. Verify that the “distance” to each joined feature is very small; filtering for distances under ~1m is important.

    When copying the flow lines into a File GDB for reviewers, take a moment to index the acre_class attribute. In the ArcGIS Catalog tab, locate, select, and right-click the feature class in the reviewer’s File GDB copy > Properties. In its Indexes tab, Add an index named something like “idx_ac_cl”, select acre_class from Fields available, and click the right-arrow to add just that one attribute to the Fields selected column; OK to apply the index.

  44. BACKUP completed study area - Using the GIS lab Drobo device (network-attached storage, or NAS), either move a full copy of the work-area directory to the “Marin_NHD_2014” folder, or, if this work unit’s processing was done on the NAS, copy it onto the external (E:/) drive’s “Marin_NHD_2014” directory and to a second file-server location on the Drobo (X:/) drive.

  45. Package the HUC-12 for review - Clip the contours for the analysis_shed, and be sure to import the contour metadata from the copy in X:\Marin_NHD_2013\_HE_Archive so that the packaged contour clip is properly documented.