
Marin NHD Local Introductory Sections

Marin NHD Hydrologic Flow Line Creation Procedure
version of 2014.07.04

Introduction I.
Background to the workflow

As part of its 2007 Countywide Plan, Marin County, California set out protections for streams with perennial and intermittent flow, as well as certain ephemeral-flow reaches that support sufficient riparian vegetation. An implementing ordinance for this protection limits certain types of development activity within a set distance from these streams, and so it is necessary to overlay parcel and building footprints on polygons derived from the streams; the streams therefore need positional accuracy appropriate for parcel and building footprint overlay, nominally 1:2400 map accuracy.
This accuracy is 5X to 10X larger in scale than the existing US Geological Survey (USGS) National Hydrography Dataset (NHD) High Resolution flow line features provide. The new model-derived flow line geometry also forms a basis for refined USGS Watershed Boundary Dataset (WBD) geometry and for splitting Hydrologic Unit Code HUC-12 areas into HUC-16 extents, also at 1:2400 map accuracy.

Field observation of flow regime is expensive, so these control points are sparse across the county's surface drainage network. Also, at 1:2400 scale smaller ephemeral reaches become mappable features. In complex terrain with many property owners, ephemeral reach density and lack of access make it infeasible to rely on full field verification. Even in areas where access is possible and useful observations of riparian habitat are obtained, thick tree canopy can limit the accuracy of ground-based GPS observations that attempt to improve creek geometry.

This workflow was devised to accomplish the following:
Create countywide NHD Local Resolution (> 1:4800 map accuracy) flow lines
Leverage county watershed ARRA LiDAR (2 pulse/m2) for model-derived flow lines
Support use of field observations to refine model-derived flow line geometry
Leverage field observations of flow regime to calibrate nearby flow regime estimates
Integrate storm drain systems to map flow lines through urbanized areas
Refine WBD HUC-12 geometry, and produce WBD-certifiable HUC-16 splits
Generate catchment areas to associate with every outlet and outfall, countywide

While the ArcHydro extension to ArcGIS Desktop largely automates the generation of flow lines from a gridded surface, the Marin County workflow augments ArcHydro both before and afterward. Stream geometry through trees, storm drain system underground conduits, and some field-observed creek geometry refinements are crafted as hydrologic enforcement paths that are burned into the (typically 1m-gridded) terrain beforehand, to be respected as ArcHydro generates flow lines. Afterward, ArcHydro flow line features are augmented with attributes to preserve flow accumulation and conflated with existing NHD High Resolution ReachCode, FCode, and GNIS Name attributes.

ArcHydro is used to generate the network of flow lines down to a 2000 m2 (0.002 km2) catchment threshold, and for WBD compliance, these network edges are smoothed to reduce some of their grid-derived artifacts. Each WBD sub-watershed (HUC-12) area is given model-derived geometry from the (1m-gridded) model results, with its boundary smoothed and split into HUC-16 areas of between 100 and 3000 acres each.

The HUC-16 polygons covering the WBD HUC-12 extent, the model-derived NHD Local Resolution flow lines, and the NHD-compliant flow network are packaged into an ESRI File Geodatabase, along with a clip of 2-foot-interval contour lines, as a review package. The package is distributed to sources with local knowledge of creeks and to state WBD and NHD stewards.

Modifications to model-derived flow line geometry can be accommodated by editing the hydro-enforcement features as described below and re-generating the model-derived flow network. Modifications to feature classification (FCode) can be accomplished with edits to the conflation set of hydro-enforcement features. After these edits, flow lines can be regenerated. Once loaded into the national repository, NHD Local Resolution features can be maintained for small edits using the existing USGS NHD Editor tools.

Introduction II.
Background to Marin County hydrologic enforcement features

This procedure is intended to help standardize the creation of hydrologic enforcement (HE) features that are used to improve the accuracy of model-generated surface water flow lines. The writeup was prepared following the development of HE features for Las Gallinas Creek in and near San Rafael, California. The HE features took more effort than expected to complete in this watershed, and the watershed's variety of challenges makes it an excellent case study for documenting HE feature creation in diverse conditions. The Las Gallinas Creek drainage contains a bit of nearly every Marin County environment.

In detail, two sets of HE features are created for the workflow. The first, described here, consists of long continuous uphill line features that chain together flow through storm system features, through wooded terrain, or through very-low-relief tidal channels. These HE features do not require attributes, as their information is contained in geometry and direction; for good connectivity it is desirable that they snap together at vertices, or at least at edges, where flow converges. The second set of HE features is typically a subset derived from the first, and is used to represent existing NHD High Resolution flow line geometry in the larger-scale Local Resolution context. In the second set, only those HE features that overlie older NHD High Res features are used; a copy of the first HE features is split at any transitions to convey existing GNIS_Name, FCode and ReachCode attributes from NHD High Res into the NHD Local Resolution features being generated with this workflow. The more highly split geometry of these attribute-transfer HE features makes them unsuitable for regular HE use.

The Las Gallinas drainage was analyzed within a 200m buffered extent around and including the pre-existing watershed polygon, on a topographic-bathymetric surface of 45 km2 (17 mi2) that contains several natural and built environments.

Easterly of most development, the northern side of San Pedro Mountain reaches 305m (1000 ft) above the tidal marshes in China Camp State Park. Deep and shallow bay waters are in the lowest section. Older tidal marshes are rich with tidal channels, some with minimal changes since California’s Gold Rush times, and tidal creek reaches pass through the middle section of the drainage. Fallow reclaimed land lies behind diked areas, as do an airstrip, a manufactured home development, and a single-family home development. The historic shoreline was reinforced for a single-track rail line now being upgraded for modern transit use. A reclaimed marsh was developed to become a light industrial park with extensive impermeable areas adjacent to an 8-to-10-lane freeway (US Highway 101) built with significant grading as it enters and exits the watershed.

In the upland areas, larger tracts were developed as a private school, a high school, a hospital, a cemetery, and the county’s largest shopping mall with surrounding strip mall development. Residential development filled the upper portion of the drainage starting in the 1940s and mostly in the 1950s and 1960s, leading to trapezoidal concrete straightened creek channelization on the low side, and extensive and sometimes informal horizontal drains across backyards on the high sides. Nearly all streets have either concrete curb gutters or concrete swales leading down to street drains that are piped into the channelized creek.

Residential development took place in unincorporated Marin County, with the earliest land reclamation in Santa Venetia starting in 1914, and aggressive development during the 1950s and 1960s by several different builders, terracing hillslopes to create buildable space. Some of these tracts annexed to San Rafael in the 1970s, and paper records of those drainage systems have been transcribed into CAD features. Other areas are served by Marin County Flood Control and Water Conservation District and also have well-documented drainage systems. When available in a GIS system, these storm system features can be helpful guides to rapidly constructing HE features.

The author has worked with CAD and GIS for facilities management as part of environmental remediation projects, and has field experience locating underground utilities from paper drawings. Because federal facilities are not served by Underground Service Alert, environmental projects need staff who can locate underground pipes before drilling and other invasive sampling. Knowledge of how the various color paint marks are placed on the ground before digging helps one get full value from Google Street View images if manholes are marked at the time of Street View photography.

Introduction III.
Rules of 2D Hydrologic Enforcement (HE) features (Step 5 design principles)

The HE features are created for a specific purpose, and these rules are necessary so that the features perform their role of guiding 2D steepest-descent model flow along precisely selected paths. The principles emphasize issues important to application of ArcHydro to 2 pulse/m2 LiDAR-derived terrain in urban and forested areas of a California coastal county with significant estuarine shoreline. Is it easy? No! Yet do it right, and it’s just not that bad.

HE design rule I: Enforce stream channels through trees.
No matter how detailed our LiDAR-derived terrain might be, arboreal disruption (AD) of ground returns is common along streams. Steepest-descent surface flow modeling requires that we analyze a depressionless surface, so local minima are always filled prior to analysis. Thus, it is desirable to trace channels through trees with HE features so that modeled flow lines do not escape the channel upstream of AD blockages and wander some distance before returning to channel flow. It is far better to create a single continuous HE feature for the stream than to attempt to bridge each AD event individually, to avoid missed AD as well as to maintain the natural meander course along the modeled flow line.

HE design rule II: Accept that 3D flow features will become approximated.
In urbanized areas, storm system features may route flow in 3D ways that cannot be fully expressed in a 2D model for ArcHydro. Certain workflow steps detailed below show a good compromise that gives primacy to features likely to carry greater flow (underground pipes) over those flowing from smaller catchments (curb, gutter, or ditch flow). Commonly, if a storm pipe crosses under curb flow, the curb flow is connected to the pipe even if there is no street drain present. This approach supports precise modeling of larger-catchment flow at the cost of reduced accuracy for the smallest-catchment flows. Frequently, the resulting modeled flow lines will be displayed without the smallest-catchment reaches, so this compromise leads to the least-visible artifacts.

HE design rule III: There is naught but one way down

This rule segregates HE features from storm drain system CAD features.
Typically in built environments, horizontal drains are constructed across a cut slope behind entire blocks of houses, curbs run continuously from one cross street to another, and pipes may be connected with high-flow bypasses that increase system capacity by diverting some flow to an alternate drain as needed. Although these built distribution features can be represented in a hydraulic model, with steepest-descent hydrologic modeling we seek to identify unique pathways in the low-flow state; water either flows along a modeled path, or it accumulates to join the flow downstream.

In steepest-descent hydrologic modeling, we cannot accommodate both the low-flow state and diverted flow along a high-flow bypass. We may represent the bypass path, but we do not connect it through the bypass structure, in order to model the low-flow state.

Horizontal drains are low-slope features built across cut slopes and frequently have several outlets. Between these outlets there is usually a subtle divide where the drain has a highest point. HE Rule III requires that HE features be constructed separately from each outlet up toward the divide, splitting the horizontal drain's flow where its grade changes direction. This technique accurately represents the flow captured by the ditch, whereas CAD representations of the drain will typically treat it as a single continuous asset, the same way that it was built. For readability it may be desirable for HE features to have a gap of up to 5 meters at this divide; although the ditch is built through the gap, that part intercepts relatively little flow. Also, divides on horizontal drains are often so subtle that there might be 5 meters of uncertainty in the divide's location. This gap is a compromise that still captures nearly all the function of the horizontal drain.

Curbs and street gutters likewise are built as long single features, but often cross minor divides. When they do, it is necessary to create HE features upwards toward the divide from each side, leaving a gap around the summit. Depending on the condition of the street curb, the exact location of the divide may be muddled, and the gap’s size can reflect that uncertainty. Note that in many hillside areas, the divide may cross the street diagonally and gutters on each side of the street may need gaps that are not directly across from each other.

HE design rule IV: Create continuous HE features, to yield continuous flow lines

This is another rule that segregates HE features from storm drain system CAD features.

We create HE features to force modeled hydrologic flow lines to follow a precise path, and the most efficient way of doing this is to guarantee that the HE feature slopes down in the flow direction. This efficiency means that we care much more about the location and direction of flow than we care about the terrain elevation that flow is modeled on.

Each HE feature will be given starting and ending elevations so that it both flows in the proper direction and stays everywhere below the pre-enforcement terrain grid surface. Effectively, the HE features should cut trenches in the grid to force flow. If HE features rise above the ground surface anywhere along their path, then captured flow will be released to follow a steepest-descent path until it is intercepted again farther downstream. Fewer and more continuous HE features can pull lower elevations farther up into the network of HE features and provide more flexibility for tributary HE features to stay below the ground surface while still dropping down into the main channel.

When HE features are created, it is more efficient to focus on the connections being created rather than on the specific feature type (creek, pipe, ditch, swale) being traversed. Frequently HE features are traced over storm drain systems by identifying a particular sequence of outlet, manhole covers, street drains, curbs, and ditches. Best accuracy is obtained by following the evidence to define the path (CAD features, terrain, aerial photo, or Google Street View). The longer and more continuous the path, the more reliably we can use simple formulas, like the one in workflow Step 8, to quickly set Z values that successfully enforce modeled flow.

When features are extended after breaks in editing, either create new extended features and then merge them with the first section, or else keep the Edit Vertices toolbar parked so that there is easy access to the Extend Feature button---which continues the drawing of a selected feature onward from its uphill (red vertex) end.

HE design rule V: Mind The Gap -- Connect segments with snapping
After the HE features have been created, they are converted to a closely-spaced train of points. Ensure that HE features function properly by connecting them together precisely.

Following the guidelines of Rule IV above, it is not desirable to break HE features where they join. Rather, for major HE junctions consider using or creating a vertex in the main path, then start drawing the tributary by snapping to that vertex and continuing uphill. For minor junctions, edge snapping should be sufficient to connect the smallest tributaries to the channel they feed into.

Since the storm drain CAD features are likely to be visible, it is important to avoid snapping to those if a more precise location (typically for street drain, manhole, or horizontal drain) has been created. For safest snapping, turn off the storm drain CAD features when not needed for reference while extending the HE feature geometry that represents them.

Use the fewest and least-reactive types of snapping when possible. When CAD features are turned off, Endpoint snapping is almost always useful. Vertex snapping can help quickly connect new HE tributaries at larger junctions. Turn on Edge snapping only when extending a curb drain, then turn it off right after creating the first vertex of the new line. For some types of feature cutting, Intersection snapping has proven useful.

HE design rule VI: Get It Done Right -- but get it done fast
The combination of LiDAR surface, 15cm orthoimagery, and Google Street View will typically resolve the location of these storm drain features more accurately than was possible at the time the storm drain CAD features were created, so typically use the CAD features to identify the underground connectivity that should be represented, but derive the geometry from the surface first, then the photo, then Street View.

The most powerful tool for rapidly developing HE features through forested and peri-urban areas is a fine-grained LiDAR-derived slope image. If the analysis terrain was developed using Natural Neighbor interpolation, then most streams with developed channels will have an abrupt break in slope along their length that sharply contrasts with surrounding slopes, effectively creating very thin, high-contrast features to be interpreted as HE break lines. Experience shows that accuracy which might require tracing at 1:250 to 1:300 directly from the grayscale 50cm DEM can be achieved at 1:800 to 1:1250 screen scales when using the corresponding 50cm slope image, yielding HE features faster yet well-made.

Remain focused on the priorities of the project deliverable; main channels are likely always worthy of HE features, smaller ones progressively less so. Only in very flat subdivisions should it be necessary to delineate curb drainage. A key strength of this model-derived workflow is that too much HE feature generation will be trimmed down by the model, just as sparse HE features will be extended automatically to a consistently densified result.

Introduction IV.
Conflation of FCode and ReachCode from NHD High Resolution features to Local Resolution HE features (detail of Step 27 in Marin NHD workflow)

The Marin NHD Local Resolution HE features will typically match or exceed the density of flow lines present in NHD High Resolution. This allows all conflation of ReachCode and FCode attributes to be made from NHDH to NHD Local via attributes of appropriate HE feature geometry. In Marin NHD Local Resolution, approximately 6X the length of flow lines is generated at 1:4800 scale compared to what is found at NHDH 1:24k scale.

To make the conflation process efficient, open the attribute tables of both the Marin NHD Local Resolution HE features and the NHD High Resolution features. Park them with the Local features above the NHD features. For each table, freeze the ReachCode and then freeze the FCode so that they are adjacent. Enlarge these columns so that they are completely visible. Reduce the table size so that only those attributes are visible and park the table conveniently. Focus the tables on the selected features.

Make both the local and NHD features selectable, and set the Interactive Selection Method to Create New Selection. Open an edit session on the local HE features. Working up the watershed, select both an NHD feature and its corresponding local feature. Compare the lengths of each feature for correspondence. If they do not correspond well, edit the local feature, merging or splitting the line until it approximates the NHD feature appropriately. Use the photo, terrain, and slope to help. Copy and paste both the ReachCode and the FCode from the NHD table to the Local Resolution table. If the NHD has more than one ReachCode along a reach, propagate the first (lowest) ReachCode up to the next logical junction.

If it is apparent within a Reach that different FCodes are appropriate (e.g., an Artificial Path through a tidal area, pond, or reservoir, or an underground storm drain), split the Local feature at the junctions, select each resulting length, and give it the correct FCode.
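
Where many short local features need the same codes, the copy-and-paste step can also be scripted. The following is a minimal sketch only, run from the ArcMap Python window; the layer names are the hypothetical examples used elsewhere in this writeup, and it assumes that ReachCode and FCode fields exist in both tables and that one NHDH feature plus its corresponding local feature(s) are selected before it is run.

import arcpy

def copy_nhdh_codes(nhdh_layer="USGS_NHDH_20130722_FCode_RC_diss_li_clip",
                    local_layer="Sausal_HUC12_150m_clip_li"):
    # Read ReachCode and FCode from the single selected NHDH feature
    with arcpy.da.SearchCursor(nhdh_layer, ["ReachCode", "FCode"]) as rows:
        reach_code, fcode = next(iter(rows))
    # Write the same values onto every selected Local Resolution feature
    with arcpy.da.UpdateCursor(local_layer, ["ReachCode", "FCode"]) as rows:
        for row in rows:
            row[0], row[1] = reach_code, fcode
            rows.updateRow(row)

copy_nhdh_codes()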

Rules to observe when conflating NHDH to NHD Local HE features:

A. The ReachCode must capture the correspondence between reaches across a scale gap that is sometimes substantial; what is a simple fairly straight line in NHDH might contain many meanders and minor tributaries, even in HE features, at NHD Local. Conflation must account for both complexity of shape and often horizontal offsets---yet still identify the same feature across both NHDH and NHD Local resolutions.

B. The FCode should accurately transcribe the feature's FCode from NHDH, and the transcription is only valid if the feature also carries the corresponding ReachCode; both FCode and ReachCode must be copied over. Where mapping scale and the fineness of the LiDAR-derived surface permit proper discrimination of features such as culverts beneath streets or flood control ditches, the most appropriate FCode should be applied to the specific run identified as such.

C. Where HE features have been continued farther uphill than the NHDH flow line, the ReachCode is extrapolated along the main channel identified, but the FCode is set one tier lower in flow frequency. Typically in Marin County, NHDH flow lines reach an upstream limit with intermittent flow. If the ReachCode is extrapolated in this situation, the FCode would be downgraded from 46003 (stream/river: intermittent) to 46007 (stream/river: ephemeral).
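
The one-tier downgrade in rule C can be expressed as a simple lookup. This is an illustrative Python sketch only; the intermittent-to-ephemeral entry is the case described above, while the perennial-to-intermittent entry is an assumed extension of the same one-tier rule.

# NHD FCodes: 46006 perennial, 46003 intermittent, 46007 ephemeral
FCODE_DOWNGRADE = {
    46006: 46003,  # perennial -> intermittent (assumed extension of rule C)
    46003: 46007,  # intermittent -> ephemeral (the typical Marin County case)
}

def extrapolated_fcode(nhdh_fcode):
    # FCode applied to HE reaches extrapolated above the NHDH upstream limit
    return FCODE_DOWNGRADE.get(nhdh_fcode, nhdh_fcode)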

D. Transcribe the GNIS_ID and GNIS_Name in a similar fashion.

Introduction V.
Setting up a New Work Area for Conflation (Data Management)

Create a New Folder in your personal folder, preferably located on a fast external drive (e.g., USB 3.0) directly connected to your workstation. (From the appropriate folder in Catalog, right-click, New, New Folder.) Give it a name that reflects the corresponding HUC12 extent, e.g., Sausal_HUC12.

Create a New File Geodatabase in that folder. (From the appropriate folder in Catalog, right-click, New, New File Geodatabase.) Give the File GDB a name that reflects the corresponding HUC12, e.g., Sausal_HUC12.

In the new File GDB, create a new Feature Dataset. (In the Catalog tab, from the GDB's context menu, New > New Feature Dataset.) Name it in a way that describes the project's projection, such as U84_N88m_data for WGS 1984 UTM Zone 10N and NAVD 1988 meters.

Create a new map document with a name like Sausal_HUC12_conflate_yyyymmddx that includes a year-sorted date stamp and a trailing version letter, such as Sausal_20131217a.mxd. Make the recently created geodatabase the default geodatabase. Save the map document in the newly created folder.

Add Data. Navigate to X:\Marin_NHD_2013, HE_Archive folder, MNHD_HE_archive u84_data, USGS_Marin_HUC12_PG.

When that’s added, right-click on it and turn on Labels.

Using Select by Rectangle, select the appropriate polygon. With the polygon selected, Data > Export Data to a new feature class. Give the output an appropriate name in the recently created home geodatabase, e.g., Sausal_HUC12_pg. Save it as a map layer.

Using Analysis Tools > Proximity > Buffer, give it a buffer of 150m. Be sure to put the output into the Feature Dataset within the Home (or default) geodatabase. Give it a name like: xxx_HUC12_150m_pg.

Add Data: MNHD_HE_Archive, MNHD_HE_merge_(latest version)

Using Analysis Tools > Extract > Clip, clip the HE lines to the buffered polygon. Be sure to put the output in the Feature Dataset in the Home (or default) geodatabase. Give it a name like xxx_HUC12_150m_clip_li.

Open an edit session on the new clipped lines feature class. Using Select by Location with Add to Current Selection, select and delete lines flowing out of the area; add lines flowing through.

Get USGS FCodes and ReachCodes: Add Data. Navigate to X:\Marin_NHD_2013, HE_Archive folder, MNHD_HE_archive u84_data, USGS_NHDH_20130722_FCode_RC_diss_li.

Using ArcToolbox > Analysis Tools > Extract > Clip, clip it to the buffered polygon. Be sure to put the output in the Home (or default) geodatabase and append "_clip" (this might get truncated, so be careful to keep it all):

USGS_NHDH_20130722_FCode_RC_diss_li_clip

Next get the USGS GNIS_IDs and GNIS_Names: Add Data. Navigate to X:\Marin_NHD_2013, HE_Archive folder, MNHD_HE_archive u84_data, USGS_NHDH_20130722_GNIS_flow_li

Using ArcToolbox > Analysis Tools > Extract > Clip, clip it to the buffered polygon. Be sure to put the output in the Home (or default) geodatabase and append "_clip" (this might get truncated, so be careful to keep it all):

USGS_NHDH_20130722_GNIS_flow_li_clip.

It might be a good idea to remove the unclipped layer at this point.
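
For repeat setups, the geodatabase creation, HUC-12 selection, buffer, and clip steps above can also be run as geoprocessing calls instead of through the Catalog and Add Data dialogs. The sketch below is illustrative only; the folder, geodatabase path inside the HE_Archive, attribute name, and where clause are assumptions to be adjusted for the HUC-12 being prepared.

import arcpy, os

work = r"E:\Sausal_HUC12"                                    # hypothetical external-drive folder
gdb = os.path.join(work, "Sausal_HUC12.gdb")
arcpy.CreateFileGDB_management(work, "Sausal_HUC12.gdb")

archive = r"X:\Marin_NHD_2013\HE_Archive\MNHD_HE_archive.gdb\u84_data"   # assumed GDB layout
huc12_pg = os.path.join(archive, "USGS_Marin_HUC12_PG")

# Feature Dataset named for the project projection, taken from the HUC-12 polygons
sr = arcpy.Describe(huc12_pg).spatialReference
arcpy.CreateFeatureDataset_management(gdb, "U84_N88m_data", sr)
fds = os.path.join(gdb, "U84_N88m_data")

# Export the one HUC-12 polygon, buffer it 150 m, and clip the merged HE lines to it
arcpy.Select_analysis(huc12_pg, os.path.join(gdb, "Sausal_HUC12_pg"),
                      "NAME = 'Sausal'")                     # assumed attribute and value
arcpy.Buffer_analysis(os.path.join(gdb, "Sausal_HUC12_pg"),
                      os.path.join(fds, "Sausal_HUC12_150m_pg"), "150 Meters")
arcpy.Clip_analysis(os.path.join(archive, "MNHD_HE_merge"),  # latest merge version (assumed name)
                    os.path.join(fds, "Sausal_HUC12_150m_pg"),
                    os.path.join(fds, "Sausal_HUC12_150m_clip_li"))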

For the clipped FCode and ReachCode lines, go to Properties > Symbology > Categories, set the Value Field to FCode, and Import Symbology from another area that already has the desired symbology saved as a layer file, e.g., Sausal …. diss_li. (Remove All, then Add All Values if necessary.)

Add the similar Symbology to the Local lines. Adjust the colors and sizes so that they are corresponding but visually distinct.

To get the labels for the FCodes, use ArcToolbox > Data Management Tools > Domains > Assign Domain To Field, e.g.,

Input Table: Sausal_HUC12_150m_clip_li
Field Name: FCode
Domain Name: MNHD_FCode_domain2

Click OK

Open the Attribute Table for xxx_HUC12_150m_clip_li. Add two fields. Name the first field GNIS_ID and make it a text field with a length of 10. Name the second field GNIS_Name and also make it a text field, but with a length of 120.
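
Equivalently, the domain assignment above and the two new GNIS fields can be added with geoprocessing calls; a minimal sketch, assuming the example table name from this setup.

import arcpy

table = "Sausal_HUC12_150m_clip_li"
# Attach the FCode coded-value domain so labels show descriptions
arcpy.AssignDomainToField_management(table, "FCode", "MNHD_FCode_domain2")
# Add the two GNIS transfer fields
arcpy.AddField_management(table, "GNIS_ID", "TEXT", field_length=10)
arcpy.AddField_management(table, "GNIS_Name", "TEXT", field_length=120)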

Guiding Principles for Conflation:
If USGS NHDH lines are present where no local HE features exist in HE_Archive, create new line features in xxx_HUC12_150m_clip_li. These will be merged back into the HE_Archive at a later date.

Only conflate our local resolution lines as far as the NHD lines extend. Upper extents are likely to have their own Reach Codes and FCodes generated at a later step.

Introduction VI.
Preparing terrain from LAS

A. Terrain Work Setup. Identify the location of your .LAS and .EEX tiles and the working directory where the grid will be built. Prepare a File Geodatabase with a Feature Dataset that has both projection (horizontal) and position (vertical) defined. The California-Marin County system has made use of US National Grid, a WGS 1984 UTM (meters) projection, with NAVD 1988 (Geoid 2003) position in meters. Also, identify from metadata the projection and position of your LiDAR source data and prepare a Feature Dataset that corresponds to those. If building terrain for just one HUC-12 area, it might be helpful to copy the LAS tiles specific to the HUC-12 into a subdirectory named "LAS_tiles" in the work area.

In ArcGIS 10.1, turn on the LAS Dataset toolbar and use ArcTools > Data Management Tools > LAS Dataset > Create LAS Dataset and save the resulting .LASD in the work directory. The LAS Dataset is a very compact index to the directory of LAS and EEX tiles that were identified above.

When created and included in the map document, the LASD will likely appear as a grid of red-outlined tile boundaries, and if one zooms in far enough, the LASD will be displayed as either points or a dynamically-generated TIN based on selections in the LAS Dataset toolbar. It is even possible to dynamically generate and display contours, which can be exceedingly useful for data QA. Context-click the LASD in ArcMap's Table of Contents and open Layer Properties > Display tab. The Point Limit identifies the number of points at which the display transitions from outlined tile boxes to individual points or a TIN. In the Layer Properties > Filter tab, note how the LiDAR classification codes can be selected by both number and description. Not all classes will be present in every set of data, but Marin County data has contained useful data in classes "2 Ground" and "9 Water".

B. Manually QA LiDAR Class Codes. One key to fine-tuning the import of LiDAR data into a refined terrain is having proper class codes. A major source of contamination can be LiDAR returns from a water surface classified as ground returns. In this case, it may be desirable to manually craft a polygon feature for the known water surface and apply ArcTools > 3D Analyst Tools > Data Management > Set LAS Class Codes Using Features to set points within the polygon to class 9/water.

C. Import LAS to multipoint features. While the LAS Dataset tools can be very useful to explore and to interactively edit classes of points, to perform an ArcHydro analysis on the area it is necessary to derive a gridded surface from the LiDAR point cloud. One path for this conversion is to import selected classes of points from the LAS tiles into multipoint features stored in the File Geodatabase, and then build an ESRI Terrain Dataset (within the Feature Dataset that contains these multipoint features). Once built, the Terrain Dataset can be exported to other formats including TIN and GRID, and that grid can be analyzed by ArcHydro.

Identify the desired LiDAR point classifications by exploring the LAS Dataset and record the numeric expressions. Then run ArcTools > 3D Analyst Tools > Conversion > From File > LAS to Multipoint. If building a very large area, it may be necessary to group the LAS tiles into separate build areas. If doing so, do not scrimp: use an entire tile of overlap between adjacent build areas! This will avoid potentially serious interpolation conflicts at the build area boundaries later. Experience has shown that overlaps as large as 1500 meters are worth the trouble to maintain the integrity of the resulting terrain. If building just one HUC-12 area, for input simply select all the LAS tiles that you copied into your LAS_tile subdirectory; do not also select the associated .EEX files.

For output, name a new Feature Class within the Feature Dataset that was created to correspond to the projection and position of your LiDAR data. Estimate Average Point Spacing from the preview of the specific classes previewed with the LAS Dataset toolbar; for ground class points in hilly forested areas, expect this to be substantially larger than the flight spec.

Input the numbers for the class codes (e.g. 2 and 9 for ground and water). If you have a short list of classes chosen, be willing to take any Return Values that match your choices.

Pick the coordinate system by navigating to the Feature Dataset that you created to match the LiDAR data.

Preferably, you are using the elevation units as delivered in your data. Except in special circumstances, when working with 3D data it is good practice to keep the X, Y, and Z units all the same.

When the conversion is complete, you should have a single multipoint-Z Feature Class in your LiDAR data-matching Feature Dataset. If it is really necessary to reproject, then import your multipoint Feature Class into the second Feature Dataset that matches your project projection.
Modifications to either Z units or Z positioning will require separate steps.
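
A minimal geoprocessing equivalent of the conversion above, assuming a hypothetical LAS_tiles folder, an average point spacing of 1 m, a placeholder spatial reference, and the ground and water classes; the parameter keywords follow the 10.x LAS To Multipoint tool and should be verified against your install.

import arcpy

arcpy.CheckOutExtension("3D")
sr = arcpy.SpatialReference(32610)   # WGS 1984 UTM Zone 10N; use the CRS from your LAS metadata
arcpy.LASToMultipoint_3d(
    r"E:\Sausal_HUC12\LAS_tiles",                                   # copied LAS tiles (hypothetical)
    r"E:\Sausal_HUC12\Sausal_HUC12.gdb\las_native_data\gnd_wat_mptz",  # Feature Dataset matching LAS projection
    1.0,                                                            # average point spacing estimated from the LASD
    class_code=[2, 9],                                              # 2 Ground, 9 Water
    input_coordinate_system=sr,
    file_suffix="las")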

D. Create an ESRI Terrain Dataset. The Terrain Dataset creates a vector data pyramid to permit rapid interactive display of even the largest collections of terrain data, and allows the combination of multipoint, polyline-Z, and clipping polygon features. After the previous step, multipoint features have been loaded into a Feature Dataset. Create a new polygon Feature Class in this same Dataset with a name like "terrain_build_clip_pg", open it for editing, and key in coordinates to create a bounding polygon for the LAS tiles visible in the LASD. This can be done by selecting Create Features > Construction Tools > Rectangle; in the data frame, right-click to check Horizontal, right-click to select Absolute X, Y… and key in a corner such as the upper left, then right-click to Absolute X, Y again and key in the opposite corner such as the lower right. Save Edits.
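
The keyed bounding rectangle can also be built programmatically; a sketch only, with placeholder corner coordinates and the Feature Dataset path from the earlier setup.

import arcpy

arcpy.env.workspace = r"E:\Sausal_HUC12\Sausal_HUC12.gdb\U84_N88m_data"  # hypothetical Feature Dataset
xmin, ymin = 538000.0, 4208000.0   # keyed lower-left corner (placeholder values)
xmax, ymax = 545000.0, 4215000.0   # keyed upper-right corner (placeholder values)
sr = arcpy.Describe(arcpy.env.workspace).spatialReference
ring = arcpy.Array([arcpy.Point(xmin, ymin), arcpy.Point(xmin, ymax),
                    arcpy.Point(xmax, ymax), arcpy.Point(xmax, ymin),
                    arcpy.Point(xmin, ymin)])
# Write the closed rectangle as the terrain build clip polygon
arcpy.CopyFeatures_management([arcpy.Polygon(ring, sr)], "terrain_build_clip_pg")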

If you are fortunate enough to have reliable 3D break lines that are also compatible with your LiDAR data (not the most likely scenario), then ensure that these break lines are positioned in the same vertical datum with the same Z units as your projected multipoint features and add the polyline-Z break line Feature Class to your Feature Dataset.

If there are reservoirs within the analysis area, continue with the Introduction VII section to estimate pre-inundation topography and generate a raster for analysis.

Introduction VII.
Estimating pre-inundation topography under reservoir pools to meet WBD Standards 3.6.4

To support WBD Standards (Fourth Edition, 2013) Section 3.6.4 requirements, when HE features have been estimated from historic maps of pre-inundation creek channels it is further necessary to estimate the inundated topography to properly estimate confluences within the reservoir pool.

If you have hydrologic enforcement (HE) breaklines and wish to use those to create tapering drains through inundated areas such as reservoirs and engineered lagoons, then do not import water-classed LiDAR points within the inundated area from LAS to the Terrain Dataset. This should leave a data gap over the inundated area.

In the reservoir area, copy a moderate number of through-going HE breaklines to provide geometry for the drain paths that could be derived from historic pre-inundation topographic maps. Name these something like Res_spline_li, and run the main channel from the natural stream bed below the dam, and all main channels (those receiving tributary flow) up to the natural bed above the reservoir pool. In Marin County, a compromise path is typically chosen that runs from the downstream creek, follows the spillway over the dam, and then quickly converges back to the historic drainage path. In this way the resulting hydro-enforced drainage properly shows the low-flow path from a full reservoir.

Clip tributary lines upstream at the reservoir pool boundary. Verify that the main channel HE features are continuous, and then densify the vertices of the break lines to fit the grid size (60cm for a 1m grid, 35cm for a 50cm grid) using ET GeoWizards > Polyline > Densify, producing output like ResName_spline_dens35cm_li. Drape the spline on the existing terrain with ArcTools > 3D Analyst Tools > Functional Surface > Interpolate Shape (Interpolate Vertices Only), producing output like ResName_spline_dens35cm_liz. Then run ArcTools > 3D Analyst Tools > Functional Surface > Add Surface Information (with Output Property Z_min and Z_max at least) on the _liz to save elevations along the main channel.

Next, run ET GeoWizards > Convert > Polyline to Point on Vertices and check Calculate point position along polylines (do NOT check Preserve Z(M) or Remove Duplicate Points), with an output like ResName_spline_v35cm_pt (for 35cm densified-vertex points). The ET GeoWizards conversion is necessary so that a fractional length position attribute, ET_ORDER, is calculated.

For the main outlet channel, identify the outflow elevation [bot] and the top elevation that is just slightly below the ground surface, and apply this formula to interpolate Z values along the points for a given flow path:

uppermost elevation = top
bottommost or confluence elevation = bot
ET_M = bot + ([ET_ORDER] * (top - bot))
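
The same linear ramp can be applied per path with the Field Calculator or a short script. A sketch only: it assumes the ET_ORDER fractional position runs from 0 at the downstream (outlet) end to 1 at the top, that a numeric target field exists (the text names both ET_M and, later, HE_Z), and that only the points of the current flow path are selected.

import arcpy

def spline_z(points_layer, top, bot, z_field="ET_M", order_field="ET_ORDER"):
    # Linear interpolation of Z between bot (position 0) and top (position 1),
    # applied only to the currently selected points of one flow path
    with arcpy.da.UpdateCursor(points_layer, [order_field, z_field]) as rows:
        for et_order, _ in rows:
            rows.updateRow([et_order, bot + et_order * (top - bot)])

# Example: main outlet channel from 33.2 m at the outlet to 57.8 m near the pool rim (placeholder values)
spline_z("ResName_spline_v35cm_pt", top=57.8, bot=33.2)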

If several flow paths will be used in the terrain, then interpolate the longest one first, and then use its interpolated values to guide the bottommost elevations of any tributaries that join it. With this approach, each path must be interpolated individually, so select the paths carefully.

As a practical measure, it can be helpful to leave the line features selectable in the map document, then select one channel at a time and use ArcMap > Selection > Select by Location on the _v35cm_pt features to select the points along that channel. Because Duplicate Points were allowed during conversion, in tributaries this selection should get all points along the tributary plus a single point from the main channel at the confluence. If the target HE_Z field has been set to null before the calculation, it is very easy to see the confluence elevation. Copy the confluence Z value into an open spreadsheet and into the ArcMap Field Calculator dialog box; the spreadsheet should have a formula to subtract the confluence elevation from the reservoir pool height so that the ([top] - [bot]) value can be copied into the Field Calculator. Once Z values for one tributary have been solved, select the next tributary's line, repeat Select by Location, and continue through all tributaries being splined within the reservoir.

Once the main channel has interpolated elevations along its length, use its value at confluences as [bot] for the other large channels. Then for all tributaries, use the already-interpolated values of main channels at their confluences as [bot] and the pool height at the shore line as [top], interpolating Z values for all tributaries in turn.

After the interpolations along each tributary have been made, the main channel points from just above the spillway downward are removed, and points that cut through causeways are also removed, to maintain the integrity of dams and causeways in the topographic surface. These flow paths will be restored by hydro-enforcement features before the ArcHydro workflow, but would otherwise create artifacts in topographic contours.

When all spline points have an appropriate Z value, use ArcTools > 3D Analyst Tools > 3D Features > Feature to 3D By Attribute to convert these to 3D points named with a suffix like _ptz (at 10.2 this tool requires all fields to be visible before the conversion to avoid errors), and use ArcTools > Data Management Tools > Generalization > Dissolve with ET_ID (saved from the spline_li OBJECTID) as the dissolve field, naming the output with a suffix like _mptz to suggest that these are Multipoint_Z features.

Use the shoreline feature to create a polygon that matches inundation at the time of the LiDAR flight, and within this area use ArcTools > 3D Analyst Tools > Data Management > Set LAS Class Codes Using Features to set each reservoir inundation area to its own class code. Several ASPRS reserved classes in the 20--30 range should be available for this use. Once set, the LAS files may be tapped with an export of only the reservoir's class in ArcTools > 3D Analyst Tools > Conversion > From File > LAS to Multipoint to yield a single (large) multipoint Feature Class from all of the reservoir's associated LAS tiles; be certain to apply a processing extent that matches the rectangle of all associated LAS tiles, to filter out extraneous LiDAR returns that might be mislocated at [0,0] or elsewhere far away. Then, by applying ET GeoWizards > Convert > Multipoint to Point, each point can be viewed with its associated elevation in the ET_Z attribute. Simply opening the attribute table, right-clicking on the column, and viewing Statistics can show the mean Z and standard deviation of the reservoir surface. Once the approximate height of the reservoir is known, consider selecting or editing out LiDAR returns with wildly extraneous elevations (near 0 or very high) so that the surface standard deviation is not much more than the survey's vertical 95% RMSE. This yields a useful mean pool height for the reservoir as surveyed.
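
The pool-height statistics can also be pulled with a short script instead of the attribute table; a sketch assuming a hypothetical point Feature Class name, the ET_Z attribute from the Multipoint to Point conversion, and assumed bounds for discarding extraneous returns.

import arcpy
import numpy

arr = arcpy.da.FeatureClassToNumPyArray("Nicasio_pool_ETZ_pt", ["ET_Z"])
z = arr["ET_Z"]
z = z[(z > 1.0) & (z < 300.0)]   # assumed bounds: drop returns near 0 or wildly high
print("mean pool height: %.2f   std dev: %.3f" % (z.mean(), z.std()))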

In a similar way, use ArcTools > 3D Analyst Tools > Conversion > From File > LAS to Multipoint with only Class 2 (ground points) to export a single (large) multipoint Feature Class of all ground points in the patch extent; be certain to apply a processing extent defined by the grid-aligned clip rectangle. This provides the bulk of the bare-earth masspoints around the reservoir. Consider a name that includes the survey and the patch area as well as the selected class, such as ARRA_Nicasio_cl2_mptz. This convention helps to clarify the Terrain Dataset components.

Densify the vertices of the shore line Feature Class to fit the grid size (60cm for a 1m grid, 35cm for a 50cm grid) using ET GeoWizards > Polyline > Densify, then ArcTools > Data Management Tools > Features > Feature Vertices to Points for all vertices, and add a Double field for elevation. Calc into that field the mean pool height from the previous paragraph. Then run ArcTools > 3D Analyst Tools > 3D Features > Feature to 3D By Attribute using this just-calculated field as the Height Field. In the resulting Feature Class, identify a Long integer field and repurpose it for a multipoint dissolve by calcing in the following formula:

diss = 1 + ( [ObjectID] / 3500).

Then, use ArcTools > Data Management Tools > Generalization > Dissolve on the Point_Z input features with the just-calced field as the Dissolve Field. This yields a Multipoint_Z Feature Class that can efficiently participate in the Terrain Dataset that will be used to estimate the pre-inundation topography.
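
In geoprocessing terms, the grouping formula and the dissolve are two calls. A sketch with a hypothetical shore-point Feature Class named per the convention above; it assumes the diss field has already been added as a Long integer. The integer division groups roughly 3500 points per dissolve value, so each resulting multipoint stays a manageable size.

import arcpy

shore_ptz = "ARRA_Nicasio_shore_v35cm_ptz"   # hypothetical Point_Z shoreline vertices
# Python 2 integer division bins OBJECTIDs into groups of about 3500
arcpy.CalculateField_management(shore_ptz, "diss",
                                "1 + (!OBJECTID! / 3500)", "PYTHON_9.3")
# Dissolving points on the diss field produces Multipoint_Z output
arcpy.Dissolve_management(shore_ptz, "ARRA_Nicasio_shore_mptz", "diss")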

When available, build a new Terrain Dataset from the LiDAR ground-classed points, the densified shore points, and the channel and tributary points (all multipoint-Z features), plus the grid-aligned rectangular clipping polygon as a soft clip feature. When the terrain has been built, use ArcTools > 3D Analyst Tools > Conversion > Terrain to Raster to express the new patch in a compatible raster form. With the newly-created patch terrain as input, consider an output name like ResName_patch_tbsm50cm.img for ERDAS Imagine format, with Output Data Type FLOAT, Method NATURAL_NEIGHBORS (typically), Sampling Distance CELLSIZE 0.5, and Pyramid Level Resolution 0.
In Environmental Variables > Processing Extent, use the grid-aligned rectangular clipping polygon and refer to the base countywide tbsm50cm grid in Snap Raster. In Environmental Variables > Raster Analysis > Cell Size, again refer to the countywide tbsm50cm grid and verify that the size is 0.5; also set the Mask to the clipping polygon. In Environmental Variables > Raster Storage, consider setting the pyramid resampling to BILINEAR and the Resampling Method to BILINEAR, and deliberately set NoData by replacing the default NONE with -150 to match the countywide NoData definition.
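
A scripted equivalent of the Terrain to Raster step, with the key environments set first. This is a sketch only, with hypothetical dataset names and paths; the NoData replacement described above is left to the tool dialog.

import arcpy

arcpy.CheckOutExtension("3D")
arcpy.env.snapRaster = r"D:\Marin_terrain\tbsm50cm"   # countywide grid (assumed path)
arcpy.env.cellSize = 0.5
arcpy.env.extent = "ResName_clip_pg"                  # grid-aligned clip rectangle (path or layer)
arcpy.env.mask = "ResName_clip_pg"
arcpy.env.resamplingMethod = "BILINEAR"

arcpy.TerrainToRaster_3d("ResName_terrain",
                         r"D:\Marin_terrain\ResName_patch_tbsm50cm.img",
                         "FLOAT", "NATURAL_NEIGHBORS", "CELLSIZE 0.5", 0)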

When the 50cm terrain patch has been generated, immediately run ArcTools > Spatial Analyst Tools > Surface > Slope to generate a Percent Rise image for QA. Examine the reservoir areas for interpolation block anomalies that are present in ArcGIS 10.2 Terrain to Raster. If necessary, define a small patch rectangle for just near the affected area(s) and regenerate Terrain to Raster over that smaller area. Repeat the slope image generation and QA the new result.

When complete, the patched terrains for each reservoir can be merged back into the countywide terrain using ArcTools > Data Management Tools > Raster > Raster Dataset > Mosaic To New Raster. The current countywide terrain should be entered into the list first (which will place it on the top of the list), then add the patches for various reservoirs later in the list. Choose a reasonable output location, as the result will be as large as the original countywide terrain. Include the updated date as part of the new tbsm50cm name. Pixel_Type must be 32_BIT_FLOAT, cellsize must be 0.5 and Number of Bands is 1. If the countywide terrain was listed first, then Mosaic Operator of LAST will work as intended to patch the reservoirs.
For safety, in Environmental Variables, use the countywide terrain in Processing Extent, Snap Raster, Raster Analysis > Cell Size and Mask. Choose Raster Storage > Bilinear pyramids, Bilinear Resampling Method, and manually set NoData to -150 for consistency (in Marin County).
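
The mosaic step as a single call; a sketch with hypothetical file names, listing the countywide terrain first so that the LAST operator keeps the reservoir patches on top.

import arcpy

arcpy.MosaicToNewRaster_management(
    [r"D:\Marin_terrain\tbsm50cm_20140101.img",       # current countywide terrain, listed first
     r"D:\Marin_terrain\Nicasio_patch_tbsm50cm.img",  # reservoir patches follow
     r"D:\Marin_terrain\Kent_patch_tbsm50cm.img"],
    r"D:\Marin_terrain", "tbsm50cm_20140704.img",
    pixel_type="32_BIT_FLOAT", cellsize=0.5,
    number_of_bands=1, mosaic_method="LAST")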

When resampling the 50cm terrain, use ArcTools > Data Management Tools > Raster > Raster Processing > Resample with BILINEAR resampling, provide the desired output resolution such as 1.0 or 2.0 meters, and use the exact same output cell size in both X and Y to keep square pixels in the output. Verify that the resample has not shifted features relative to the original 50cm grid, so that the model-derived flow line and watershed features align precisely even when they are derived from a 1m resampled version of the original terrain. Avoid using the output from ArcTools > Spatial Analyst Tools > Neighborhood > Focal Statistics to directly perform a resampling, as Spatial Analyst's implementation of kernel convolution appears to assign kernel results to a corner pixel, shifting the output by half the kernel dimension. The Data Management Resample tool does not perform convolution filtering, and it properly registers the resampled image with the original.
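
And the final resample, keeping square cells and grid alignment; a sketch with hypothetical file names.

import arcpy

arcpy.env.snapRaster = r"D:\Marin_terrain\tbsm50cm_20140704.img"   # preserve alignment with the 50cm grid
arcpy.Resample_management(r"D:\Marin_terrain\tbsm50cm_20140704.img",
                          r"D:\Marin_terrain\tbsm1m_20140704.img",
                          "1 1", "BILINEAR")   # same X and Y size keeps square pixels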