USGS Multimedia Gallery
The National Enhanced Elevation Assessment: Preliminary Findings
Mark DeMulder: Our next speaker is Mr. Larry Sugarbaker. Larry Sugarbaker is the National Geospatial Program's Senior Adviser for the U.S. Geological Survey. Larry works on the National Map Policy Formulation and on New Initiatives, and actually what he's going to be speaking with you this morning about is what we hope will become a new initiative in the National Geospatial Program.
He has led major studies to understand customer requirements of the National Map and is currently leading a new study to assess requirements for a National Enhanced Elevation Program.
Prior to joining the USGS in 2007, Mr. Sugarbaker was Vice-President and Chief Information Officer for NatureServe, an international non-profit conservation organization. Mr. Sugarbaker worked in the State of Washington Department of Natural Resources for 22 years where he managed the Geographic Information System and also supported remote sensing and forest inventory functions.
Mr. Sugarbaker graduated from the University of Michigan School of Natural Resources with a Bachelor of Science in Forestry in 1977 and completed a Master of Science degree in Remote Sensing and Wildlife Management from the University of Michigan in 1979.
So please join me in welcoming Larry to the podium.
Larry Sugarbaker: Good morning. It is a pleasure to be here today. And my purpose is to give you some insights into the assessment that has been ongoing since last August to determine national level requirements for what we are calling 'enhanced elevation'.
And what I need to explain to you is that when we talk about enhanced elevation, we are referring not just to a bare earth digital elevation model but to the raw remote sensor data that are collected and used to derive elevation data, and to the dozen or so derivative products that come from it.
It is important to us that we understand within the National Geospatial Program what our customers need to accomplish their business activities. I think that the talk that you just heard is an excellent example of where there are national-level mission-critical needs to improve the elevation data for our nation.
We also know that elevation data are very expensive to collect, and we know that it's going to take a broad community of support in order to achieve these objectives. So we do have a very strong outreach strategy to share the results of the work and to build communities where it is necessary to move this forward.
The project, which, as I said earlier, was initiated in August of last year, is sponsored by the 14 member federal agencies of the National Digital Elevation Program Committee. So as such, we already have a community of support that is embodied in those 14 agencies.
Four federal agencies stepped up to fund the project: the U.S. Geological Survey, the National Geospatial-Intelligence Agency, the Federal Emergency Management Agency, and the Natural Resources Conservation Service.
The study is a fairly extensive assessment of these mission-critical-level requirements and it has taken some fairly significant commitments on the part of these agencies to make this happen. The USGS is the managing partner for the study, and so we're providing the leadership to make sure that the study is completed.
And then I want to acknowledge one other federal agency, and that is NOAA. They have done an extensive amount of work in understanding coastal issues of the United States, and they have also done a significant amount of work on inventories of existing elevation-related data, such as LIDAR and IFSAR data, that are available to support these studies. So they have been helping us with understanding sort of the state of the inventory that we have been working on as well.
And then also, we have leaned on many federal agencies, 31 of them, all 50 states, and many local governments and regional governments to respond to our call for information about their requirements.
So one of the first things we need to be able to do, for requirements like climate change, is understand the business needs of the organizations, our customers, that we support through the National Map. And so when I talk about business needs, I'm talking about those activities that are performed in support of some mission-critical activity within their respective organizations.
And so, for example, those include things like fault mapping within the U.S. Geological Survey, or landslide inventories that are conducted in many states and federal agencies, or flood risk modeling, which is required in order to do flood risk mapping within FEMA and to assess vulnerability to other kinds of threats across the United States.
And then once we understand these requirements, we need to be able to develop cost-effective program implementation strategies and we need to look at alternatives for what the best model might be to actually pull this off.
And we have to be able to answer key questions. Is it more cost-effective for the government to manage these activities within the context of a national program?
We have a very effective partnership model that's in place today in which we work with many of the states. And as you will see later, as I show you the inventory you will see that many of the datasets that have been collected to date have been the result of state-led activities.
And so without these partnerships, the cost of the federal government would be much more than it is today. We need to figure out a program implementation strategy that really works and preserves that partnership relationship in order to be successful.
And then finally, we need to know that we have a cost-effective solution. Congress in these austere times expects that we are able to justify any new investments in this kind of data, and we have to have a credible story and justification in place in order to move a program like this forward.
There are many stakeholders for elevation data, as we are learning. All 50 states are participating in the study, and selected local and tribal governments are participating as well. We are near completion of the federal data collection component of the study and have received input now from over 30 federal agencies.
The private sector, such as the forest products industry, the development sector, the energy sector, and others, has a vital interest in assuring that there are high-quality elevation data available to support their work.
Now, we also need to turn to regional organizations and professional organizations. These are important for our outreach strategy and to provide that community of support which is necessary for us to be successful.
And so I wanted to just give you a little bit of background first before I tell you the details of the study and the results, and that is, first, I wanted to share with you this map of the United States showing the current status of the elevation data, the surface model elevation data that exists for the country.
There has been a program in place for many years. But if you look at this map and you were to study it, the very light areas in there may have elevation data that are as old as 90 years. 1923 is the oldest dataset within the national elevation dataset.
In the last 10 years, however, the program has really moved more aggressively towards modernizing this data, and those are sort of the olive-colored areas and the burgundy-colored areas that we see on the map. About 20% of the country today has more current elevation data available.
But that doesn't always mean that we have all of the derivative products that are required to support the needs such as the raw point cloud data from LIDAR, which is necessary to do biomass calculations to support climate change adaptation.
The reality is, it has always been expensive to collect elevation data. And the way it was done 50 years ago, 40 years ago, and even 30 years ago was for an individual to sit at an instrument looking at stereo pairs of photography and pushing the dot across the landscape to generate contour after contour after contour.
That's why it took over 50 years to develop the first national elevation dataset, which was represented in the form of contours on USGS quadrangles. Subsequent to that, those contours were scanned and turned into the first digital elevation models for the country.
So the technology is changing, and today there are two technologies of choice being utilized to support elevation source data collection.
The first one is LIDAR. I think you've probably all heard of these. There are some significant benefits to LIDAR in that it is an active system which shoots a laser beam onto the ground and collects many points per square meter that are very accurate to depict the profile of the surface of the earth.
They also capture the profile of what sits on the surface of the earth, so we get the vegetation, the buildings, and in some cases more detailed information than that. It has some disadvantages in that it cannot see through the clouds.
And this is where the second technology of choice comes in, and that is IFSAR. IFSAR is a radar-based technology flown from aircraft, as is LIDAR, but it has a couple of distinct advantages. One is that it is able to penetrate cloud cover, and it has a significantly lower cost of acquisition than LIDAR.
The disadvantage is that the level of detail collected from IFSAR is lower, so the number of applications that the data can support is smaller, although some other applications can be supported with IFSAR as well.
So it would be difficult for us to conclude today which of those technologies is the most important and which one we actually need to support these mission applications.
Here is an example of a LIDAR point cloud image that was provided to us by the U.S. Forest Service. And I'll just point to the lower right hand corner there; those linear features that you're seeing are actually power lines. So that gives you a sense for the level of detail that you can capture and the profile of the surface that might be created.
Traditionally, from the mapping perspective, we think about utilizing LIDAR or other raw remote sensor data to generate an elevation grid or a surface elevation model. And we also may utilize that to assist in land cover classifications and in detecting structures or extracting the 3D profile of structures from an image.
It's also used to rectify imagery so that we can have a map view or orthoimagery to look at as background on our maps, or to create contours on, say, a topographic map, and to improve the quality of the hydrography by being able to automatically calculate flows and things like that.
But when we talk to the scientists in the federal agencies, they have a little bit different view about what it means to really support their elevation data requirements. They want to be able to do things like understand where the biomass is so that they can monitor the carbon sequestration and rate of removal of that biomass and what the effect of that is on the atmosphere.
They want to be able to do coastal studies and look at very detailed elevation profiles, or to do earthquake fault studies. I saw a presentation recently where they used very detailed elevation data, like 8 points per square meter, to determine the historical change of a fault, or the slip of that fault, to understand what kind of an earthquake it was and what kind of threat it poses to populations that live nearby.
The other thing that we are trying to understand much more effectively is what it means to be able to move from this patchwork quilt of high-quality elevation data that supports many project-by-project case studies to a national view. We know that we need to be able to scale our studies to think about what's happening to our country and to our earth on a national, more global basis.
That's critically important to climate change. It is critically important to the advancement of science and technology and other major activities across the country. These are the kinds of benefits that you can't experience if you don't have full complete coverage of high-quality elevation data.
I would just like to mention that much of this work is being done under contract. Dewberry is the contractor that's doing much of the requirements work for us. They have just been an excellent partner, working with us to systematically address this need with each of the agencies and the others.
They are also being tasked to perform a cost-benefit analysis in which we are trying to understand the benefits of having data to support each of these business requirements, and then to make informed assessments of what a good national program would look like, such that we have the justification that I was talking about earlier in order to support an initiative to move this kind of program forward.
One of the things that we are doing is trying to get a very good understanding of the standing inventory of elevation data, both LIDAR-based and IFSAR-based, that are publicly available today.
And so what you're looking at here is the preliminary results of that inventory. And a lot of the slides that you'll see that I'm showing have a big 'DRAFT' splashed across there. So what that really means is that there are some errors in the image that you're looking at, and we know that and we are in the process of validating those right now. For the data inventory, we will have that validation completed by the end of June, and for much of the other work, we're rapidly completing that validation process as well.
So all of the areas that, I guess, look red on your screen are areas where we already have LIDAR data that are publicly available. That's really a significant amount of data when you consider most of it was collected in the last 10 years. The blue areas are where we have photogrammetric-based, more current data compiled. And then the pinkish areas that you see in sort of the Southern California coastal area are publicly available IFSAR data.
Now, what I want to do is just quickly do a couple of overlays here. The yellow area you're looking at is the National Elevation Dataset's one-ninth arc-second data, that is, the three-meter grid data that is housed within the USGS. And then the other colors are various datasets that are either in the planning stage, in the acquisition phase, or may have recently become available.
So you can see there are some opportunities to expand the national elevation dataset, but the reality is there are still large gaps across the country where no data are being collected at the moment.
We did collect information, and we're just wrapping up this part of the study, from 31 federal agencies. The impressive part of the study is that every single federal agency that we contacted responded. All but one of the federal agencies identified mission-critical activities for new and improved elevation data.
And so what I want to do is just sort of go through about seven or eight of these very quickly, very quickly because I have exactly five minutes left, but I just wanted to give you sort of a snapshot view of the kinds of things that are being identified as requirements for this elevation data.
And this first one, you saw this earlier from Joel. In order to support sea level rise vulnerability assessment, they need what is called Quality Level 2 data in all of the coastal areas.
So, quick primer here.
When we assess requirements for elevation data, we ask three basic questions. We actually ask more than that, but here are the ones that you'll see. We ask what quality level of data you need, Quality Level 1 being the highest quality and Quality Level 5 being the lowest quality that we actually assess requirements for.
Quality Levels 1, 2, and 3 are different density levels of LIDAR-sourced data. Quality Level 4 is photogrammetrically compiled data. And then Quality Level 5 would be data derived from IFSAR. We have a table that sort of illustrates what the different densities are and the vertical and horizontal accuracies that you can expect from each of those technologies. So for this particular study, they needed Quality Level 2.
We also ask, where do you need it? And that's what you're seeing here with the map. And then finally we ask, how often do you need it? And so for this particular requirement, you can see that the frequency is every 6 to 10 years; they need a recollection of that data in order to improve the understanding.
Also, for environmental protection, land cover characterization and runoff modeling, they need data nationally to meet these business requirements.
The Bureau of Indian Affairs, like all of the other federal land management agencies, is required to do land management plans every 10 to 15 years. And when they do that, they need to be able to consider the ecological implications of these plans not just on private lands but in any watershed that touches private land.
And one of the things that you'll see here is a lot of yellow in the western states. And remember, when we saw the inventory, we saw a lot of blank space in the western states. These are mission-critical requirements for the Bureau of Indian Affairs.
The Bureau of Reclamation needs to be able to monitor river flows and fish habitats. So this sounds like sort of a wild requirement, but they really need to have a detailed profile of the elevation to inform these studies. And in the western states primarily is where they have this requirement.
The Fish and Wildlife Service also needs to be able to support their research on endangered species and fisheries and habitat conservation. Nationally, they require data like this.
And I'll go on down the list. The National Geospatial-Intelligence Agency needs 133 cities of high-quality elevation data every four to five years. The Department of Energy needs to understand the effects of climate change and other hazards on population dynamics and what they need to do in the event of a national emergency.
The Tennessee Valley Authority needs to be able to understand the vegetation profile as it relates to power transmission line planning and power line transmission protection. So trees fall over, they hit the power lines; they need to be able to monitor on a regular basis what the vegetation profile around power lines looks like. And they also need to be able to do other kinds of assessments which require high-quality elevation data across those entire areas.
And so there is a significant need in coastal areas and in coastal states to understand the effects of climate change and other natural hazards, such as tsunamis and hurricanes, as they affect the coastal areas.
The Federal Emergency Management Agency has one of the greatest needs in order to support their risk map program. They have prioritized every watershed in the country for its vulnerability to flooding as it relates to population densities, and they've identified their requirements for detailed elevation data in those watersheds to support that program.
A sister program within FEMA, the Flood Insurance Rating Program, also requires elevation data. And the Environmental Protection Agency requires Quality Level 5 data for broad-area air and water quality research.
And so it looks a little bit like this. And what I showed you are about eight or nine examples of the hundred mission-critical activities that we've identified across federal agencies, and this is just a small sampling. All of these programs are multi-million-dollar programs, and some of these programs have budgets that are in the billions, the multi-billion-dollar range annually.
So I don't need to show you the NRCS requirement for elevation data. They provide assistance to farmers across the country, and much of the work that they do is on-the-ground kind of work. If they were able to do much of that analysis in the office, they would be able to serve many, many more farmers more effectively.
So it is like a puzzle that we're putting together. And it's like a puzzle in more than one way. But I kind of look at it like this, and this is where I think we're at. You know, when you put a puzzle together, you go for the thousand-piece puzzle. This is the granddaddy of puzzles, right? But you start out with the corners, and then the edges, and maybe it takes two or three minutes to put each piece of the puzzle together.
But guess what? When you've got about 700 or 800 pieces in place, that last 200 pieces is really easy. So where are we at now? We're working on the edges, right? We're working on the edges. We're doing the hard part. We're figuring out what those pieces are, and we kind of know what the puzzle is going to look like and we're just starting to get the sense that, 'Ah, yeah, I think we can do it.'
And that's what this requirements assessment is about. We need to be able to provide, through the National Geospatial Program, elevation data services that meet an evolving need of a very large customer base. We know that that will lead to spatial data quality improvement.
Babs will be talking about her experience and research in cartographic presentation and the effects of data quality on the ability to produce high-quality cartographic presentations.
We need to be able to think about how we can integrate data together across different kinds of datasets.
Babs and I worked on a research study with the Mapping Science Committee over 10 years ago now, right? And one of the things we talked about in that study was the importance of having foundational data that in turn drive data quality for all of the other kinds of data.
Well, guess what? Those foundational datasets are geodetic control, imagery and elevation. That's what it takes to build uniform, consistent quality datasets across the country.
We also know that part of this puzzle includes meeting a broader science community need that has never been identified before.
And then finally I think what we're going to see is new services are going to be fueled by private sector innovation. The reality is that we have patchwork coverage of elevation data. We don't see private sector innovation move as quickly as where we have uniform coverage of these datasets everywhere. Because once they can build applications on top of these things, new innovation occurs, and new things begin to happen.
So we're going to see a synergistic effect on the advancement of geosciences, which may be led by a dramatic improvement to the baseline or framework of data services across the country. So I think this is a corner piece of the puzzle, right? It's a corner piece.
Thank you very much.
Title: The National Map Users Conference: The National Enhanced Elevation Assessment: Preliminary Findings
The U.S. Geological Survey (USGS) sponsored the inaugural The National Map Users Conference (TNM UC) in conjunction with the eighth biennial Geographic Information Science (GIS) Workshop on May 10-13, 2011, in Golden, Colorado. The National Map Users Conference was held directly after the GIS Workshop at the Denver Marriott West on May 12-13. The focus of the Users Conference was on the role of The National Map in supporting science initiatives, emergency response, land and wildlife management, and other activities.
The National Map Users Conference Experience: Short interviews of Conference attendees. (4:29)
Opening remarks and plenary speakers, Thursday, May 12, 2011 (UC Day 1)
Award Ceremony: Tommy Dewald of the Environmental Protection Agency (EPA) and Keven Roth, “semi-retired” from the USGS, are the co-recipients of this year’s Henry Gannett Award, presented by Marcia McNutt, Director of the USGS, and Alison Gannett, great-niece of Henry Gannett. Roth and Dewald were cited for their development of the National Hydrography Dataset (NHD). (30:03)
Continuing remarks and plenary speakers, Friday, May 13, 2011 (UC Day 2)
Closing Session: “What You Said: Shaping the Direction of The National Map”. (33:50)
Selected Sessions, Thursday, May 12, 2011 & Friday, May 13, 2011
Location: Golden, CO, USA
Date Taken: 5/13/2011
Video Producer: Michael Moore, U.S. Geological Survey
Note: This video has been released into the public domain by the U.S. Geological Survey for use in its entirety. Some videos may contain pieces of copyrighted material. If you wish to use a portion of the video for any purpose, other than for resharing/reposting the video in its entirety, please contact the Video Producer/Videographer listed with this video. Please refer to the USGS Copyright section for how to credit this video.