
5th AGILE Conference on Geographical Information Science, Palma (Balearic Islands, Spain), April 25th–27th 2002

Various Data Sources of Different Quality for DTM Production

Tomaž Podobnikar Scientific Research Centre of the Slovenian Academy of Sciences and Arts Gosposka ulica 13, SI-1000 Ljubljana, Slovenia e-mail: tomaz@zrc-sazu.si

Keywords: digital terrain model, surface, heights, geographical information system, quality control, statistical analysis.

Introduction

The form of the Earth’s surface can be described with a model, recorded as a continuous and usually smoothed surface. Surfaces are defined by a finite set of heights, measured with respect to mean sea level. Such models existed in analogue form until forty years ago; today they are known as digital terrain models (DTMs). The basic principles for the management, acquisition, recording, updating, spatial analysis, visualisation, and integration of such models with other systems are well known. Despite this, new developments provide opportunities to improve DTM modelling.

The main goal of this article is the integration of existing data of different quality, implemented with suitable technology, for high-quality DTM production. The methodology concerns the production of DTM surfaces with respect to the differing quality of the source data. Besides a contribution to digital terrain modelling as a science, an applied aspect of the study was developed. It therefore includes practical suggestions for data selection, pre-processing and evaluation decisions, the possibilities and limitations of particular quality-control techniques, interpolation parameters, etc.

One of the technological challenges of DTM production is the ever larger amount of digital data now available, which generally contains much semantic information about surface geomorphology. Despite this, the mass of data does not at first seem useful for DTM production. The problems mostly lie in its non-homogeneous quality, in the different methods and standards used for quality estimation, and in its topological structure, which make it unsuitable for terrain surface modelling. The differing data quality is mainly an outcome of the acquisition method: data is acquired directly or, more frequently, indirectly from generalised analogue sources. Different estimation methods can introduce unpredictable gross and systematic errors. Possible solutions to these problems, confirmed through applied experimentation, enable cost-effective, high-quality production. The article describes how these data sources can serve as a basis for high-quality DTM production without requiring additional data acquisition.

DTM Modelling

The main part of the article is devoted to two DTM production methods. The first is simultaneous interpolation of the sources. The second is the weighted sum of sources with geomorphological corrections. Both methods rely on substantial knowledge of the sources used and of the theory of DTM modelling. The stages of DTM production were divided into:

preparation,

pre-processing,

processing,

(management of DTM data)

Preparation

Preparation for DTM processing is important for identifying problems which may appear during DTM production. Besides theoretical activities, the practical act of modelling figures prominently in the article.

Automatic regionalisation should also be mentioned among the complex methods that support DTM production. Besides the preparation for DTM processing, regionalisation was used for metadata monitoring, for pre-processing, and for processing. It was derived from regionalisation variables, divided according to natural and anthropogenic characteristics of the surface and with regard to the sources and DTM quality. The variables were generated from different cartographic sources and spatial databases. Regionalisations were produced from the variables by models (with mostly empirical parameters) based on map algebra operations. They considered surface type and roughness (flat surface, hills, karst, mountains, etc.), the vegetation (which influences data accuracy), the available datasets (data is non-homogeneously distributed over Slovenia), previous knowledge about the real terrain surface, and the quality of the sources. Within preparation, the sources were coded with appropriate parameters according to the type of source.

Pre-processing

Pre-processing was based on detailed visual and statistical quality control of the sources, supported by automatic regionalisation. The data was also statistically and geomorphologically corrected and improved.

An assessment of the impact of overall source quality was derived using visual controls. Other aspects of the data’s nature were investigated with statistical methods. It was assumed that a ‘global’ method for data testing could be found that would be effective for any geographical area. The following steps were followed for the statistical methods:

elimination of gross errors from reference points using reference DEMs (DTMs as grids)

statistical evaluation of non-point sources with reference points

systematic error elimination from non-point sources

For the elimination of gross errors from the reference points, a high-quality reference DEM was produced. A weight coverage was then processed as another key dataset. The weight coverages were based on prior knowledge of the nature of random deviations of the reference DEM. Regionalisation for this task considered terrain roughness, karstic features, vegetation, hydrological networks, standing water and other flat areas, transport networks, and operator error. A ‘method of variances’ was used to eliminate gross errors from the reference points. The method compares the absolute difference between a reference point and the reference DEM with a threshold (permitted distance) relating to the weight coverage. When the threshold is exceeded, the point is considered to contain a gross error; it was marked and eliminated.
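As a sketch, the threshold test of the ‘method of variances’ can be illustrated as follows; the point heights, DEM heights and permitted distances are hypothetical sample values, not data from the study:

```python
# Hypothetical sample data: point heights, reference-DEM heights sampled at
# the same locations, and the locally permitted distance from the weight coverage.
points = [312.4, 298.1, 305.0, 355.7, 301.2]
dem = [312.9, 297.8, 304.1, 310.2, 301.0]
threshold = [3.0, 3.0, 3.0, 5.0, 3.0]

# Mark a gross error wherever the absolute difference between a reference
# point and the reference DEM exceeds the permitted distance, then drop it.
gross = [abs(p - d) > t for p, d, t in zip(points, dem, threshold)]
cleaned = [p for p, g in zip(points, gross) if not g]
```

Here the fourth point deviates by 45.5 m from the reference DEM, well beyond its permitted distance, and is eliminated.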

The corrected reference set then allowed the statistical evaluation of the non-point sources. After initial analyses, standard regions for metadata monitoring were used. The most important categorisations were selected with respect to terrain roughness, slope, terrain skeleton (sinks, valleys, saddles, ridges, and peaks), and vegetation. The most important parameters were the RMS error, which, while coarse and approximate, quantifies the random error, and the mean error, which corresponds to the systematic error.

With very precise parameters for source quality, the systematic errors could be eliminated from the non-point sources. For this task, the mean-error parameters derived from the statistical evaluation were used. Considering these parameters and the standard regions, every source was corrected separately. The results of a further statistical evaluation showed that systematic error elimination is necessary for DTM processing with data of variable quality.
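A minimal sketch of the per-region correction, assuming hypothetical regionalisation classes and hypothetical source-minus-reference differences:

```python
from statistics import mean

# Hypothetical differences (source height minus reference height), grouped
# by regionalisation class; their mean estimates the systematic error.
diffs = {"flat": [1.1, 0.9, 1.0], "mountain": [-2.2, -1.8, -2.0]}
mean_error = {region: mean(d) for region, d in diffs.items()}

# Correct each source height by subtracting the regional mean error.
source = [("flat", 101.0), ("mountain", 843.5)]
corrected = [(region, h - mean_error[region]) for region, h in source]
```

After the correction the regional mean of the differences is driven towards zero, while the random (RMS) component remains.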

Processing

Following the evaluation of DTM production, two methods for DTM processing were developed and tested:

a method of simultaneous sources interpolation (producing the DTM and from it the DEM),

a method of the weighted sum of sources – with geomorphological enhancement (direct processing of the DEM).

The method of simultaneous source interpolation is based on interpolating all suitable sources together. The interpolation method used is known as ‘least squares interpolation’ (Kraus 1998), which is based on computing the minimum variance, or ‘linear prediction’; in geostatistics the method is known as ‘kriging’. The interpolation method is local and uses so-called computing units. During interpolation it finds an appropriate function (theoretical surface) by considering particular points to which filter values are assigned. The filter values also control the degree of surface smoothing. Different filter values (as variances or weights) were assigned according to the data codes (characteristics) and quality (different height precision).
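The full least-squares interpolation is beyond the scope of a short example, but a much-simplified sketch of linear prediction (simple kriging on de-meaned heights) shows the role of the filter values. The Gaussian covariance model and all parameter values below are assumptions for illustration, not the article’s settings:

```python
import numpy as np

def linear_prediction(xy, z, filt, x0, sill=25.0, rng=500.0):
    """Predict the height at x0 from points (xy, z).
    filt: per-point filter values (variances) added to the diagonal -
    larger values let the surface deviate more from a point (smoothing)."""
    d = np.linalg.norm(xy - xy[:, None], axis=-1)          # point-to-point distances
    C = sill * np.exp(-((d / rng) ** 2)) + np.diag(filt)   # covariance + filter values
    c = sill * np.exp(-((np.linalg.norm(xy - x0, axis=1) / rng) ** 2))
    zm = z.mean()                                          # crude mean removal
    return zm + c @ np.linalg.solve(C, z - zm)

xy = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0]])
z = np.array([200.0, 210.0, 190.0])
h = linear_prediction(xy, z, np.array([0.1, 0.1, 0.1]), np.array([90.0, 10.0]))
```

With a small filter value the surface passes close to each point; raising a point’s filter value lets the prediction smooth over that point, which is how differing height precision can be expressed.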

Techniques similar to selective and progressive sampling of data (Makarovic 1973; Makarovic 1977) play a very important role. This approach provides an alternative way to weight the sources, with higher weights for denser data distributions. The processed results have not been analysed very precisely, but some results were excellent, although requiring considerable effort.

The method of the weighted sum of sources (with geomorphological enhancement) is easier to employ and is (raster) GIS-oriented. In this case, interpolation methods are very important during pre-processing, with respect to the nature of the sources and the final DTM resolution. The final resolution chosen for the DTM was 20 m, and all data sources were interpolated to this resolution. Spline interpolation is appropriate for resampling the 100 m DEM to a 20 m grid, because it provided a high-quality source. For the DEM with a resolution of 25 m, bilinear interpolation was better (Longley et al. 2001), and for contour lines a LIDIC interpolation method (Jaakkola and Oksanen 2000) was applied, with many improvements over the basic method. The improvements were necessary to enable interpolation where contour lines were missing. The main advantage of the weighted sum of sources is that it is a natural extension of the statistical pre-processing. The principal steps to produce a DTM (DEM) with this method are:

mosaicing to produce a basic DEM

‘parallel’ weighted sum of secondary data

geomorphological enhancement

reference point consideration, and

monitoring of result quality following the weighted sum of data

A basic DEM was first constructed as a mosaic of sources. For this task, sources that cover the area most homogeneously were chosen.
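The mosaicing step amounts to a per-cell choice of the best available source. A toy sketch with hypothetical heights, where NaN marks cells the preferred source does not cover:

```python
import numpy as np

dem25 = np.array([[100.0, np.nan], [102.0, 103.0]])   # preferred source, partial cover
dem100 = np.array([[101.0, 99.0], [101.5, 104.0]])    # fallback source, full cover

# Take the preferred source where it is present, the fallback elsewhere.
basic_dem = np.where(np.isnan(dem25), dem100, dem25)
```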

With the ‘parallel’ weighted sum, other, secondary sources were included in the DTM production. Most secondary sources were local. To produce the weight coverages for the data sources, a method similar to that for systematic error elimination was used; the difference was that the parameters for random errors were also taken into consideration. Higher weights were assigned locally to the source with the lower random error, and for every combination of sources a weighted arithmetic mean was calculated. Thresholds (vertical differences) were controlled for every grid cell by combining the weight coverages. If the distance was too large, the area was noted. Individually selected areas were tested with reference points, where these were available. Higher weights were assigned to the areas where the reference points had lower absolute mean distances. If no reference points were available for a selected area, the area was considered to be of insufficient quality. With the ‘parallel’ weighted sum method, the reference points were included within some portion of the final DEM.
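A per-cell sketch of the weighted sum, assuming two overlapping sources whose weights are taken as the inverse of their regional RMS errors (all numbers hypothetical):

```python
# Two hypothetical source heights for one grid cell, with regional RMS errors.
h1, rms1 = 421.6, 1.5
h2, rms2 = 423.0, 4.0
w1, w2 = 1.0 / rms1, 1.0 / rms2   # lower random error -> higher weight

# Weighted arithmetic mean of the sources for this cell.
h = (w1 * h1 + w2 * h2) / (w1 + w2)

# Control the vertical difference against a threshold; if it were exceeded,
# the area would be flagged for testing against reference points.
flagged = abs(h1 - h2) > 5.0
```

The result is pulled towards the statistically better source; cells where the sources disagree beyond the threshold are set aside rather than averaged blindly.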

With the ‘parallel’ weighted sum, the final DEM became somewhat smoother than the original sources, so a geomorphological correction of the DEM was necessary. The geomorphological characteristics of the statistically better sources were applied to the DEM. For this, a trend surface was produced by DEM generalisation. The same generalisation conditions were applied to the DEM and to the original source from which the geomorphology was to be taken. Finally, the calculated difference between the source and its trend surface was added to the trend of the DEM. This correction produced a DEM that was statistically insignificantly worse but geomorphologically much improved.
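The correction amounts to adding the source’s residual (source minus its trend) to the trend of the DEM. A sketch, using a plain moving average as a stand-in for the article’s generalisation:

```python
import numpy as np

def generalise(grid, k=3):
    """Trend surface via a k x k moving average (a stand-in for the
    article's DEM generalisation)."""
    pad = k // 2
    p = np.pad(grid, pad, mode="edge")
    out = np.empty_like(grid, dtype=float)
    for i in range(grid.shape[0]):
        for j in range(grid.shape[1]):
            out[i, j] = p[i:i + k, j:j + k].mean()
    return out

def geomorph_correct(dem, source):
    # trend(DEM) + (source - trend(source)): the DEM trend keeps the
    # statistics, the source residual restores the geomorphological detail.
    return generalise(dem) + (source - generalise(source))
```

Under the same generalisation conditions, a source identical to the DEM leaves it unchanged, while a geomorphologically richer source transfers its fine detail onto the DEM’s trend.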

Many reference points were available for testing, but they were also used for DEM quality improvement. With each step of the ‘parallel’ weighted sum, the terrain surface moved closer to the reference points. Using reference points separately, as the most accurate available data, would be particularly sensitive in flat areas: even a slight change of point height might change the geomorphology significantly and detrimentally. It was therefore decided to use reference points only in the following cases: within a mountainous area (with regard to the regionalisation), within areas of sinks and peaks, for 1st-order trigonometric points, and close to a hydrological network. The inclusion of points was carried out with a smoothing interpolation relating to their nearest neighbours.

Results

The data used for DTM production in the case study (Slovenia) were:

DEMs (grids) with resolutions of 100, 25, 10 (local) and 5 m (local)

various geodetic points, cadastre points, and points from the central database of buildings

spot elevations (hill numbers)

hydrography from maps (scale 1 : 25,000; polygons of standing water)

lines of rivers (scale 1 : 25,000 and 1 : 5000)

contour lines (1 : 25,000 and 1 : 5000)

The techniques employed for DTM production were tested at each individual step of processing, and the quality parameters were monitored. The case study covered the area of Slovenia and the data available for it. The data used in the test regions were, as far as possible, objective and describe every part of Slovenia and its surroundings. The results were confirmed with statistical and visual quality analyses for both proposed methods of DTM production. They indicate, statistically, geomorphologically and visually, high-quality and efficient DTM production for Slovenia, better than the models currently used, with an expected average vertical accuracy of 3.5 m or better for the whole of Slovenia (with an average slope of 14.5°). Important additional products of this approach to DTMs are automatically produced contour lines, terrain skeletons and hillshading. Secondary products include the acquisition of quality parameters and the reduction of gross and systematic errors in geodetic databases.

By combining the various sources, results about 50% to 80% better were achieved compared with the best single data source covering the whole tested area for DTM modelling. The described methods are useful not only for DTM production from various data sources, when there are not enough resources to produce a high-quality DTM with ‘classical’ methods, but also for combining high- and lower-quality data to evaluate and improve a DTM.

References

Jaakkola, O., Oksanen, J., 2000, Creating DEMs from Contour Lines: interpolation techniques which save terrain morphology. GIM International, (Lemmer, The Netherlands), Vol. 14, No. 9, 46–49.

Kraus, K., 1998, Interpolation nach kleinsten Quadraten versus Krige-Schätzer. Österreichische Zeitschrift für Vermessung und Geoinformation. Vol. 1/98, 45–48.

Longley, P. A., Goodchild, M. F., Maguire, D. J., Rhind, D. W., 2001, Geographic Information Systems and Science. John Wiley & Sons, 454 pp.

Makarovic, B., 1973, Progressive Sampling for Digital Terrain Models. ITC Journal, (Enschede, The Netherlands), Vol. 3, 397–416.

Makarovic, B., 1977, Composite Sampling for Digital Terrain Models. ITC Journal, (Enschede, The Netherlands), Vol. 3, 406–433.
