Applied GIS Abstracts

October 31, 2017

13(2): Line of sight analysis for urban mobile applications: a photogrammetric approach, by Sreedhar, M., Narender, B. & Muralikrishnan, S.

Although inter-visibility analysis between a GPS receiver and the visible satellites is crucial for any real-time route navigation, traditional mission-planning software does not incorporate it – it simply assumes that the earth is flat. This paper seeks to correct the situation by outlining an advanced, semi-global matching technique which derives a Digital Surface Model (DSM) depicting the topography of the earth's surface for any study area. Our approach uses a GIS, combined with a freely available Global Navigation Satellite System (GNSS), to accurately predict pseudo-satellite locations before building three-dimensional models of buildings from their footprints. We exploit high-resolution satellite data (Cartosat-1 and GeoEye-1) along with high-density elevation information, and we ultimately reduce the need for full-scale surveys and so minimize costs and time. Moreover, our combination of 3D models with GPS-derived ephemeris data for densely built-up areas constitutes a valuable input for real-time mobile applications such as car navigation systems, personal navigation, fleet tracking, asset tracking and general location-based services.
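
To make the inter-visibility idea concrete, here is a minimal Python sketch of a satellite visibility test against a DSM height grid; the grid, receiver position, antenna height and satellite azimuth/elevation are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch: test whether a GNSS satellite is visible from a receiver,
# given a Digital Surface Model (DSM) stored as a NumPy height grid.
# All names (dsm, cell_size, receiver position, satellite azimuth/elevation)
# are illustrative assumptions, not the authors' implementation.
import numpy as np

def satellite_visible(dsm, cell_size, row, col, azimuth_deg, elevation_deg,
                      antenna_height=1.5, max_range=2000.0):
    """Walk outward from the receiver along the satellite azimuth and check
    whether any DSM cell rises above the line of sight to the satellite."""
    z0 = dsm[row, col] + antenna_height
    az = np.radians(azimuth_deg)
    tan_el = np.tan(np.radians(elevation_deg))
    step = cell_size / 2.0
    dist = step
    while dist <= max_range:
        # Horizontal offset from the receiver (grid rows increase southwards).
        dc = dist * np.sin(az) / cell_size
        dr = -dist * np.cos(az) / cell_size
        r, c = int(round(row + dr)), int(round(col + dc))
        if not (0 <= r < dsm.shape[0] and 0 <= c < dsm.shape[1]):
            break  # left the study area without obstruction
        los_height = z0 + dist * tan_el   # height of the sight line here
        if dsm[r, c] > los_height:
            return False                  # building or terrain blocks the view
        dist += step
    return True
```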

Full Text

September 27, 2017

13(1): Estimating carbon content using two methods for spatial averaging: a case study within complex, native Siberian forests, by Gavrikov, V.L., Erunova, M.G., Mitev, A.R., Sharafutdinov, R.A. & Knorre, A.A.

This paper studies the spatial averaging of point-based measurements of carbon content within the landscapes of native Siberian forests. Two methods of spatial averaging were used: (1) pixel classification with learning from satellite images, and (2) GIS-based interpolation from point measurements. For the same comparison area of 734 hectares, satellite imagery gave an estimate of 112.1 Mg of carbon per hectare, whereas GIS-based interpolation suggested 126.77 Mg per hectare – 11–12% higher. The first method is more direct and takes into account the full variety of pixels, but it works only for parameters that are visible from a satellite. The GIS-based approach is more mechanical (interpolation) and is certainly sensitive to the number and arrangement of measurement points, but it is applicable both to forest canopies and to what lies below them, for example soil parameters. We conclude, therefore, that although the GIS approach might overestimate the carbon content of forests, it still gives reasonably accurate values.

Full Text

November 27, 2016

12(1): Evacuation vulnerability after an urban earthquake: mapping it using a GIS, by Tsionas, I., Baltzopoulou, A., Tsioukas, V. & Karabinis, A.

The danger that buildings and other human constructions pose to people after serious seismic events has been researched for a long time, and vulnerability and risk can now be estimated using well-established methods. However, the danger that people face after an earthquake while moving to reach a safe spot has not been thoroughly researched. Accordingly, we address evacuation vulnerability in this paper. We chose four parameters of danger – the seismic risk of the buildings lining the street, building height relative to street width, street slope and street traffic conditions – and assigned them to those street segments that might make up post-earthquake evacuation routes. The results were then plotted using a GIS. In this way we generated a useful map that clearly highlights the street segments to be avoided when planning evacuation routes. In addition, we identified several areas whose urban characteristics increase evacuation vulnerability.
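
As an illustration of how the four danger parameters could be combined per street segment, the sketch below uses an equal-weight score on inputs scaled to 0-1; the scaling and weights are assumptions, since the abstract does not publish the weighting scheme.

```python
# A minimal sketch of combining the four danger parameters into a single
# evacuation-vulnerability score per street segment. The 0-1 scaling and the
# equal weights are illustrative assumptions; the paper does not prescribe them.
def segment_vulnerability(seismic_risk, height_to_width, slope, traffic,
                          weights=(0.25, 0.25, 0.25, 0.25)):
    """Each input is assumed to be pre-scaled to 0 (safe) .. 1 (dangerous)."""
    factors = (seismic_risk, height_to_width, slope, traffic)
    return sum(w * f for w, f in zip(weights, factors))

# Example: a narrow street lined by tall, high-risk buildings scores high.
print(segment_vulnerability(0.8, 0.9, 0.2, 0.6))   # -> 0.625
```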

Full Text

December 17, 2015

11(2): Guest editorial: evaluation of research performance, by Benke, K. & Kilic, S.

Guest editorial

Full Text

September 30, 2015

11(1): Generating land-cover maps from remotely sensed data: manual vectorization versus object-oriented automation, by Machala, M., Honzová, M. & Klimánek, M.

Manual vectorization of multispectral images is a widely used method for making land-use or land-cover maps. Although it is usually considered relatively accurate, it is very time-consuming, which has prompted the use in recent years of various semi-automatic methods for classifying remotely sensed images. One of the most promising of the latter is object-oriented image analysis based upon image segmentation, but the accuracy of its results, as well as its time demands, are disputed. Accordingly, this paper compared manual vectorization with object-oriented classification to reveal the strong and weak points of each. Two qualitatively different datasets were classified using both methods; time costs were monitored and accuracy levels were compared. It was found that manual vectorization achieved better overall accuracy (up to 93% versus 84%), but the semi-automatic method was usually more accurate when classifying some specific features such as roads, built-up areas, broadleaf trees and coniferous trees. The verdict regarding time-efficiency was less clear cut. The best method depends upon the spatial and spectral resolution of the data being processed.
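
The overall-accuracy figures quoted above come from comparing classified pixels with reference data; a minimal sketch of that calculation, on made-up labels, is shown below.

```python
# A minimal sketch of the accuracy comparison described above: overall accuracy
# is computed from a confusion matrix of reference versus classified labels.
# The tiny label arrays are illustrative, not the study's data.
import numpy as np

def confusion_matrix(reference, classified, n_classes):
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for ref, cls in zip(reference, classified):
        cm[ref, cls] += 1
    return cm

def overall_accuracy(cm):
    return cm.trace() / cm.sum()

reference  = np.array([0, 0, 1, 1, 2, 2, 2, 1])
classified = np.array([0, 1, 1, 1, 2, 2, 0, 1])
cm = confusion_matrix(reference, classified, n_classes=3)
print(overall_accuracy(cm))   # 6 of 8 pixels correct -> 0.75
```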

Full Text

December 17, 2014

10(4): Modelling the spatial pattern of house-renovation employment in Melbourne, Australia: an application of geographically weighted regression, by KC, K., Chhetri, P., Arrowsmith, C. & Corcoran, J.

This paper discusses research aimed at identifying key factors influencing the distribution of residential housing-renovation employment in metropolitan Melbourne. Using Geographically Weighted Regression (GWR), employment focused on residential-housing renovation is modelled using six mostly spatial parameters – distance to the central business district, median household income, distance to highways, the number of nearby shopping centres, distance to public open space and accessibility to railway stations. Of these, the Ordinary Least Squares model showed that distance to the central business district and distance to public open space were statistically significant. The extent to which the different independent variables influence variation in residential-housing-renovation employment was then explored by mapping the local coefficient estimates.
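
For readers unfamiliar with GWR, the sketch below shows the core idea of a locally weighted least-squares fit with a Gaussian distance kernel; the bandwidth, kernel choice and toy data are illustrative assumptions rather than the study's calibrated model.

```python
# A minimal sketch of the geographically weighted regression idea: at each
# calibration point an ordinary least-squares fit is computed, with observations
# down-weighted by a Gaussian kernel of their distance. The bandwidth and the
# toy data are illustrative assumptions, not the study's model.
import numpy as np

def gwr_local_coefficients(coords, X, y, point, bandwidth):
    """Return local coefficient estimates at `point` (intercept first)."""
    d = np.linalg.norm(coords - point, axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)          # Gaussian kernel weights
    Xd = np.column_stack([np.ones(len(X)), X])        # add intercept column
    W = np.diag(w)
    beta = np.linalg.solve(Xd.T @ W @ Xd, Xd.T @ W @ y)
    return beta

rng = np.random.default_rng(0)
coords = rng.uniform(0, 10, size=(50, 2))             # observation locations
X = rng.normal(size=(50, 2))                          # e.g. distance to CBD, income
y = 1.0 + 2.0 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.1, size=50)
print(gwr_local_coefficients(coords, X, y, point=np.array([5.0, 5.0]), bandwidth=3.0))
```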

Full Text

July 23, 2014

10(3): Geo-environmental assessment to identify a least-cost road in Ghana, by Kursah, M.B.

Unlike many studies which simply generate least-cost paths, this paper develops a geospatial methodology for amalgamating many geo-environmental factors in order to determine the costs of construction for pre-defined or existing roads. The geo-environmental factors used were elevation, presence of watercourses, soil characteristics and land cover type. They were reclassified to cost values/attribute weights based upon their impact on road construction, and they were then combined, using the weighted overlay tool in ArcMap, to generate a thematic cost layer. The total construction cost for each road was then extracted from this layer using the Extract by Mask tool. Although the cheapest road was longer, it was less costly by 3.2%, mainly because of its higher elevation, avoidance of major valleys, suitable soils and less problematic land cover, all of which reduced the need for cut-and-fill and culverts. It is suggested, therefore, that government agencies adopt this powerful technique for reliable and well integrated road planning and assessment. Nevertheless, the method could be improved by including additional factors such as proneness to floods and access to people and socio-economic activities.
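
A minimal sketch of the weighted-overlay and extract-by-mask steps, using NumPy arrays in place of ArcMap rasters, is given below; the factor weights and toy grids are assumptions for illustration only.

```python
# A minimal sketch of the weighted-overlay step: reclassified factor rasters are
# combined with weights into a single cost layer, and the total construction cost
# of a road is the sum of cost cells under the road mask (the "extract by mask"
# idea). Factor names, weights and the toy arrays are illustrative assumptions.
import numpy as np

def weighted_overlay(layers, weights):
    """layers: dict of reclassified cost rasters (same shape); weights sum to 1."""
    cost = np.zeros_like(next(iter(layers.values())), dtype=float)
    for name, raster in layers.items():
        cost += weights[name] * raster
    return cost

elevation = np.array([[1, 2], [3, 5]], dtype=float)   # reclassified 1 (cheap) .. 5 (costly)
soils     = np.array([[2, 2], [4, 1]], dtype=float)
landcover = np.array([[1, 3], [2, 2]], dtype=float)
cost = weighted_overlay(
    {"elevation": elevation, "soils": soils, "landcover": landcover},
    {"elevation": 0.5, "soils": 0.3, "landcover": 0.2},
)
road_mask = np.array([[True, True], [False, True]])   # cells crossed by the road
print(cost[road_mask].sum())                          # total cost along this road
```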

Full Text

June 18, 2014

10(2): Accuracy of residents’ perceived home locations on an environmental risk map, by Severtson, D.J. & Ness Sundeen, K.H.

When people view a map depicting an environmental risk, they tend to look for risk information near the perceived location of their residence. For some types of maps, the accuracy of the perceived map location will influence perceived proximity to mapped hazards, which, in turn, can influence proximity-based beliefs and decisions. It is therefore important to measure and understand the accuracy of perceived map location. The primary aims of this field study were to measure and describe perceived map location and to examine how map landmarks and personal characteristics were related to accuracy. The maps depicted a drinking water hazard. Participants (n = 57) drew an X on a paper map to indicate their perceived home location, and the X’s location was georeferenced using geographic information systems (GIS) software. Accuracy was the geographic distance between the perceived and actual map locations and varied widely across participants (0.04–3.99 miles on a 6 × 2 mile map; 0.06–6.41 km on a 9.66 × 3.2 km map). A lake landmark and participants’ perceived numeracy were related to the accuracy of perceived map location. Concepts from spatial and map cognition offer plausible explanations for the study results.

Full Text

March 7, 2014

10(1): GIS-based simulation of land use change, by Okwuashi, O. & Ikediashi, D.I.

This paper describes an application of ArcGIS to simulate land use changes in Lagos, Nigeria, using both Ordinary Least Squares (OLS) regression and Geographically Weighted Regression (GWR), over three epochs – 1963–1978, 1978–1984 and 1984–2000. Twelve salient causal factors thought to be related to urban land use change in Lagos were used, such as distance to water, distance to residential structures, income potential and population potential. OLS was used to establish the regression coefficients, gauge the significance of each explanatory variable and estimate conformity with linear regression criteria. GWR was then used to simulate the urban form based on the results of the OLS model. For the three epochs the respective Kappa statistics for the simulated maps were 0.8858, 0.8366 and 0.8812, indicating almost perfect agreement with the data of 1978, 1984 and 2000.
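
The Kappa statistics reported above measure agreement beyond chance between a simulated and an observed map; a minimal sketch of Cohen's kappa on toy label arrays follows.

```python
# A minimal sketch of the Kappa statistic used to compare each simulated land-use
# map with the observed map. The small label arrays are illustrative only.
import numpy as np

def cohens_kappa(observed, simulated, n_classes):
    cm = np.zeros((n_classes, n_classes), dtype=float)
    for o, s in zip(observed, simulated):
        cm[o, s] += 1
    n = cm.sum()
    po = cm.trace() / n                                  # observed agreement
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2  # chance agreement
    return (po - pe) / (1 - pe)

observed  = np.array([0, 0, 1, 1, 1, 2, 2, 2, 2, 0])
simulated = np.array([0, 0, 1, 1, 2, 2, 2, 2, 2, 0])
print(cohens_kappa(observed, simulated, n_classes=3))
```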

Full Text

February 26, 2014

9(2): Collaborative web mapping and volunteered geographic information: a study in Nigeria, by Bello, I.E. & Ojigi, L.M.

This paper evaluates developments in mapping, from traditional desktop cartography through to Collaborative Web Mapping (CWM), using an experimental example from a developing country, Nigeria. We examine the status and limitations of CWM as a viable tool for contributing freely to globally accessible, free Web Mapping Platforms (WMPs) such as Google Maps, Google Earth, OpenStreetMap, Bing Maps and Yahoo! Maps. Here we use Google Map Maker (Web 2.0) to conduct an experimental mapping analysis with 50 volunteer staff drawn from different departments of the National Space Research and Development Agency (NASRDA) and the Centre for Satellite Technology Development (CSTD) in Abuja, Nigeria. We find that Geographic Information (GI) experts are faster at data integration, whereas non-GI experts are slower owing to their limited knowledge of geo-spatial data management. We then describe how Focus Group Discussions (FGDs) revealed that all participants are optimistic and willing to contribute towards a global map through a free web mapping medium. Finally, we explain that the limitations to integrating Volunteered Geographic Information (VGI) in most developing countries like Nigeria include poor knowledge of geo-spatial data and analytical methods, unstable electric power supply, slow yet costly Internet facilities, inadequate computer systems and a lack of personal volunteerism.

Full Text

February 28, 2013

9(1): Living near high-voltage power lines: GIS-based modeling of the risk in Nigeria’s Benin region, by Felix Ndidi Nkeki

This paper demonstrates the capabilities of Geographic Information System (GIS) methods for identifying populations which are both exposed to Electromagnetic Fields (EMFs) and at risk of electrocution because they live within the zone of potential risk around power lines in the Benin region. GIS buffering, overlay and address geo-coding were used to generate a database consisting of both spatial and non-spatial information, from which it was possible to visualize holistically the areal extent of the population at risk. Potential risk areas were classified into zones of double risk and single risk; of the approximately 20 per cent of the built-up area that was shown to be exposed to electromagnetic radiation, double-risk zones accounted for 51 per cent and single-risk zones for 49 per cent. The majority of exposed zones were found to be in high-density residential areas located on the periphery of the region. It is evident that our GIS-assisted database will enhance future epidemiologic research and serve as a framework for effective decision making.
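
A minimal sketch of the buffering and overlay logic, using shapely with made-up line geometries and an illustrative 50 m corridor width (the paper's actual corridor distances are not assumed here), is shown below.

```python
# A minimal sketch of the buffering step: each power line gets a risk corridor,
# and areas covered by two or more corridors form the double-risk zone. Distances,
# coordinates and the 50 m corridor width are illustrative assumptions.
from shapely.geometry import LineString
from shapely.ops import unary_union

lines = [
    LineString([(0, 0), (100, 0)]),
    LineString([(50, -60), (50, 60)]),
]
corridors = [line.buffer(50.0) for line in lines]        # potential-risk zone per line

exposed = unary_union(corridors)                          # area covered by any corridor
double_risk = corridors[0].intersection(corridors[1])     # covered by two corridors
single_risk = exposed.difference(double_risk)

print(round(double_risk.area), round(single_risk.area))
```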

Full Text

March 6, 2012

8(1): An optimisation model for spatially allocating commodity production and forecasting its impact on regional productivity, by Wyatt, R.G. & Benke, K.K.

The prototype Spatial Optimizer 1.0 program for allocation of commodity production is demonstrated to show its potential as a decision support aid for policy making. The case study involves a set of agricultural commodities in south eastern Australia and the program uses estimates of each grid cell’s soil suitability for each of the eight crops, both in the year 2000 and in the year 2050, by which time soil characteristics are expected to have been affected by environmental change. We first predict how much total regional production will result from a judicious re-location of commodity types, both under conditions of complete flexibility and when constrained by more realistic, upper and lower limits on production, and we compare such predictions with current production levels. We also estimate potential total regional revenue, both in the short term when current prices are assumed to remain static and in the long term when prices are assumed to change according to how much of each commodity is produced compared to its current output level, and we compare these results with current agricultural revenue. Our long-term estimates are based on year 2000 soil-suitability values and then on year 2050 soil-suitability values in order to gauge the probable impacts of environmental change. Finally, we run the 2050 simulation twice more, with one of the recommended, dominant crops removed in each instance. This generates maps of some localised concentrations of other commodities that will become necessary in the future if maximum revenue is to be retained after one of the more lucrative crops is discontinued.

Full Text

November 29, 2011

7(2): Using a GIS-based, Hitchcock algorithm to optimize parking allocations for special events, by Sarasua, W.A., Malisetty, P. & Chowdhury, M.

Clemson, a small college town in South Carolina, experiences massive over-saturation of its transportation system during special events, especially home football games, resulting in total system failure. This research developed a methodology to optimize parking using a Geographic Information System (GIS)-based transshipment algorithm, and it produced substantial time savings compared with the individual, “manual” efforts of thousands of drivers attempting to find available spaces. As such, this research constitutes an effective implementation of the Hitchcock Transportation Algorithm for solving a transshipment problem applied to parking-lot distribution. Because the Hitchcock Algorithm considers the network cost of distributions, it gives very realistic solutions, and a system equilibrium that minimizes overall system delay was achieved through optimal parking assignment combined with pre- and post-game traffic control strategies. This was validated using a simulation model developed for evaluating the strategies.
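
The Hitchcock transportation problem can be posed as a small linear programme; the sketch below uses scipy's linprog with an illustrative cost matrix, zone supplies and lot capacities, not the Clemson network data.

```python
# A minimal sketch of the Hitchcock transportation problem behind the parking
# assignment: event attendees at origin zones (supply) are assigned to parking
# lots (capacity) so that total travel cost is minimised. The cost matrix and
# the solver choice (scipy's linprog) are illustrative assumptions.
import numpy as np
from scipy.optimize import linprog

cost = np.array([[4.0, 6.0, 9.0],        # travel cost from each origin zone
                 [5.0, 4.0, 7.0]])       # to each parking lot
supply = np.array([300.0, 200.0])        # cars arriving from each zone
capacity = np.array([250.0, 150.0, 100.0])

m, n = cost.shape
A_eq, b_eq = [], []
for i in range(m):                       # every car from zone i is parked somewhere
    row = np.zeros(m * n); row[i * n:(i + 1) * n] = 1
    A_eq.append(row); b_eq.append(supply[i])
A_ub, b_ub = [], []
for j in range(n):                       # lot j cannot exceed its capacity
    col = np.zeros(m * n); col[j::n] = 1
    A_ub.append(col); b_ub.append(capacity[j])

res = linprog(cost.ravel(), A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=(0, None), method="highs")
print(res.x.reshape(m, n))               # cars assigned from each zone to each lot
```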

Full Text

April 19, 2011

7(1): Remote sensing of land cover’s effect on surface temperatures: a case study of the urban heat island in Bangalore, India, by S. Ambinakudige

Urbanization has substantially altered the earth’s surface, and the impervious surfaces that cities create for anthropogenic activities often generate an urban heat island (UHI). This paper analyses the effects of the UHI in Bangalore, which in recent years has witnessed tremendous in-migration of people and expansion of infrastructure due to rapid growth of its information technology, biotechnology and manufacturing sectors. Temperature values extracted from the Landsat satellite’s Enhanced Thematic Mapper Plus (ETM+) thermal bands and a Normalized Difference Vegetation Index (NDVI) were used to ascertain the relationship between vegetation cover and temperature. Results indicate that the city core has a significantly lower mean temperature than the city’s outgrowth zones. The presence of water bodies and vegetation in the city’s core helped to maintain lower temperatures than those found in the city’s outskirts, even though within the city core temperatures varied by 1 to 7 °C across different land cover classes. The continued expansion of urban infrastructure and of new residential neighbourhoods which lack vegetation seems to be contributing substantially to higher temperatures in the outgrowth zones.
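
A minimal sketch of the NDVI calculation from red and near-infrared reflectance follows; the toy arrays stand in for calibrated ETM+ bands.

```python
# A minimal sketch of the NDVI computation referred to above, using the red and
# near-infrared reflectance bands. The toy arrays stand in for calibrated rasters.
import numpy as np

def ndvi(nir, red):
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + 1e-9)   # small epsilon avoids divide-by-zero

red = np.array([[0.10, 0.25], [0.05, 0.30]])
nir = np.array([[0.50, 0.30], [0.45, 0.32]])
print(ndvi(nir, red))    # dense vegetation -> values near 1, bare/built-up -> near 0
```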

Full Text

October 6, 2010

6(2): Land use change detection for environmental management: using multi-temporal satellite data in the Apodi Valley of northeastern Brazil, by Boori, M.S. & Amaro, V.E.

In this study maximum-likelihood supervised classification, along with post-classification change detection, was applied to satellite images from 1986, 1989, 1996, 2001 and 2009 in order to map land-cover changes within the Apodi Valley region of northeastern Brazil. The supervised classification was carried out on the six reflective bands and ground truthed. The classification results were then further refined using ancillary data, visual interpretation and expert knowledge of the area, along with GIS. Post-classification change detection then generated a change image in the form of cross-tabulations. Fifteen land cover classes existed within the area, and reclamation processes during the 1990s changed them substantially, with conflicting changes being caused primarily by a lack of both stability and consistency in the government’s land use policies. The result was extensive vegetation degradation and water logging in parts of the study area.
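
Post-classification change detection reduces to cross-tabulating the two classified maps; a minimal sketch with toy class codes is given below.

```python
# A minimal sketch of post-classification change detection: two classified maps
# are cross-tabulated so that cell (i, j) counts pixels that moved from class i
# in the earlier image to class j in the later one. The toy maps are illustrative.
import numpy as np

def change_matrix(earlier, later, n_classes):
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for a, b in zip(earlier.ravel(), later.ravel()):
        cm[a, b] += 1
    return cm

map_1986 = np.array([[0, 0, 1], [1, 2, 2]])   # e.g. 0 = forest, 1 = pasture, 2 = water
map_2009 = np.array([[0, 1, 1], [2, 2, 2]])
print(change_matrix(map_1986, map_2009, n_classes=3))
```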

Full Text

March 21, 2010

6(1): A support system that delineates location-choice sets for firms seeking office space, by Manzato, G.G., Arentze, T.A., Timmermans, H.J.P. & Ettema, D.

Factors influencing the location decisions of offices include traffic, accessibility, employment conditions, economic prospects and land-use policies. Hence tools for supporting real-estate managers and urban planners in such multidimensional decisions may be useful. Accordingly, the objective of this study is to develop a GIS-based tool to support firms seeking office accommodation within a given regional or national study area. The tool relies on a matching approach, in which a firm’s characteristics (demand) on the one hand, and environmental conditions and available office spaces (supply) on the other, are first analysed separately, after which a match is sought. That is, a suitability score is obtained for every firm and for every available office space by applying value judgments (satisfaction, utility etc.). These judgments draw on location aspects and on expert knowledge about the office-location decisions of firms and organizations, acquired from a group of real-estate advisers; this knowledge is stored in decision tables, which constitute the core of the model. Apart from the delineation of choice sets for any firm seeking a location, the tool supports two additional types of queries. Firstly, it supports the more generic problem of optimally allocating firms to a set of vacant locations. Secondly, it allows users to find firms which meet the characteristics of any given location. Moreover, as a GIS-based tool, its results can be visualized using GIS features which, in turn, facilitate several types of analyses.

Full Text

December 6, 2009

5(3): Application of GIS-based computer modelling to planning for adaptation to climate change in rural areas, by Sposito, V., Benke, K., Pelizaro, C. & Wyatt, R.

A GIS-based computer modelling methodology was developed and applied to identify climate change adaptation issues arising in regional agricultural production systems (including forestry). Agricultural production in Australia is very susceptible to the adverse impacts of climate change due to projected shifts in rainfall and temperature. The methodology integrates land suitability analysis with uncertainty analysis and spatial (regional) optimisation to determine optimal agricultural land use at a regional scale for current and possible future climatic conditions. The approach can be used to recognise regions under threat of productivity decline, identify alternative cropping systems that may be better adapted to likely future conditions, and investigate implementation actions to improve the sub-optimal situations created by climate change. An example of how the methodology may be used is outlined through a case study involving the South West Region of Victoria, Australia. The case study provides information on the tools available to support the formulation of a regional adaptation strategy.

Full Text

June 26, 2009

5(2): Using Rational Polynomial Coefficients (RPC) to generate digital elevation models – a comparative study, by Jain, K., Ravibabu, M.V., Shafi, J.A. & Singh, S.P.

Models based on Rational Polynomial Coefficients (RPC) have recently attracted considerable interest within the remote sensing community because of their simplicity and accuracy. Indeed, some commercial high-resolution satellite imagery is now supplied with RPC even though the vendors do not disclose their physical sensor models. RPC, together with stereo pairs, enable full photogrammetric processing, including 3-D reconstruction, generation of digital elevation models (DEMs), orthorectification, block adjustment and feature extraction. In this light, we present a complete methodology for generating a DEM from stereo satellite images using the rational polynomial coefficients of the imaging geometry. We also assess the accuracy and performance of RPC-based, stereo-image DEM generation within three well-known software packages. Our results are evaluated using sample data captured by IKONOS.
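
For orientation, the sketch below shows the general form of an RPC model, in which each normalised image coordinate is the ratio of two cubic polynomials in normalised latitude, longitude and height; the 20-term layout shown is illustrative, and real RPC products fix a specific coefficient ordering (e.g. the RPC00B convention) that must be honoured.

```python
# A minimal sketch of how a rational polynomial coefficient (RPC) model maps a
# normalised ground coordinate to a normalised image coordinate: each image axis
# is the ratio of two cubic polynomials in latitude, longitude and height. The
# 20-term layout is illustrative; real products define the exact coefficient order.
import numpy as np

def cubic_terms(P, L, H):
    """20 monomials up to third order in (P=lat, L=lon, H=height), all normalised."""
    return np.array([
        1, L, P, H, L*P, L*H, P*H, L*L, P*P, H*H,
        P*L*H, L**3, L*P*P, L*H*H, L*L*P, P**3, P*H*H, L*L*H, P*P*H, H**3,
    ])

def rpc_image_coord(num_coeffs, den_coeffs, P, L, H):
    """Normalised image row (or column) = numerator / denominator polynomial."""
    t = cubic_terms(P, L, H)
    return np.dot(num_coeffs, t) / np.dot(den_coeffs, t)
```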

Full Text

January 8, 2009

5(1): Optimizing landscape value for man and nature: a case study of land-suitability mapping to conserve biodiversity in Lawaan, Eastern Samar, Philippines, by Casas, E.V. & Baguinon, N.T.

We show how to identify “hotspot” biodiversity areas on which to base relevant policies and management options whenever traditional, community-based resource management puts biodiversity conservation at risk, as is the case in Lawaan, Eastern Samar, Philippines. Digital spatial data integration revealed that the lower elevation areas are under the heaviest human pressure and also have the highest biodiversity. This calls for a set of procedures for engaging the full range of stakeholders to identify areas for preservation. Accordingly, we combined Social Based (SB) and Environmental Based (EB) maps to produce four different classifications and to locate the “very critical” and “critical” areas that need priority if biodiversity-conservation efforts are to be effective. We also report the results of deploying the protocols we developed, which are designed to support regular updates and thereby accommodate stakeholder interests, so that an environmentally based zone map can form a basis for consensus building and preservation-zone protection through community enforcement.

Full Text

December 6, 2008

4(4): Development of an open-source-based spatial infrastructure, by Stefanakis, E. & Prastacos, P.

A pilot regional Spatial Data Infrastructure (SDI) has recently been developed for the Heraklion Prefecture in Crete, Greece, using Geographic Free and Open Source Software (GeoFOSS). This SDI is compatible with the geospatial standards and specifications introduced by the Open Geospatial Consortium (OGC) and it distributes the geospatial content on the web through widely accepted services (e.g., WMS, WFS, WCS and CSW). This paper presents the architecture, the components and the functionality of the Heraklion SDI. Specifically, it focuses on the map server, along with the services that provide accessibility to the data repositories, the spatial database server, the metadata and data catalog, the visualization tools and the web-client interface.

Full Text

August 6, 2008

4(3): Ortho-rectified, oblique, aerial photography for verifying and updating spatial data, by Bulman, D.

This paper explains how ortho-rectification of frame-camera aerial photography is fairly complex, and how the difficulties associated with rectifying oblique aerial photos (OAPs) from small-format photography have been so problematic that the approach has been little used and seldom described in the literature. Recent advances in photogrammetric software, however, have made conventional frame-camera photogrammetry more convenient and better able to deal with the ortho-rectification of imagery from space-borne and air-borne sensors as well as from hand-held film and digital cameras. Consequently, it is now possible to use oblique photography to correct vertical photographs – provided that we address issues related to image distortion and ensure that the reliability of extracted features is sufficient for their use within Geographic Information Systems (GIS). This is demonstrated by showing that oblique photographs taken with a hand-held 35 mm camera during a reconnaissance flight over a weed-infested area can be ortho-rectified using modern software, thereby clarifying the approach’s status as a useful aid for delineating infestations for the purposes of, say, ground truthing, verification of remotely sensed image analysis or simply contributing towards more comprehensive aerial mapping. The example used here may seem trivial, but it clearly illustrates how a spraying pattern, while possibly too fine to show in a satellite image, can be mapped as a reference for monitoring weed infestations. The example is not intended to be a rigorous application of the technique, but simply a demonstration of its possibilities whenever there is a need to update existing spatial data and environmental records quickly or frequently, and in a much more cost-effective way than would be possible with conventional, far more expensive aerial survey methods.

Full Text

April 18, 2008

4(2): Using geovisualisation to support participatory problem structuring and decision making for an urban water utility in Uganda, by Kizito, F., Ngirane-Katashaya, G. & Thunvik, R.

This paper describes the application of geovisualisation to facilitate participatory identification and structuring of problems in an urban water supply system in Uganda. The city of Kampala has experienced rapid expansion over the years, with a corresponding increase in the demand for piped water supply. However, this demand was not well matched by expansion of the water supply system, and as a result parts of the city have been facing chronic supply anomalies and insufficiencies. Faced with the task of identifying remedies to the problems in the system, the city water company undertook a formal participatory problem structuring and decision analysis process, to try to understand the underlying causes of system failures as well as the geospatial patterns of these failures. As part of this process, data derived from historical records of water consumption, pipe breakages, supply intermittences and other recorded customer complaints were analysed, mapped and geovisualised. The maps so produced were key to bringing the various stakeholders and decision makers to a common understanding of the problem issues, and they helped in the formulation of alternative courses of action. Furthermore, with the establishment of a formal discussion forum for problem analysis and decision making, structured participatory decision making became entrenched within the company’s work ethos. It is hoped that, in future, the coupling of the geovisualisation tools with the company’s existing operational databases will result in the development of a functional spatial decision support system and a dynamic framework for system performance monitoring and reliability assessment.

Full Text

January 9, 2008

4(1): Evaluating forest harvesting to reduce its hydrologic impact with a spatial decision support system, by Zhang, Y., Barten, P.K., Sugumaran, R. & DeGroote, J.

Timber harvesting changes the condition of forest ecosystems, which are a major influence on the characteristics of headwater streams. Such characteristics include the quantity and timing of base flow and storm flow, concentrations of sediment and dissolved nutrients, water temperature, and the stability of stream channels. This paper reviews previous studies dealing with the relationship between timber harvesting and its hydrologic effects, especially long-term increases in water yield, and the watershed disturbance threshold theory is examined in detail. The development and evaluation of a spatial decision support system, the Harvest Schedule Review System (HSRS), is then described. The HSRS will help to minimize the hydrological impacts of forest harvesting, along with its related negative environmental influences. It provides a spatially and temporally explicit tool for users to analyze the hydrologic impact of forest harvest schedules.

Full Text

December 6, 2007

3(12): Exploratory GIS modelling for assessing potential conflict in Australia’s central desert region, by Adams, B. Jon

This paper presents a simple, preliminary and exploratory methodology for modelling potential hotspots of land-user conflict at the regional level in preparation for a dispute-system design. Australia’s central desert region is chosen and modelled in terms of four independent variables – Aboriginal communities, National Parks land, pastoral leases and tourism sites – and one dependent variable – Strength of Interest. Preparation of the data is detailed in nine steps. Analysis takes two forms: overlays and statistical summary. Overlaying two coverages reveals potential conflict by showing which interests have overlapping zones of interest; these zones are divided into areas of Strong, Medium, Weak and No Interest, and three insights from this method of analysis are discussed. Simple statistical summarization describes the conflict potential from the perspective of each group of interests, and two further insights are discussed. An unexpected insight gained through this process is that there is potential conflict within groups of interest as well. Through this modelling exercise it is shown that a simple GIS application can produce significant insights when preparing a dispute-system design.

Full Text

November 19, 2007

3(11): Web-based GIS for mapping voting patterns at the 2004 Australian federal election, by Shyy, T., Stimson, R. & Chhetri, P.

This paper describes a Web-based geographical information system (GIS) for mapping voting patterns at the 2004 Australian federal election at the polling booth level. The locations of polling booths are geocoded and linked with national digital datasets, including 2001 census data. The Web-based GIS can generate maps displaying patterns of voting for political parties across polling booths, with overlay data showing the demographic and socio-economic characteristics of populations within the surrounding polling booth catchments. Classification functionality consisting of equal interval, quantile, median-based natural breaks and location quotient may be used to generate different map displays. The Web-based GIS has been developed as an information dissemination and analysis tool, not only to benchmark voting outcomes but also to visualise relationships between voting patterns and demographic and socio-economic data.
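
Two of the classification schemes mentioned, equal interval and quantile, are easy to sketch; the break-computation below uses an illustrative set of booth-level values.

```python
# A minimal sketch of two of the map-classification schemes mentioned above:
# equal-interval and quantile breaks for a numeric attribute such as the share of
# votes for a party at each booth. The sample values are illustrative.
import numpy as np

def equal_interval_breaks(values, n_classes):
    lo, hi = float(np.min(values)), float(np.max(values))
    return [lo + i * (hi - lo) / n_classes for i in range(1, n_classes)]

def quantile_breaks(values, n_classes):
    qs = [i / n_classes for i in range(1, n_classes)]
    return list(np.quantile(values, qs))

votes = np.array([12.0, 18.5, 22.0, 35.0, 41.2, 44.8, 52.0, 63.5])
print(equal_interval_breaks(votes, 4))   # evenly spaced class boundaries
print(quantile_breaks(votes, 4))         # boundaries giving equal counts per class
```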

Full Text

October 6, 2007

3(10): Categorising inconsistencies between national GIS data in Central Europe: case studies from the border triangle of Belgium, the Netherlands and Germany, by Nilson, E., Köthe, R. & Lehmkuhl, F.

Full Text

September 17, 2007

3(9): A comparison of spatial disaggregation techniques as applied to population estimation for South East Queensland (SEQ), Australia, by Li, T., Pullar, D., Corcoran, J. & Stimson, R.

The accuracy of spatial disaggregation techniques largely depends on their underlying density assumptions and the quality of the data applied. This paper presents the results of a comparative investigation of four spatial disaggregation methodologies to determine their relative accuracies. These methodologies are binary dasymetric, a regression model, a locally fitted regression model and three-class dasymetric, each of which provides a different solution for explaining spatially heterogeneous density when population data are spatially disaggregated. In contrast to previous studies, we apply the spatial disaggregation techniques to a comparatively larger and more varied geographical area, which allows them to be more rigorously tested. Results indicate that the three-class dasymetric technique generates higher levels of accuracy than the other spatial disaggregation techniques, and this result is more conclusive than previous findings.
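
As background, the binary dasymetric idea can be sketched in a few lines: each zone's population is spread only over its inhabited cells; the zone grid, mask and populations below are illustrative.

```python
# A minimal sketch of binary dasymetric disaggregation: a census zone's population
# is spread only over its "inhabited" cells (e.g. residential land cover), rather
# than over the whole zone. Zone IDs, the mask and populations are illustrative.
import numpy as np

def binary_dasymetric(zone_id, inhabited, zone_population):
    """Return a population raster (people per cell)."""
    density = np.zeros(zone_id.shape, dtype=float)
    for zid, pop in zone_population.items():
        cells = (zone_id == zid) & inhabited
        n = cells.sum()
        if n:
            density[cells] = pop / n            # population shared equally
    return density

zone_id   = np.array([[1, 1, 2], [1, 2, 2]])
inhabited = np.array([[True, False, True], [True, True, False]])
print(binary_dasymetric(zone_id, inhabited, {1: 400, 2: 300}))
```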

Full Text

August 6, 2007

3(8): Per-pixel and sub-pixel classifications of high-resolution satellite data for mangrove species mapping, by Kanniah, K.D., Ng, S.W., Shin, A.L.M. & Rasib, A.W.

High spatial resolution sensors, such as IKONOS and QuickBird, are expected to classify mangrove species more accurately than coarse spatial resolution satellite images. However, conventional per-pixel classification techniques could not improve classification accuracy when such high-resolution images were applied. This failure has encouraged the development of more sophisticated and deterministic techniques, i.e. sub-pixel classifications. In this study, the mangrove forest at Sungai Belungkor, Johor, Malaysia was classified using IKONOS data. Two classification approaches were applied, namely per-pixel and sub-pixel techniques. The conventional per-pixel classifiers used were Maximum Likelihood (ML), Minimum Distance to Mean (MDM) and Contextual Logical Channel (CLC), while the Linear Mixture Model (LMM) was selected as the sub-pixel classification approach. The classification results revealed that CLC classification with a contrast texture measure at a window size of 21 × 21 yielded the highest accuracy (82%) in comparison to ML (68%) or MDM (64%). The spatial distribution of the classified mangrove species and classes coincided with the common mangrove zones in Malaysia. For the LMM, the fraction of pixels measured from the satellite imagery and observed in the field gave a good correlation, with an R² value of 0.83 for Bakau minyak, and moderate correlations with R² values of approximately 0.71 for Bakau kurap and 0.75 for the ‘Others’ type of mangrove species. An error image was also created to compare the best-fitting spectrum produced by inversion of the LMM with the original observed spectrum; the maximum RMS error was only 5%.
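
The linear mixture model treats each pixel spectrum as a weighted sum of endmember spectra and recovers the class fractions by least squares; the sketch below uses invented endmember values, not the study's IKONOS statistics.

```python
# A minimal sketch of the linear mixture model: each pixel spectrum is modelled as
# a weighted sum of endmember spectra, and the weights (class fractions) are
# recovered by least squares. The endmember values are illustrative.
import numpy as np

def unmix(pixel, endmembers):
    """endmembers: (n_bands, n_classes) matrix; returns fractions summing to ~1."""
    fractions, _, _, _ = np.linalg.lstsq(endmembers, pixel, rcond=None)
    fractions = np.clip(fractions, 0, None)       # crude non-negativity fix
    return fractions / fractions.sum()

endmembers = np.array([[0.10, 0.45, 0.30],        # band 1 reflectance per class
                       [0.08, 0.50, 0.25],        # band 2
                       [0.40, 0.20, 0.35],        # band 3
                       [0.35, 0.15, 0.30]])       # band 4
pixel = 0.6 * endmembers[:, 0] + 0.3 * endmembers[:, 1] + 0.1 * endmembers[:, 2]
print(unmix(pixel, endmembers))                   # roughly [0.6, 0.3, 0.1]
```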

Full Text

July 14, 2007

3(7): Assessing the relationship between shire winter crop yield and seasonal variability of the MODIS NDVI and EVI images, by Fontana, D.C., Potgieter, A.B. & Apan, A.

Australian researchers have been developing robust yield estimation models, based mainly on the crop growth response to water availability during the crop season. However, knowledge of the spatial distribution of yields within and across the production regions can be improved by the use of remote sensing techniques. Images of Moderate Resolution Imaging Spectroradiometer (MODIS) vegetation indices, available since 1999, have the potential to contribute to crop yield estimation. The objective of this study was to analyse the relationship between winter crop yields and the spectral information available in MODIS vegetation index images at the shire level. The study was carried out in the Jondaryan and Pittsworth shires, Queensland, Australia. Five years (2000 to 2004) of 250 m resolution, 16-day composite MODIS Normalized Difference Vegetation Index (NDVI) and Enhanced Vegetation Index (EVI) images were used for the winter crop season (April to November). The seasonal variability of the vegetation index profiles for each crop season, using different regions of interest (cropping masks), was displayed and analysed. Correlation analysis between wheat and barley yield data and MODIS image values was also conducted. The results showed high seasonal variability in the NDVI and EVI profiles, and the EVI values were consistently lower than those of the NDVI. The highest image values were observed in 2003 (in contrast to 2004) and were associated with rainfall amount and distribution. The seasonal variability of the profiles was similar in both shires, with minimum values in June and maximum values at the end of August. The NDVI and EVI images were sensitive to the seasonal variability of the vegetation and exhibited good association (e.g. r = 0.84, r = 0.77) with winter crop yields.

Full Text

June 6, 2007

3(6): Optimization of health facility locations in Osh City, Kyrgyzstan, by Teshebaeva, K.O. & Jain, S.

Basic information regarding the location of existing facilities, their accessibility and development trends, in relation to the socio-economic structure of a city, is needed in order to prepare its development plan. Re-locating an existing service may not be economically feasible, but location-allocation models can be used to identify new potential locations. This study attempts to simulate new potential locations and to evaluate the feasibility of optimization models for planning additional health facilities in Osh city. The results show the potential of the p-median approach for optimizing the location of various public services. The contribution of GIS to the optimization techniques is mainly as a method for data gathering and for visualization of the results. The two technologies can be fully integrated to provide a powerful tool for spatial decision support.
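
A minimal sketch of the p-median formulation referred to above follows; brute-force enumeration is shown only because the toy candidate set is tiny, and the coordinates and demand weights are illustrative.

```python
# A minimal sketch of the p-median problem underlying the study: choose p facility
# sites so that the total demand-weighted distance from every demand point to its
# nearest chosen facility is minimised. Brute force is only feasible for small
# candidate sets; the coordinates and demands are illustrative.
from itertools import combinations
import numpy as np

def p_median(demand_xy, demand_w, candidate_xy, p):
    d = np.linalg.norm(demand_xy[:, None, :] - candidate_xy[None, :, :], axis=2)
    best_cost, best_sites = float("inf"), None
    for sites in combinations(range(len(candidate_xy)), p):
        cost = (demand_w * d[:, list(sites)].min(axis=1)).sum()
        if cost < best_cost:
            best_cost, best_sites = cost, sites
    return best_sites, best_cost

demand_xy = np.array([[0, 0], [1, 4], [5, 5], [6, 1], [3, 3]], dtype=float)
demand_w  = np.array([120, 80, 150, 60, 90], dtype=float)       # e.g. population
candidate_xy = np.array([[1, 1], [4, 4], [5, 2], [2, 3]], dtype=float)
print(p_median(demand_xy, demand_w, candidate_xy, p=2))
```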

Full Text

May 6, 2007

3(5): GIS software selection: a multi-criteria decision making approach, by Eldrandaly, K.

Building a new GIS project is a major investment, and choosing the right GIS software package is critical to the success or failure of such an investment. The problem of selecting the most appropriate GIS software package for a particular GIS project is a multi-criteria decision-making (MCDM) problem. Solving it requires consideration of a comprehensive set of factors and the balancing of multiple objectives in determining the suitability of particular software for building a defined GIS application. In this paper an MCDM technique, the analytic hierarchy process (AHP), is used to assist system developers in selecting the most appropriate GIS software for a specific application. An AHP decision model is formulated and applied to a hypothetical case study to examine its feasibility for solving the GIS software selection problem. The results indicate that the proposed model can improve the decision-making process and reduce the time taken to select GIS software.
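
A minimal sketch of the AHP weighting step follows: criterion weights come from the principal eigenvector of a pairwise comparison matrix, with a consistency ratio as a sanity check; the three criteria and judgement values are illustrative, not the paper's model.

```python
# A minimal sketch of the analytic hierarchy process (AHP): criterion weights are
# derived from the principal eigenvector of a pairwise comparison matrix, and a
# consistency ratio checks that the judgements are acceptably coherent (< 0.1).
# The three criteria and their comparisons are illustrative assumptions.
import numpy as np

def ahp_weights(pairwise):
    eigvals, eigvecs = np.linalg.eig(pairwise)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()
    n = pairwise.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)              # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]               # Saaty's random index
    return w, ci / ri                                  # weights, consistency ratio

# Pairwise judgements: cost vs functionality vs vendor support, on Saaty's 1-9 scale.
pairwise = np.array([[1.0, 3.0, 5.0],
                     [1/3, 1.0, 3.0],
                     [1/5, 1/3, 1.0]])
weights, cr = ahp_weights(pairwise)
print(weights, cr)                                    # CR well below 0.1 here
```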

Full Text

April 16, 2007

3(4): GIS-based spatial analysis of child pedestrian accidents near primary schools in Montréal, Canada, by Cloutier, M., Apparicio, P. & Thouez, J.

In Montréal, Canada, accidents affecting child pedestrians (5 to 14 years old) remained almost constant from 1994 to 1999 despite numerous prevention measures. Moreover, the elementary public school environment has barely been taken into account by past and present research on factors affecting the risk of accident, even though children attend school most weekdays. We argue here, therefore, that integrating the local environment into the spatial analysis of child pedestrian accidents could help to reduce them. Accordingly, we integrated socio-economic and environmental data into a geographic information system in order to perform a geographically weighted regression. The results demonstrate that the average network distance separating an accident from the closest school is less than 500 metres, confirming a relationship of proximity between these two locations. The results also demonstrate the relevance of adding a spatial dimension to the regression model by suggesting that prevention initiatives should take into account the particular nature of each neighbourhood so that more relevant risk factors can be targeted.

Full Text

March 25, 2007

3(3): The impact of neighbourhood size on the accuracy of cellular automata-based urban modelling, by Liu, Y.

Cellular automata are discrete dynamic models in which behaviour is specified in terms of local relations. This technique has recently been advantageously applied to modelling of the urban development process. However, the behaviour of the model is affected by spatial scale, including cell size and neighbourhood extent. Therefore, it is important to examine the impacts of various neighbourhood scales on the model’s behaviour and outcome. In this paper we configured a cellular automata model of urban growth in Sydney, Australia, using three different neighbourhood scales: a small neighbourhood of 1.5 cells radius, a moderate neighbourhood of 2.5 cells radius and a large neighbourhood of 3.5 cells radius, all with a fixed cell size of 250 metres. The moderate neighbourhood scale of 2.5 cells radius was found to best reflect those local mechanisms that have the most direct impact on urban development in Sydney. Hence this paper provides a useful reference in the search for a neighbourhood size that is suitable for cellular automata-based modelling of the processes of urban development.
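
The neighbourhood radii quoted above translate into circular cell windows; a minimal sketch of counting urban neighbours within a given radius, using a convolution over a toy grid, is shown below.

```python
# A minimal sketch of the neighbourhood calculation at the heart of a cellular
# automata urban model: for a given radius (1.5, 2.5 or 3.5 cells), count how many
# cells within that distance of each cell are already urban. The grid is illustrative.
import numpy as np
from scipy.ndimage import convolve

def urban_neighbours(urban, radius):
    """urban: boolean grid; returns the count of urban cells within `radius`."""
    r = int(np.floor(radius))
    yy, xx = np.mgrid[-r:r + 1, -r:r + 1]
    kernel = (np.hypot(yy, xx) <= radius).astype(int)   # circular neighbourhood
    kernel[r, r] = 0                                     # exclude the cell itself
    return convolve(urban.astype(int), kernel, mode="constant", cval=0)

urban = np.zeros((6, 6), dtype=bool)
urban[2:4, 2:4] = True
print(urban_neighbours(urban, radius=1.5))
```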

Full Text

February 9, 2007

3(2): Prioritising areas for dugong conservation in a marine protected area using a spatially explicit population model, by Grech, A.& Marsh, H.

The Great Barrier Reef World Heritage Area (GBRWHA) covers approximately 348,000 km², making it the world’s largest World Heritage Area / marine protected area complex. Dugongs (Dugong dugon) inhabit the shallow protected waters of the GBRWHA, and were an explicit reason for the region’s World Heritage listing. To manage dugongs effectively in the GBRWHA, it is critical to understand their spatial relationship with their environment and the human activities that threaten them. We demonstrate how a spatially explicit dugong population model can be used to prioritise conservation initiatives for dugongs in the GBRWHA. We used information collected from dugong aerial surveys in conjunction with geostatistical techniques, including universal kriging, to develop a model of dugong distribution and abundance. After completing the model, we conducted frequency analyses to categorise relative dugong density and distribution and to identify areas of low, medium or high conservation value. As dugongs extend over a wide distributional range, prioritising areas of conservation value has the potential to be an important basis for administering management resources. We conclude that spatially explicit population models are an effective component of species conservation management, particularly for species that range over large, complex and dynamic regions.

Full Text

January 6, 2007

3(1): A methodology for spatial fuzzy reliability analysis, by Simonovic

Natural hazard risk assessment requires quantification of uncertainty that is spatially and temporally variable, yet the spatial variability of risk has rarely been considered in past research. This paper presents a new methodology to capture the spatial uncertainty, as well as the subjectivity, associated with natural hazard risk analysis. Fuzzy set theory is integrated with a geographic information system (GIS) in the development of the methodology for spatial reliability analysis. The paper explores the spatial extension of three fuzzy reliability indices: (1) combined reliability-vulnerability, (2) robustness, and (3) resiliency. Fuzzy risk and reliability are quantified within a GIS framework, and maps showing the spatial variability of the three fuzzy indices are developed. The proposed methodology is applied to flood hazard management. It was found that spatial fuzzy reliability analysis provides additional information to flood managers regarding the spatial variability of flood risk and aids in the development of sustainable flood management options.

Full Text

December 20, 2006

2(3): Editorial – Strategic thinking for improved regional planning and natural resources management, by Victor A. Sposito, Ray Wyatt & Christopher J. Pettit

This special theme issue of Applied GIS further widens the scope of our journal by edging it closer to the “policy support” pole of the GIScience continuum. It contains several creative articles showing how practicing planners of natural resources and the environment have managed to exploit the power of GIS and remote sensing in order to improve strategic thinking. More specifically, this issue showcases innovative GIS-related work being led by the Department of Primary Industries (DPI) in the State of Victoria, Australia. All authors either work for, or collaborate with, this department’s research division, which is known as Primary Industries Research Victoria (PIRVic). Moreover, in the spirit of providing informed decision support for rural and regional policy makers, the selected articles describe and explain actual working projects that have usually generated tangible results already. This editorial begins, therefore, by outlining the basic philosophy of PIRVic. It then describes a “systems framework for spatial decision making” in which each paper can be situated. Finally, we briefly describe the nature and context of each article.

Full Text

December 8, 2006

Applying an index of adaptive capacity to climate change in north-western Victoria, Australia, by Remy Sietchiping

Climate change calls for strategic planning that builds resilience in vulnerable areas to manage the associated risks. This paper discusses how adaptive various communities and industries are to climate change in the North West of Victoria (also known as the Victorian wheat belt), Australia. Indicators of adaptive capacity for communities and industries are identified, along with the importance of key drivers – such as government policies, expert advice and empirical evidence – in developing this capacity. The paper also incorporates input from key regional groups, as well as current knowledge on the adaptability of regional communities to climate change, across three major themes: socio-cultural, economic, and institutional/infrastructure. Each of these major themes has associated indicators, which in turn have their own suites of measures, all contributing to the overall adaptive capacity and to the spatial variability of these capacities. A Geographic Information System is used to collect and analyse the data and to represent the indicators and indices spatially. Workshop participants used their ‘expert judgment’ to assess and weight the indicators, measures and themes. The stakeholders’ participatory assessment, the quantification of diversified data and interests, and the importance of multiple policy outcomes make the findings locally relevant. We find that capacity and preparedness to adapt to climate change vary substantially across communities and across different parts of the grains industry.

Full Text

November 22, 2006

Spatial (GIS-based) decision support system for the Westernport region, by Claudia Pelizaro & David McDonald

This paper presents the conceptual design of a spatial decision support system (SDSS) proposed for Victoria’s Westernport region that aims at the sustainable and integrated (whole-of-catchment) management of regional natural resources. The solution integrates a range of approaches including GIS technology, a scenario management tool, state-of-the-art terrestrial and marine models, environmental management strategy evaluation and multi-criteria techniques. Traditionally, GIS are key to (spatial) data management but lack problem-domain modelling capability, which means that additional processing or analytical capabilities are needed to extend their functionality for decision making. The Westernport SDSS builds upon a GIS but draws on models and data-processing systems, and interacts with other parts of an overall information system, to support decision making. The system utilises a number of models that are interlinked through a cascade of their results: put simply, one set of model results feeds into the next in a modelling chain. The system will derive a set of socio-economic-environmental measures (performance indicators), such as land use, nutrient and sediment concentrations in water (water quality measures), and other relevant indicators for coastal and bay ecosystems. Users will then be able to systematically compare alternative natural resource management plans and strategies in the light of multiple and possibly conflicting criteria. By integrating relevant models within a structured framework, the system will promote transparency in policy development and natural resources management.

Full Text

November 2, 2006

Using GIS in Landscape Visual Quality Assessment, by Yingxin Wu, Ian Bishop, Hemayet Hossain & Victor Sposito

Landscape Visual Quality (LVQ) assessment has become a core component of landscape architecture, landscape planning and spatial planning. Different approaches for assessing the scenic qualities of landscapes have been developed in the last few decades, with two contrasting paradigms – the expert/design approach and the community perception-based approach – dominating methodology development. In the expert/design approach landscape visual quality is defined by biological and physical (or biophysical) values, while the perception-based approach emphasises the human (subjective) view of the landscape. This paper outlines a methodology combining expert and perception approaches to assess LVQ. The application of information technology to landscape analysis dates back to the early work in computer-based mapping, and much of the early work on what became Geographic Information Systems (GIS) and three-dimensional landscape modelling was carried out by landscape architects and landscape planners. In recent years, significant advances in computers and GIS have enabled analysis of vast amounts of spatial information, which is the foundation of the methodology described in this paper. The methodology is explained in detail through its application to assess the LVQ of the Mornington Peninsula Shire, Melbourne, in the State of Victoria, Australia. There are six stages in the procedure: viewpoint selection; calculation of factor indices based on visual exposure modelling; landscape preference rating; use of statistical methods (such as a multiple regression model) to determine the key predictors of LVQ; application of the formula thus generated to assess the LVQ of viewpoints; and use of spatial interpolation to map LVQ across the study area. The results are discussed in the last section of the paper with reference to key methodological issues. They show that perceived LVQ increases with the area of water visible, the degree of wilderness, the percentage of natural vegetation and the presence of hills; it decreases with the presence of perceived negative human-made elements such as roads and buildings.

Full Text

October 21, 2006

Using GIS and a land use impact model to assess risk of soil erosion in West Gippsland, by Joanne McNeill, Richard MacEwan & Doug Crawford

The Land Use Impact Model (LUIM) is a spatially explicit tool developed by the Department of Primary Industries Victoria and the University of Queensland. The LUIM has an aspatial component that incorporates knowledge of relationships between landscape characteristics and land management practices, and a spatial component that uses a GIS to map where these relationships exist or are likely to exist. These data are linked in a risk assessment framework by using a Bayesian belief network (BBN) within the LUIM. The ‘soft’ data, sourced through workshops with experts and regional stakeholders, are combined with the ‘hard’ biophysical data in this network, so that uncertainties or probabilities in the data are handled in the BBN. The LUIM application described in this paper shows how it was used to inform the prioritisation of actions for a Soil Erosion Management Plan in West Gippsland. Using the LUIM, maps were produced identifying areas in the West Gippsland CMA region at risk of degradation from six soil erosion processes under current land management regimes. The risk maps were used to identify ‘high value’ assets to be protected from further degradation as part of the soil erosion management plan. The LUIM fills a niche in the decision-making processes for catchment management. It has the flexibility to be used at a range of scales with whatever data (hard and soft) may be available. By combining expert opinion with hard data in a spatially explicit risk framework, priority areas can be identified and knowledge gaps highlighted. The LUIM is also adaptable to any issue with a spatial context where natural resource assets may be threatened by degrading processes.

Full Text

October 6, 2006

GIS-based modelling of regional conservation significance, by Victor A Sposito & Elizabeth Morse-McNabb

This paper explains an approach for appraising the extent and quality of native vegetation and identifying significant habitats at strategic regional and local levels. The Vegetation and Habitat Conservation Significance Framework (hereafter the framework) is formulated through a planning process which includes seven stages, from defining the ‘Purpose of the study’ (Stage 1) to ‘Implementation and monitoring’ (Stage 7). The cornerstone of the framework is the formulation, in Stage 3, of a Regional Habitat Significance Model which integrates the Analytic Hierarchy Process (AHP) with a Geographic Information System (GIS). An expert workshop (Stage 4) is an integral part of model construction and should comprise 10 to 15 people, including environmental and land use scientists, ecologists, planners and landscape architects with good knowledge of vegetation, biodiversity and habitat matters, as well as relevant decision makers. The experts are provided with all the data sets generated in Stage 2, and the limitations and advantages of each data set are discussed. The initial construction of the model (undertaken at Stage 3) is validated, or modified, and then its components are weighted through consensus of the experts. The GIS platform permits the ongoing improvement and input of the latest relevant information and the preparation of new assessments in a cyclical planning process. The method is explained mainly by reference to its application in the rural shire of Macedon Ranges, State of Victoria, Australia.

Full Text

September 17, 2006

Evolutionary computing for optimizing a region’s distribution of agricultural production, by Ray Wyatt & Hemayet Hossain

This paper describes a GIS-based software package that incorporates a ‘genetic algorithm’ to optimize the distribution of crops across any region. Such optimization is driven by maps of where one finds the most suitable conditions for each crop, or by each crop’s current local yields, market price, market demand or transport costs. The program’s output is the crop distribution which achieves maximum economic return, minimal environmental damage, optimal fit with either present or post-climatic-change soil suitability, or minimum transport cost. The package can be implemented within any region where the necessary input data exist in ASCII and image format, and it incorporates a number of features that make it transparent and flexible. Such user-friendliness encourages even laypersons to experiment with the genetic algorithm’s parameters, almost as if they were playing a computer game, to see whether or not they can find an even better crop distribution than they found previously. The package also functions as a useful exploratory tool for seeing how current patterns would have to be modified if a more optimal crop distribution were achieved, thereby generating decision-support insights into possible repercussions of tampering with the status quo. The package’s functionality is demonstrated through a case study implementation within the agricultural region of South Gippsland, Australia.
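
A minimal sketch of the genetic-algorithm idea (not the authors' package) follows: candidate solutions assign one crop per cell, fitness is total suitability, and solutions evolve by selection, crossover and mutation; population size, rates and the random suitability maps are assumptions.

```python
# A minimal sketch of a genetic algorithm for crop allocation: a candidate solution
# assigns one crop to every grid cell, fitness is the total suitability of that
# assignment, and better assignments are evolved over generations. Population size,
# rates and the random suitability maps are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
n_cells, n_crops = 60, 4
suitability = rng.random((n_cells, n_crops))      # one suitability map per crop

def fitness(assignment):
    return suitability[np.arange(n_cells), assignment].sum()

def evolve(pop_size=40, generations=200, mutation_rate=0.02):
    pop = rng.integers(0, n_crops, size=(pop_size, n_cells))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        order = np.argsort(scores)[::-1]
        parents = pop[order[:pop_size // 2]]              # keep the fitter half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_cells)                 # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            mutate = rng.random(n_cells) < mutation_rate   # random re-assignment
            child[mutate] = rng.integers(0, n_crops, size=mutate.sum())
            children.append(child)
        pop = np.vstack([parents, children])
    best = max(pop, key=fitness)
    return best, fitness(best)

best, score = evolve()
print(score, suitability.max(axis=1).sum())        # GA result vs the true optimum
```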

Full Text

September 8, 2006

Geographical visualization: A participatory planning support tool for imagining landscape futures, by Christopher J Pettit, William Cartwright & Michael Berry

The geographical visualization of urban and regional landscapes is a powerful technique for engaging actors involved in decision-making processes. The tools developed can empower professionals and citizens alike to make better-informed decisions. The paper reports on collaborative research being undertaken to develop and apply a range of 3D geographical visualization products to enhance both planning and scientific communication processes. In this paper we discuss some developments and applications of 3D geographical visualization tools and the work being undertaken to evaluate the effectiveness of such tools for solving spatial planning problems. The paper concludes by discussing the lessons learnt in undertaking a cross-disciplinary approach to developing and applying landscape visualization tools and offers some future research directions with respect to technical specifications and the usefulness of geographical visualization as a participatory planning support tool.

Full Text

August 28, 2006

A strategic approach to climate change impacts and adaptation, by Victor A. Sposito

This paper describes a strategic approach to examining potential climate change impacts on agricultural, forestry and regional/rural resources. It outlines a holistic framework, consistent with new thinking on preparing society for climate change, that links impacts with adaptation actions. The assessment of the potential impacts is a logical extension of the land resource evaluation models described in the paper by Hossain et al. (2006), since it links land suitability analysis modelling, developed by Primary Industries Research Victoria (PIRVic), with climate change impacts modelling developed by the Commonwealth Scientific and Industrial Research Organisation (CSIRO). The framework is illustrated by its applications in two regions in the State of Victoria, Australia. The project described in this paper is part of a joint national and state effort in Australia to improve the knowledge on, and tools to adapt to, climate change and climate variability.

Full Text

August 6, 2006

Sustainable land resource assessment in regional and urban systems, by Hemayet Hossain, Victor Sposito & Carys Evans

This paper reports on two models that have been developed using ArcView Model Builder to map suitability for agricultural and urban land uses. Both models use a GIS-based multiple criteria modelling approach combining empirical data with experts’ judgement. The Agricultural Suitability Model considers soil, landscape and climate criteria, whereas the Urban Buildability Model assesses biophysical, socio-economic and spatial phenomena to define locational suitability from a sustainable development perspective. These models have been used to help develop strategic development plans for several rural shires in Victoria, Australia. One of the major objectives of these shire development strategies was to protect good agricultural land from urban development. More than 30 agricultural land suitability models have been developed to address particular commodities of interest to these shires. Shires where these models have been applied so far include South Gippsland.

Full Text

July 18, 2006

2(2): Editorial – A brief history of metropolitan planning in Melbourne, Australia, by Jun Tsutsumi & Ray Wyatt

… the history of Melbourne’s metropolitan planning, which we will examine here on the basis that one needs to look backwards to see where a city is currently positioned, as well as the direction in which it could go in the future. We will then highlight some of the insights provided by the theme papers, along with some of their policy-relevant implications.

Full Text

July 2, 2006

Time series analysis of the skyline and employment changes in the CBD of Melbourne, by Jun Tsutsumi & Kevin O'Connor

This paper covers historical and micro-level analyses of floor space and employment in the CBD of Melbourne during the last two decades. The paper also focuses on the time-series patterns and processes of high-rise building provision in the CBD. The CBD has experienced very complicated changes over the last two decades. While particular types of urban functions, such as finance and insurance offices and many retail activities, have been dispersed to the suburbs, newly emerged activities have replaced the old and traditional ones. Despite the growth of suburban cores (office and retail), the CBD of Melbourne has retained its strong centrality, particularly through its role as the main location of office activity. Historical and micro-level viewpoints provide a new understanding of the metropolitan area. The key question must be ‘Why is the role of the CBD of Melbourne still so strong, given substantial dispersal of population and economic activity to suburban locations?’

Full Text

June 26, 2006

The development of diverse suburban activity centres in Melbourne, Australia: Planning policies and retail locations, by Hiroki Yamashita, Tadashi Fujii & Satoru Itoh

Sustainable urban form presents the most critical problem facing most metropolitan areas following the suburbanization of urban functions in the 20th century. Melbourne, Australia’s second largest metropolitan area, has experienced motorization, creating a dispersed urban form, but has maintained its transit system and attempted to construct a compact suburban centre network. In this article, the characteristics of these suburban centres in Melbourne are analysed in detail. As a consequence, the centres are classified into the following four types: (a) traditional centres, mainly in the inner suburbs; (b) stand-alone large suburban shopping centres; (c) town centres formed around stations with central facilities in outer suburbs; and (d) newly constructed centres resulting from post-1980s planning policies and built near stations in outer suburbs. The paper also discusses the opportunities and problems generated by this diverse range of centres in the context of the planning scheme, which seeks to combine compact centres with a public transit network.

Full Text

June 12, 2006

The changing socio-economic structure of Dallas, US: The new Light Rail Transit lines and related land use change, by Yuichi Ishikawa & Jun Tsutsumi

This paper explores a new phase of urban development based on a case study of Dallas, Texas. A key feature of North American metropolitan areas is the relatively weak CBD core. Many social issues, for example a growing share of immigrants and of non-English-speaking residents, often push the major urban functions out toward the suburbs, and as a result many high-density office complexes are found separate and far from the CBD. In this paper, the authors examine the new efforts by city councils and other related organizations to reorganize suburbanized functions around a new public transportation system, the light rail transit (LRT). After the completion of the LRT in 1996, a new pattern of commuter flows and increased urban development close to the newly opened stations emerged.

Full Text

May 16, 2006

A comparative study of metropolitan multi-nucleation: Suburban centres and commuter flows within the metropolitan areas of Atlanta, USA, and Melbourne, Australia, by Tadashi Fujii, Hiroki Yamashita & Satoru Itoh

The suburbanization of various functions has generated “Suburban Downtowns” or typical “Edge Cities” in Atlanta, Georgia, USA. On the other hand, Melbourne, the capital city of the state of Victoria, Australia, has managed to control its suburban centres. The CBD in Melbourne still retains comprehensive central functions for the metropolitan area, but large scale shopping malls have also been developed in the suburbs. The regional structures of these two metropolitan areas, prima facie, seem different. However, in this paper, we would like to highlight a common feature present in both areas, based on our examination of commuter flows. This phenomenon involves an interdependent, cross-suburb flow structure, which is emerging in many urban areas in the 21st century.

Full Text

May 9, 2006

GIS-based evaluation of Personal Rapid Transit (PRT) for reducing car dependence within Melbourne, Australia, by Ray Wyatt

Metropolitan Melbourne’s car dependence continues to grow. This is despite major expenditure on trams, trains and buses, as well as attempts to alter land use patterns to make suburbs easier to service by public transport. Melbournians are simply reluctant to give up the flexibility, convenience, speed and privacy advantages that cars still hold over conventional public transport modes. Accordingly, we here resurrect a less conventional public transport mode, first mooted during the 1960s and known as Personal Rapid Transit (PRT). This alternative incorporates many of the advantages of the private motor car, yet it is cheaper, faster, less polluting, quieter, safer, more convenient and less space consuming. We will use a GIS to quantitatively evaluate PRT’s feasibility in Melbourne, and eventually conclude that it might be viable provided that it is implemented incrementally, and in a way that exploits current transport infrastructure. That is, we will exploit a modern GIS’s ability to measure the lengths of linear features, to buffer, to intersect, to plot standard-deviation ellipses and to statistically analyse attribute tables in order to quantify PRT’s costs, as well as some of its benefits, at different localities, thereby suggesting places where PRT might best be trialled.
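The GIS operations listed above (buffering, intersecting, measuring the lengths of linear features) can be sketched with plain geometry objects. The sketch below uses shapely and toy coordinates for a hypothetical PRT alignment and an existing rail line, purely as an illustration of the kind of measurement involved, not the study's actual data.

    from shapely.geometry import LineString

    # Toy geometries in metres: a hypothetical PRT alignment and an existing rail line
    prt_line = LineString([(0, 0), (2000, 500), (4000, 500)])
    rail_line = LineString([(0, 400), (4000, 600)])

    corridor = prt_line.buffer(300)             # 300 m service corridor
    overlap = rail_line.intersection(corridor)  # rail segments inside the corridor

    print("PRT length (m):", round(prt_line.length))
    print("rail length inside corridor (m):", round(overlap.length))
    print("corridor area (ha):", round(corridor.area / 10000, 1))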

Full Text

April 26, 2006

Accessibility analysis for housing development in Singapore with GIS and multi-criteria analysis methods, by Xuan Zhu, Suxia Liu & Mun-ching Yeow

This paper presents a study on accessibility analysis for public housing development in Choa Chu Kang/Bukit Panjang area, Singapore, using geographical information systems (GIS) and multi-criteria analysis methods. GIS is used to measure accessibility to different types of facilities and amenities, while a multi-criteria method is employed to weight buyers’ preferences about the importance of each type of accessibility and to derive the overall attractiveness of each location for housing development in terms of these multiple types of accessibility. The results could be used to assist the housing planning and development authorities to prioritise sites where public housing should be developed in the study area.
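A small sketch of the kind of calculation the paper combines: distance-based accessibility scores for candidate sites, weighted by buyer preferences to give an overall attractiveness ranking. The distances, facility types and weights below are invented for illustration.

    import numpy as np

    # Straight-line distances (km) from three candidate sites to the nearest
    # facility of each type: [MRT station, school, shopping centre] (hypothetical)
    dist = np.array([[0.4, 1.2, 0.8],
                     [1.5, 0.3, 2.0],
                     [0.9, 0.9, 0.5]])

    # Convert to accessibility scores in [0, 1]: closer means more accessible
    acc = 1.0 - dist / dist.max(axis=0)

    # Buyer-preference weights from a multi-criteria survey (sum to 1)
    weights = np.array([0.5, 0.3, 0.2])

    overall = acc @ weights
    ranking = np.argsort(-overall)
    print("overall attractiveness:", overall.round(2))
    print("site ranking (best first):", ranking)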

Full Text

April 6, 2006

The spatial analysis of spectral data: extracting the neglected data, by Brian Lees

Remotely sensed data are a key input to GIS-based spatial decision support systems for land cover and land use application areas. One of the major sources of error in the input of processed remotely sensed data to GIS is in the process of classification. Particularly important is the degradation of the data from the interval to nominal level of measurement. This is less significant in cultural landscapes where boundaries predominate, but it becomes an important source of error in natural, and disturbed natural, environments where gradients exist. Use of the Gi* local statistic as an alternative approach to processing remotely sensed data proved very successful, replicating the level of discrimination achieved by conventional classification and field labelling in a much shorter time, whilst avoiding the errors associated with conversion of the data from the interval to nominal level of measurement.
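For readers unfamiliar with the statistic, a compact sketch of the Getis-Ord Gi* computed over a moving window of a single band is given below. This follows the standard Gi* formulation with binary weights and is not the paper's exact processing chain; the random image is a stand-in for real spectral data.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def local_gi_star(band, size=3):
        """Getis-Ord Gi* for each pixel using a binary square window (size x size).
        Edge pixels are approximated by the filter's border handling."""
        x = band.astype(float)
        n = x.size
        xbar, s = x.mean(), x.std()
        k = size * size                                   # number of binary weights
        window_sum = uniform_filter(x, size=size, mode="nearest") * k
        numerator = window_sum - xbar * k
        denominator = s * np.sqrt((n * k - k * k) / (n - 1))
        return numerator / denominator

    # Example on a random single-band image standing in for real spectral data
    z_scores = local_gi_star(np.random.rand(200, 200), size=5)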

Full Text

March 2, 2006

2(1): Editorial, by Jim Peterson

Authors from many parts of the world are represented in this, the first issue for volume two of Applied GIS. Indeed, two of the papers have authors from across the globe, suggesting that international collaboration is proving very productive.

Full Text

February 20, 2006

Validation and sensitivity analysis of a mineral potential model using favourability functions, by Tsehaie Woldai, Alberto Pistocchi & Sharad Master

An area in the Magondi Belt, Zimbabwe, has been chosen for mineral potential mapping using the favourability functions approach. The available datasets, comprising an old geological map, a detailed airborne total magnetic field survey, and geochemical samples at the nodes of an exploration grid, have been integrated using seven different inference techniques through the joint probability function under the conditional independence hypothesis. A geological conceptual model has been adopted for the representation of mineralization occurrence, in order to select appropriate geospatial indicators of mineralization, while existing mines in the area have been used as a training set for the model. Among the different integration techniques tested, some have proven to be robust in correctly predicting all the known mine sites. It has thus been possible to draw favourability maps using existing data, which indicate the most promising areas for exploration and detailed mapping efforts for mineral exploitation. Even more importantly, sensitivity analysis of the favourability functions made it possible to evaluate the most important factors controlling mineralization occurrence, and thus those worth additional future investigation.
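To illustrate the conditional-independence combination mentioned above, here is a minimal weights-of-evidence-style sketch in which binary evidence layers are combined into a posterior probability of mineralization. The prior and the per-layer likelihoods are invented numbers, not values from the Magondi Belt study.

    import numpy as np

    # Prior probability of mineralization in an arbitrary cell (hypothetical)
    prior = 0.01
    prior_odds = prior / (1 - prior)

    # For each binary evidence layer (e.g. magnetic anomaly, favourable lithology,
    # geochemical anomaly): P(evidence | deposit) and P(evidence | no deposit).
    p_e_given_d = np.array([0.8, 0.7, 0.6])
    p_e_given_notd = np.array([0.2, 0.3, 0.4])

    def posterior(evidence_present):
        """Posterior probability of a deposit given a 0/1 vector of evidence layers,
        assuming conditional independence of the layers given the deposit state."""
        lr = np.where(evidence_present,
                      p_e_given_d / p_e_given_notd,              # layer present
                      (1 - p_e_given_d) / (1 - p_e_given_notd))  # layer absent
        odds = prior_odds * lr.prod()
        return odds / (1 + odds)

    print(posterior(np.array([1, 1, 1])))   # all indicators present
    print(posterior(np.array([0, 0, 0])))   # none present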

Full Text

February 16, 2006

Ore grade estimation of a limestone deposit in India using an Artificial Neural Network, by S. Chatterjee, A. Bhattacherjee, B. Samanta & S. K. Pal

This study describes a method used to improve ore grade estimation in a limestone deposit in India. Ore grade estimation for the limestone deposit was complicated by the complex lithological structure of the deposit. The erratic nature of the deposit and the unavailability of adequate samples for each of the lithological units made standard geostatistical methods of capturing the spatial variation of the deposit inadequate. This paper describes an attempt to improve the ore grade estimation through the use of a feed-forward neural network (NN) model. The NN model incorporated the spatial location as well as the lithological information for modeling of the ore body. The network was made up of three layers: an input, an output and a hidden layer. The input layer consisted of three spatial coordinates (x, y and z) and nine lithotypes. The output layer comprised all the grade attributes of limestone ore, including silica (SiO2), alumina (Al2O3), calcium oxide (CaO) and ferric oxide (Fe2O3). To justify the use of the NN in the deposit, a comparative evaluation between the NN method and ordinary kriging was performed. This evaluation demonstrated that the NN model decisively outperformed the kriging model. After the superiority of the NN model had been established, it was used to predict the grades at an unknown grid location. Prior to constructing the grade maps, lithological maps of the deposit at the unknown grid were prepared. These lithological maps were generated using indicator kriging. The authors conclude by suggesting that the method described in this paper could be used for grade-control planning in ore deposits.
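A hedged sketch of the input/output structure described (three coordinates plus nine one-hot lithotype codes in, four oxide grades out), using scikit-learn's MLPRegressor rather than the authors' own network, and trained on synthetic data purely to show the shapes involved.

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)
    n = 500
    coords = rng.uniform(0, 1000, (n, 3))            # x, y, z of drillhole samples
    litho = np.eye(9)[rng.integers(0, 9, n)]         # one-hot codes for 9 lithotypes
    X = np.hstack([coords, litho])                   # 12 inputs, as in the paper
    y = rng.uniform(0, 50, (n, 4))                   # SiO2, Al2O3, CaO, Fe2O3 (synthetic)

    scaler = StandardScaler().fit(X)
    model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
    model.fit(scaler.transform(X), y)

    # Predict the four grade attributes at an unsampled grid node
    new_point = np.hstack([[500.0, 500.0, 120.0], np.eye(9)[3]]).reshape(1, -1)
    print(model.predict(scaler.transform(new_point)))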

Full Text

February 9, 2006

The effect of cell resolution on depressions in Digital Elevation Models, by Paul A. Zandbergen

A proper understanding of the occurrence of depressions is necessary to understand how they affect the processing of a Digital Elevation Model (DEM) for hydrological analysis. While the effect of DEM cell resolution on common terrain derivatives has been well established, this is not well understood for depressions. The more widespread availability of high resolution DEMs derived through Light Detection and Ranging (LIDAR) technologies presents new challenges and opportunities for the characterization of depressions. A 6-meter LIDAR DEM for a study watershed in North Carolina was used to determine the effect of DEM cell resolution on the occurrence of depressions. The number of depressions was found to increase with smaller cell sizes, following an inverse power relationship. Scale-dependency was also found for the average depression surface area, average depression volume, total depression area and total depression volume. Results indicate that for this study area the extent of depressions in terms of surface area and volume is at a minimum for cell sizes around 30 to 61 meters. In this resolution range there will still be many artificial depressions, but their presence is less than at lower or higher resolutions. At finer scales, the (small) vertical error of the LIDAR DEM needs to be considered and introduces a large number of small and shallow artificial depressions. At coarser scales, the terrain variability is no longer reliably represented and a substantial number of large and sometimes deep artificial depressions are created. The results presented here support the conclusion that the use of the highest resolution and most accurate data, such as LIDAR-derived DEMs, may not result in the most reliable estimates of terrain derivatives unless proper consideration is given to the scale-dependency of the parameters being studied.
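The inverse power relationship reported above can be checked with a simple log-log regression of depression count against cell size, N = a · s^b with b expected to be negative. The counts below are invented for illustration; only the fitting procedure is shown.

    import numpy as np

    # Hypothetical depression counts at increasing cell sizes (metres)
    cell_size = np.array([6, 12, 24, 48, 96], dtype=float)
    n_depressions = np.array([15200, 4100, 1150, 300, 85], dtype=float)

    # Fit N = a * cell_size**b by linear regression in log-log space
    b, log_a = np.polyfit(np.log(cell_size), np.log(n_depressions), 1)
    a = np.exp(log_a)
    print(f"N ~ {a:.0f} * cell_size^({b:.2f})")   # exponent b should come out negative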

Full Text

January 6, 2006

Spatial data compression and denoising via wavelet transformation, by Biswajeet Pradhan, Sandeep Kumar, Shattri Mansor, Abdul Rahman, Ramli Abdul, Rashid B. & Mohamed Sharif

A new interpolation wavelet filter for TIN data compression has been applied in two steps, namely splitting and lifting. In the splitting step, a triangle is divided into several sub-triangles, and the lifting step is then used to ‘modify’ the point values (elevations, or point coordinates for geometry) after the splitting. This data set is then compressed at the desired locations by using second-generation wavelets: scalar wavelets constructed by using a lifting scheme. Application of the compressed data compares favourably with results derived using the original (and much larger) TIN data set.
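The split-and-lift idea is easiest to see in one dimension. The sketch below is a generic one-level lifting transform (split into even and odd samples, predict the odd samples from their even neighbours, update the even samples), with perfect reconstruction and a thresholding step standing in for compression; it is a one-dimensional analogue, not the paper's TIN filter.

    import numpy as np

    def lifting_forward(signal):
        """One level of a simple lifting wavelet transform (split, predict, update).
        The signal length must be even; periodic wrapping is used at the ends."""
        even, odd = signal[0::2].astype(float), signal[1::2].astype(float)
        # Predict: estimate odd samples from their even neighbours; keep the detail
        detail = odd - 0.5 * (even + np.roll(even, -1))
        # Update: adjust the even samples so the coarse signal preserves the mean
        coarse = even + 0.25 * (detail + np.roll(detail, 1))
        return coarse, detail

    def lifting_inverse(coarse, detail):
        even = coarse - 0.25 * (detail + np.roll(detail, 1))
        odd = detail + 0.5 * (even + np.roll(even, -1))
        out = np.empty(even.size + odd.size)
        out[0::2], out[1::2] = even, odd
        return out

    z = np.array([10, 12, 15, 14, 13, 11, 9, 8], dtype=float)   # toy elevations
    coarse, detail = lifting_forward(z)
    assert np.allclose(lifting_inverse(coarse, detail), z)      # perfect reconstruction
    # Compression: drop small detail coefficients before reconstruction
    detail[np.abs(detail) < 1.0] = 0.0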

Full Text

January 5, 2006

Spatial information for Integrated Coastal Zone Management (ICZM): An example from the artificial Entrance Channel of the Gippsland Lakes, Australia, by Peter Wheeler

Since 1889, an artificial entrance channel, cut through the swash-aligned Holocene sandy outer barrier system known as the Ninety Mile Beach, has provided shipping access between the Gippsland Lakes and Bass Strait. From digital spatial data analysis using GIS-built versions of the hydrographic chart archive (spanning the years 1889-2005) accumulated by port authorities, both visual and net volumetric analyses of time-series Entrance Channel morphological changes have been made. Since the mid-1970s, channel sedimentation has imposed rising demands upon maintenance dredging regimes, a tendency that parallels the decline in streamflow discharge from the Gippsland Lakes catchment and correlates with the introduction of specific sediment management regimes. Clear public policy implications emerge. They refer to the continued need for provision of a navigable shipping channel that will also be of sufficient depth to allow the escape to Bass Strait of any future catchment floodwaters. It is argued that adoption of a spatial-data-intensive ‘Coastal Action Plan’ approach should be considered in the future, to allow already established Victorian State Integrated Coastal Zone Management (ICZM) policy to be brought into practice.
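Net volumetric change between two gridded bathymetric surfaces is simply the difference grid summed and multiplied by the cell area. A small sketch with synthetic depth grids (not the port authority charts):

    import numpy as np

    cell = 10.0                                    # grid resolution in metres
    rng = np.random.default_rng(2)
    depth_1998 = rng.uniform(2, 8, (50, 50))       # synthetic earlier-survey depths (m)
    depth_2005 = depth_1998 + rng.normal(0, 0.3, (50, 50))

    dz = depth_2005 - depth_1998                   # positive = deepening (erosion)
    net_volume = dz.sum() * cell * cell            # net volumetric change (m^3)
    accretion = -dz[dz < 0].sum() * cell * cell    # total shoaling volume (m^3)
    print(round(net_volume), round(accretion))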

Full Text

December 19, 2005

1(3): Editorial, by Jim Peterson & Ray Wyatt

The first volume of Applied GIS is now complete. I would like to thank the staff of Monash University ePress, not only for their competence and guidance but also for playing a leading part in evolving strategic approaches to publishing the results of spatial modelling so that the detail can be presented.

Full Text

December 12, 2005

Analysis of pre/post flood bathymetric change using a GIS: A case study from the Gippsland Lakes, Victoria, Australia, by Peter Wheeler

In late June 1998, a damaging flood event occurred at Lakes Entrance (Victoria, Australia), which was partly due to retardation of floodwater flow through the Gippsland Lakes artificial entrance by the level of flood-tide delta sediment accretion in the Entrance and Reeves Channels. Analyses of digitised three-dimensional hydrographic datasets allowed the extent of pre and post-flood event bathymetric change to be visualised and also quantified.

Full Text

November 14, 2005

Land use history of central Luleå: A case study in the use of historical maps together with modern geographic municipal information, by Christian Lundberg & Lynette Peterson

The modern Luleå harbour-side dates back to 1649, when the old city was abandoned because its harbour-side and approaches had become too shallow to be useful. This shallowing, due to glacio-isostatic rebound, also affects the new town, but its results have been mitigated by coastal engineering. As a result of uplift and engineering, former harbour-side land is now far enough from the present shoreline for any maritime artifacts that might lie beneath it to go unsuspected.

Full Text

October 8, 2005

Spatial data integration for classification of 3D point clouds from digital photogrammetry, by Joshphar Kunapo

Under increased urban settlement density, access to a high-resolution (land-parcel scale) bare-earth Digital Elevation Model (DEM) is a pre-requisite for much decision support for planning: stormwater assessment, flood control, 3D visualisation, and automatic delineation of flow paths, sub-watersheds and flow networks for hydrological modelling. In these terms, a range of options faces the DEM-building team. Apart from using necessarily expensive field survey, or out-of-date terrain information (usually in the form of digital contours of less-than-satisfactory interval), the model will be built from point clouds. These will have been assembled via digital photogrammetry or acquisition of LiDAR data. In the first instance, both these data types yield a model known as a digital surface model (DSM). It includes any buildings, vehicles and vegetation (canopy and understory), as well as the “bare ground”. To generate the required bare-earth DEM, ground and non-ground features/data points must be distinguished from each other so that the latter can be eliminated before DEM building. Existing methods for doing this are based on data filtering routines, and are known to produce errors of omission and commission. Moreover, their implementation is complex and time consuming.
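As a deliberately naive illustration of the ground/non-ground separation problem (not the method proposed in the paper, which classifies photogrammetric point clouds), the sketch below flags DSM cells lying close to a local-minimum surface as ground:

    import numpy as np
    from scipy.ndimage import minimum_filter

    def naive_ground_mask(dsm, window=15, tolerance=0.5):
        """Flag cells within `tolerance` metres of the local minimum surface as ground."""
        local_min = minimum_filter(dsm, size=window, mode="nearest")
        return (dsm - local_min) <= tolerance

    dsm = np.random.rand(300, 300) * 2 + 40        # stand-in surface model (metres)
    ground = naive_ground_mask(dsm)
    dem = np.where(ground, dsm, np.nan)            # non-ground cells left for infilling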

Full Text

September 29, 2005

Modelling the driving forces of Sydney’s urban development (1971–1996) in a cellular environment, by Yan Liu & Stuart Phinn

This paper demonstrates a flexible implementation of rules to control the simulation of the urban development of Sydney from 1971 to 1996 using a cellular automata model. Five key factors, including the self-propensity for development and neighbourhood support, slope constraint, transportation support, terrain and coastal proximity attractions, and urban planning support, are introduced into the model in a spatially explicit format, which generated a realistic estimation of the extent and timing of Sydney’s urban development. With the flexibility of rule implementation within the model, more rules can be added as new ‘If-Then’ statements to fine-tune the model, provided that a good understanding of the rule is maintained and accurate data are collected.
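A minimal cellular-automata sketch in the same spirit, with two illustrative ‘If-Then’ rules (a neighbourhood-support rule and a slope constraint); the grids, thresholds and transition probability are invented and not the model's calibrated values.

    import numpy as np
    from scipy.ndimage import uniform_filter

    rng = np.random.default_rng(3)
    urban = rng.random((100, 100)) < 0.02             # seed urban cells
    slope = rng.uniform(0, 30, (100, 100))            # per-cell slope in degrees

    for step in range(25):                            # one step is one notional year
        # Rule 1: neighbourhood support = share of urban cells in a 3x3 window
        support = uniform_filter(urban.astype(float), size=3)
        # Rule 2: slope constraint - no development above 15 degrees
        candidate = (~urban) & (support > 0.3) & (slope < 15)
        # Stochastic transition for candidate cells
        urban = urban | (candidate & (rng.random(urban.shape) < 0.5))

    print("urban cells after 25 steps:", int(urban.sum()))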

Full Text

September 17, 2005

Bathymetric evolution at a coastal inlet after channel-edge groyne emplacement: A case study from the Gippsland Lakes, Victoria, Australia, by Peter Wheeler

Digital capture and analyses of time-series (1941-2005) digital elevation models (DEMs), developed for the Gippsland Lakes artificial entrance area (situated in Victoria, Australia) from analogue hydrographic charts, allows the long-term bathymetric results of rubble training wall (or ‘groyne’) emplacement in the Reeves Channel to be examined. Reeves Channel form has progressively become sinuous, and extensive flood-tide delta shoaling areas have also developed since ‘groyne field’ installation. It is argued that deviance from original Reeves Channel groyne emplacement design (proposed by a Royal Commission report in 1927) may have contributed heavily to the time-series development of Reeves Channel sinuosity and flood-tide delta accretion.

Full Text

September 16, 2005

1(2): Foreword, by Gregoire Dubois

Real-time analysis of data reported by environmental monitoring networks poses a number of interesting challenges, one of which is the handling of point measurements of phenomena that display some spatial continuity. This is the case for many variables, such as atmospheric and aquatic pollutant levels, background radiation levels, rainfall fields, temperature and seismic activity, to name but a few.

Full Text

September 8, 2005

Introduction to the Spatial Interpolation Comparison (SIC) 2004 Exercise and Presentation of the Data sets, by G. Dubois & S. Galmarini

The Spatial Interpolation Comparison (SIC) 2004 exercise was organised during the summer of 2004 to assess the current know-how in the field of “automatic mapping”. The underlying idea was to explore the way algorithms designed for spatial interpolation can automatically generate maps on the basis of information collected regularly by monitoring networks. Participants in this exercise were invited to use some prior information to design their algorithms and to test them by applying the software code to two given datasets. Estimation errors were used to assess the relative performances of the algorithms proposed. Participants were invited not only to minimize estimation errors but also to design the algorithms so as to render them suitable for decision-support systems used in emergency situations. The data used in this exercise were daily mean values of gamma dose rates measured in Germany. This paper presents the exercise and the data used in more detail.

Full Text

September 7, 2005

Using Ordinary Kriging to Model Radioactive Contamination Data, by Elena Savelieva

This paper deals with an application of ordinary kriging (OK) for spatial interpolation of data in a completely automatic (“one-click mapping”) manner. The important set of kriging parameters (semivariogram model, search strategy, etc.) was tuned based on the prior characteristics of the phenomenon considered. The prior information, provided as 10 sets of monitoring observations taken on different days, was used to analyse and model the spatial correlation of the phenomenon. Furthermore, the prior information was expected to be consistent within a rather long time range and therefore assumed to reflect the structure of the contamination pattern on any given day. The approach applied here gave satisfactory results for both routine and emergency data sets. The benefits and drawbacks of the kriging model were well illustrated in the study. Ordinary kriging can be considered a genuine candidate for implementation in a decision support system.
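A compact ordinary-kriging sketch is given below: an exponential semivariogram with assumed parameters (the paper tunes these from prior data), the usual kriging system with a Lagrange multiplier, and a prediction at a single location from toy dose-rate observations.

    import numpy as np

    def exp_variogram(h, nugget=0.0, sill=1.0, rng_=50.0):
        """Exponential semivariogram model (parameters here are assumed, not fitted)."""
        return nugget + sill * (1.0 - np.exp(-h / rng_))

    def ordinary_kriging(xy, z, xy0, **vario):
        """Ordinary kriging prediction at a single location xy0."""
        d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)   # n x n distances
        n = len(z)
        A = np.ones((n + 1, n + 1))
        A[:n, :n] = exp_variogram(d, **vario)
        A[n, n] = 0.0                                  # Lagrange multiplier corner
        b = np.ones(n + 1)
        b[:n] = exp_variogram(np.linalg.norm(xy - xy0, axis=1), **vario)
        w = np.linalg.solve(A, b)                      # weights (sum to 1) plus multiplier
        return w[:n] @ z

    # Toy monitoring data: coordinates (km) and gamma dose rates
    xy = np.array([[0, 0], [10, 0], [0, 10], [10, 10], [5, 2]], dtype=float)
    z = np.array([95.0, 102.0, 98.0, 110.0, 99.0])
    print(ordinary_kriging(xy, z, np.array([4.0, 4.0]), sill=25.0, rng_=30.0))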

Full Text

September 4, 2005

Mapping Radioactivity from Monitoring Data: Automating the Classical Geostatistical Approach, by Edzer J. Pebesma

In the context of a comparison of spatial prediction algorithms, we applied the classical geostatistical approach to see how well it would automate, and how well it performed in the case of an unexpected anomaly. In the test without an anomaly, the method performed well. In the anomaly case, automatic variogram modelling was seriously hindered, and in terms of RMSE the best results were obtained by using the variogram from the test data without the anomaly. Although the 10 days of available training data showed a strong temporally persistent spatial pattern, cokriging did not improve predictions.

Full Text

August 12, 2005

Automatic Mapping in the Presence of Substitutive Errors: A Robust Kriging Approach, by Baptiste Fournier & Reinhard Furrer

Interpolation of a spatially correlated random process is used in many scientific domains. The best unbiased linear predictor (BLUP), often called the kriging predictor in geostatistical science, is sensitive to outliers. The literature contains a few attempts to robustify the kriging predictor; however, none of them is completely satisfactory. In this article, we present a new robust linear predictor for a substitutive error model. First, we derive a BLUP, which is computationally very expensive even for moderate sample sizes. A forward-search-type algorithm is then used to derive the predictor, resulting in a linear likelihood-weighted mean procedure that is robust with respect to substitutive errors. Monte Carlo simulations support the theoretical results. The new predictor is applied to the two SIC2004 data sets and is evaluated with respect to automatic interpolation and monitoring.

Full Text

August 9, 2005

Automatic Mapping of Monitoring Data, by Søren Lophaven, Hans Bruun Nielsen & Jacob Søndergaard

This paper presents an approach, based on universal kriging, for automatic mapping of monitoring data. The performance of the mapping approach is tested on two datasets containing daily mean gamma dose rates in Germany reported by means of the national automatic monitoring network (IMIS). In the second dataset an accidental release of radioactivity in the environment was simulated in the South-Western corner of the monitored area. The approach has a tendency to smooth the actual data values, and therefore it underestimates extreme values, as seen in the second dataset. However, it is capable of identifying a release of radioactivity provided that the number of sampling locations is sufficiently high. Consequently, we believe that a combination of applying the presented mapping approach and the physical knowledge of the transport processes of radioactivity should be used to predict the extreme values.

Full Text

August 1, 2005

Bayesian Automating Fitting Functions for Spatial Predictions, by Monica Palaseanu-Lovejoy

A Bayesian predictive model for automatic mapping of background radiation has the advantage of fully accounting for all uncertainties in the inferred data. Ten training datasets of background radiation were used to set up the model. The model is robust for data containing only close outliers, but it fails to accurately predict values when the input data are contaminated with extreme outliers, which result from a different underlying random process than the background data. For an integrated decision support system for automatic mapping when data contamination is expected, a two-stage approach is required in which background data are modeled with one set of equations and the contaminated data with a different set of equations.

Full Text

July 27, 2005

Fast Spatial Interpolation using Sparse Gaussian Processes, by Ben Ingram, Lehel Csató & David Evans

The estimation of the natural ambient radioactivity in this entry to the Spatial Interpolation Comparison 2004 (SIC2004) uses Gaussian processes (GPs) to predict the underlying dispersal process. GPs enable us to easily predict levels of radioactivity at previously unseen locations, and in addition they allow us to assess the uncertainty in the predicted value. To speed up computation time, which is cubic in the number of examples, a sequential, sparse implementation of Gaussian process inference (SSGP) was used together with a Gaussian observational noise assumption. The examination of the available data led to a covariance function which is a mixture of exponential and squared-exponential functions. The mixture was chosen so that it incorporates the local ambiguity in the data while at the same time capturing the larger-scale variation of the observations.
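The covariance mixture described above can be written as a single small function; the amplitudes and length scales below are placeholders, not the fitted SIC2004 values.

    import numpy as np

    def mixed_covariance(h, a1=1.0, l1=20.0, a2=1.0, l2=100.0):
        """Mixture of an exponential term (local, rough variation) and a
        squared-exponential term (smooth, larger-scale variation)."""
        return a1 * np.exp(-h / l1) + a2 * np.exp(-(h ** 2) / (2 * l2 ** 2))

    h = np.linspace(0, 300, 7)      # separation distances (km), for illustration
    print(mixed_covariance(h).round(3))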

Full Text

July 13, 2005

Interpolation of Radioactivity Data Using Regularized Spline with Tension, by Jaroslav Hofierka

Regularized Spline with Tension was used to interpolate two data sets representing radioactivity measurements at 200 locations. A cross-validation analysis showed that the size of the training data sets was too small to find optimal parameters using the cross-validation procedure. The resulting surfaces were strongly smoothed and less realistic than expected. Therefore, empirical interpolation parameters were used to interpolate the data. Despite the fact that this empirical selection did not produce interpolation results with a lower overall predictive error, it better preserved local fluctuations and anomalies of the phenomenon. The detection of these features is important in radioactivity monitoring and emergency situations. The poor reliability of cross-validation was also confirmed by the evaluation data set. It was concluded that the optimization of interpolation parameters cannot rely on cross-validation when the modeled phenomenon is not sufficiently sampled. The sampling density should be sufficient to represent spatial variations of the phenomenon and, at the same time, allow the optimization of interpolation parameters using automated procedures.

Full Text

July 4, 2005

Automated Mapping using Multilevel B-Splines, by Anatoly A. Saveliev, Andrey V. Romanov & Svetlana S. Mukharamova

This paper describes the application of the multilevel B-spline approximation (MBA) algorithm to the SIC2004 exercise as a high-performance means of automatic mapping in emergency situations. The MBA method was compared with kriging, and various flexible tuning “controls” for the method were proposed and discussed. The automatic outlier detection and delineation techniques considered could be used to process outliers without loss of performance. Prior data were used to adjust method parameters and discover the pattern of spatial correlation. The development of the interpolation algorithm did not assume any information about the phenomenon beyond the values given.

Full Text

June 28, 2005

Spatial Interpolation of Natural Radiation Levels with Prior Information using Back-propagation Artificial Neural Networks, by J.P. Rigol-Sanchez

We propose artificial neural networks (ANNs) as a tool for automatic mapping of daily observations of environmental data. A feed-forward back-propagation neural network for estimating daily natural radiation measurements at unsampled locations using prior information was developed. Feed-forward back-propagation networks were trained to learn: (a) the relationship between daily measurements and their spatial coordinates, and (b) the relationship between daily measurements made at one site and measurements made at the six surrounding closest sites. Results of the study indicate that ANNs can be used for automatic mapping of environmental (background) data with moderate success. ANN models for spatial interpolation can successfully incorporate prior information into the estimation process. However, the ANN approach to automatic mapping of environmental data presented here was clearly inappropriate for dealing with outliers. Results obtained suggest that developing two different models for estimating background values and extreme values, respectively, might be a potentially more successful approach to automatic mapping of environmental data.

Full Text

June 13, 2005

Spatial Prediction of Radioactivity using General Regression Neural Network, by Vadim Timonin & Elena Savelieva

This work describes an application of General Regression Neural Network (GRNN) to spatial predictions of radioactivity. GRNN belongs to a class of neural networks widely used for mapping continuous functions. It is based on a non-parametric (kernel) Parzen-Rosenblatt density estimator. The kernel size is the only tuning parameter, and it allows the user to implement a GRNN in an automatic mode. An important advantage of the GRNN is its very simple and fast training procedure. The most important drawbacks are high smoothing and dependence on the spatial density of the monitoring data set. The current case study is performed on the SIC2004 data sets, and the results obtained here can be compared with those obtained by the other participants using other approaches.
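In essence a GRNN is a kernel-weighted average of the training values (a Nadaraya-Watson estimator with a Gaussian Parzen kernel), with the kernel width as the single tuning parameter. A minimal sketch on toy monitoring data, not the SIC2004 sets:

    import numpy as np

    def grnn_predict(train_xy, train_z, query_xy, sigma=10.0):
        """General Regression Neural Network estimate at each query location."""
        d2 = ((query_xy[:, None, :] - train_xy[None, :, :]) ** 2).sum(axis=2)
        k = np.exp(-d2 / (2.0 * sigma ** 2))          # Parzen (Gaussian) kernel
        return (k @ train_z) / k.sum(axis=1)          # kernel-weighted mean

    xy = np.random.rand(200, 2) * 100                 # toy monitoring locations
    z = np.sin(xy[:, 0] / 20) + 0.1 * np.random.randn(200)
    grid = np.array([[10.0, 10.0], [50.0, 50.0], [90.0, 90.0]])
    print(grnn_predict(xy, z, grid, sigma=8.0))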

Full Text

June 1, 2005

Investigation of Two Neural Network Methods in an Automatic Mapping Exercise, by Sridhar Dutta, Rajive Ganguli & Biswajit Samanta

This paper investigates the performance of two neural network (NN) methods, viz. a radial basis function network (RBFN) and a multilayer feed-forward network (MFFN), in predicting the radioactivity levels at a given test site. A comparative evaluation of the two networks is done using root mean square error (RMSE), Pearson’s r, mean error (ME) and mean absolute error (MAE). It was found that the RBFN performed marginally better than the MFFN.
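The four comparison statistics can be computed together in a few lines; the observation and prediction arrays below are placeholders, not the study's data.

    import numpy as np

    def comparison_stats(obs, pred):
        err = pred - obs
        return {
            "RMSE": np.sqrt(np.mean(err ** 2)),
            "Pearson r": np.corrcoef(obs, pred)[0, 1],
            "ME": np.mean(err),                      # bias
            "MAE": np.mean(np.abs(err)),
        }

    obs = np.array([95.0, 102.0, 98.0, 110.0, 99.0])
    pred = np.array([97.0, 100.0, 99.5, 104.0, 101.0])
    print(comparison_stats(obs, pred))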

Full Text

May 30, 2005

Support Vector Regression for Automated Robust Spatial Mapping of Natural Radioactivity, by Alexei Pozdnoukhov

This paper presents an application of Support Vector Regression method for the prediction of an environmental variable such as the level of natural radioactivity. The basics of the method are described, and some practical considerations are presented, including the meaning of the method’s parameters and their influence on the model. The use of the prior data is discussed. It is shown how to include the information on the variance of the measurements into the model. The use of cross-validation for tuning the parameters of the algorithm is presented. Some ideas for detecting the unusual training samples with SVR are discussed. Generally, the case study illustrates the usefulness of the considered approach for automated spatial mapping tasks in the presence of prior data.

Full Text

May 15, 2005

1(1): Editorial, by Jim Peterson

Applied GIS – another peer-reviewed applied (social and environmental) science research journal to keep up with? Well, yes. However, Applied GIS meets a number of urgent needs in an admittedly crowded science-journal market.

Full Text

May 3, 2005

Harbour-side land parcel land-use changes: Significance in re-development planning permit appraisal in Melbourne, Australia, by Damien Luxford & Shobhit Chandra

Results are presented of spatial queries, at the land-parcel level, of a digital spatial database referring to the twentieth-century harbour-side zone of the Port of Melbourne. They refer mainly to the land-use histories of individual land parcels. Such histories can now be deployed routinely in decision support during assessment of heritage potential, in aid of re-development/excavation permit appraisal.

Full Text

April 30, 2005

Suburban socio-spatial polarisation and house price change in Melbourne: 1986-1996, by Margaret Reynolds & Maryann Wulff

This study examines the process and pattern of spatial polarisation in Melbourne, Australia, between 1986 and 1996. We construct a five-category polarisation typology based on the relative change in the bottom and top ends of local suburban household income distributions. The suburbs are classified as either: increasing advantage, increasing middle income, stable, polarising or increasing disadvantage. The research then examines the relationship between the suburb classification and house prices over the same period. The spatial units are 327 Melbourne suburbs. The two primary data sources are Australian Bureau of Statistics (ABS) household income figures and Victorian state government house sale price data for 1986 and 1996. The maps reveal a contiguous sector of increasing advantage in Melbourne’s inner and nearby eastern suburbs, encircled by an adjacent middle suburban ring characterised by growing disadvantage. Spatially, this picture of polarisation corresponds closely with the map showing median house price change between 1986 and 1996. The polarisation categories are closely related to real quartile house prices, with the highest house price increases in suburbs of increasing advantage and the lowest gains (or declines) in suburbs of increasing disadvantage.

Full Text

April 12, 2005

Improving spatial decision support systems: Methodological developments for natural resources and land management, by Hedia Chakroun & Goze Ber

The use of geographic information systems (GIS) over the past two decades has helped in formulating and solving spatial decision-making problems. In spite of their huge capacities for the acquisition and storage of spatial data, GIS have some limits when it comes to solving the semi-structured problems that represent most real-world spatial decision cases. The improvement of GIS analytic capacities can provide the support required in multiple decision-making phases. We use advanced spatial analysis techniques applied to raster data representing a set of constraints that may be encountered in a land management project. Digital maps and a digital elevation model (DEM) have been used to produce the constraint spatial database for the case study. Each spatial feature was subjected to an evaluation process and assigned a utility value representing its tolerance to the management project according to the constraints identified previously. Results obtained from this methodology have been compared to conventional cases of suitability mapping from the original set of constraint maps. Results show that the suitability maps for the management project derived from this study represent multiple scenarios, leading to the improvement of the design and choice phases of the decision-making process.

Full Text

March 11, 2005

Towards automation of impervious surface mapping using high resolution orthophoto, by Joshphar Kunapo, Pua Tai

Information on the amount and pattern of impervious surface is important for the hydrological modelling of urban areas. As cities expand and/or develop, hydrologic models will become outdated unless information on impervious surfaces is kept up to date. At the moment, mapping teams are faced with choosing from among a range of alternative approaches/tools/software products to achieve this. We report here the results of experiments conducted for a range of mapping approaches applied to high-resolution orthophoto imagery covering part of the residential zone of Monash City, a local government area in Melbourne, Victoria, Australia. The application of the Expert Classification (EC) or Feature Analyst (FA) approaches requires initial human involvement to set the knowledge/learner function, which can then be applied to any areas of similar spectral patterns. The Feature Analyst (FA) approach yielded superior results compared to these ‘pixel-by-pixel’ methods. All of these approaches refer to mapping in aid of distributed and connectivity modelling. In the absence of access to EC or FA tools, application of the (sampling-based) Precision Method (PM), after careful consideration of sampling stratification, will offer total impervious surface data-input estimates for lumped hydrological modelling.
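To illustrate the sampling idea behind a precision-method-style estimate (not the PM procedure itself), the sketch below classifies a stratified random sample of points as impervious or pervious and scales up by stratum area; the strata, areas and sample labels are synthetic.

    import numpy as np

    rng = np.random.default_rng(4)
    # Hypothetical strata (land-use zones) with their areas (ha) and, for each,
    # manually classified sample points (1 = impervious, 0 = pervious).
    strata = {
        "residential": {"area_ha": 820.0, "samples": rng.integers(0, 2, 200)},
        "commercial":  {"area_ha": 140.0, "samples": (rng.random(100) < 0.8).astype(int)},
        "parkland":    {"area_ha": 260.0, "samples": (rng.random(100) < 0.05).astype(int)},
    }

    total_area = sum(s["area_ha"] for s in strata.values())
    impervious_area = sum(s["area_ha"] * s["samples"].mean() for s in strata.values())
    print(f"total impervious fraction ~ {impervious_area / total_area:.2f}")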

Full Text