UNIT - 4
DATA QUALITY AND STANDARDS
4.1. VECTOR DATA ANALYSIS TOOLS
A systematic examination of a problem or complex entity in order to provide new
information from what is already known (ESRI – GIS Dictionary).
Spatial analysis is the process that turns raw spatial data into value-added, useful data
or information that ultimately supports decision making and reveals hidden or unknown patterns.
Common vector analysis tools are:
• Buffering
• Overlay
• Other feature manipulation tools
Buffering
• Buffer – a region that lies within a specified distance of one or more spatial features
• Points, lines and polygons can all be buffered; buffers are used to examine proximity constraints
Example:
• Identify potential customers within 3km of store
• Identify parks within 10km of Islamabad Highway
• Identify schools within 5 km of industrial zone
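As an illustrative sketch (not tied to any particular GIS package), selecting the points that fall inside a circular buffer reduces to a distance test; the store and customer coordinates and the 3 km threshold below are hypothetical:

```python
import math

def within_buffer(center, point, distance):
    """Return True if point lies inside a circular buffer around center."""
    dx = point[0] - center[0]
    dy = point[1] - center[1]
    return math.hypot(dx, dy) <= distance

# Hypothetical store location and customer coordinates (in km)
store = (0.0, 0.0)
customers = [(1.0, 2.0), (2.5, 2.5), (0.5, -1.0), (4.0, 0.0)]

# Identify potential customers within 3 km of the store
nearby = [c for c in customers if within_buffer(store, c, 3.0)]
print(nearby)  # the first and third customers qualify
```

The same test, applied point-by-point along a line or polygon boundary, underlies buffering of the other feature types.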
4.1.1. Buffer Analysis:
• A buffer is a zone of a given width created around a spatial feature and is measured in
units of distance from the feature. The generated buffer takes the shape of the feature.
• For a point the buffer is a circle (refer Figure 4.1 (a)) with a radius equal to the
buffer distance. For a line (refer Figure 4.1 (b)) it is a band, and for a polygon it
is a belt of the specified buffer distance surrounding the polygon, measured from its
edge.
• The inward buffer for a polygon is called a setback (refer Figure 4.1 (c), the polygon on
the right-hand side).
• Buffering is used for neighborhood analysis which aims to evaluate the characteristics
of the area surrounding the spatial feature.
Figure 4.1. Buffer Analysis Diagram
• Common examples of buffering include the identification of properties within a
certain distance of an object, delineation of areas around natural features where human
activities are restricted, and determination of the areas affected by a given location.
• Clip is used to subset a point, line or polygon theme using another polygon theme as
the boundary of the area of interest.
• In the illustration (Figure 4.2), the input point feature class shows the location of
drinking water wells in three villages.
• To know how many wells fall in village 1, the input feature class is clipped using the
boundary of village 1. The output feature class shows that five wells are present in
village 1.
Figure 4.2. Drinking Water Wells in Three Villages
• Split divides the input features into multiple output feature classes. The
split field’s unique values form the names of the output feature classes.
Figure 4.3. Polygon Theme of Watershed Boundaries
• In the illustration above, a point theme of wells is split using the polygon theme of
watershed boundaries shown in Figure 4.3.
• The output of this operation contains multiple feature classes, which are named after the
unique values of the watershed boundaries (in this case the watershed numbers WS1,
WS2, etc.).
• Each output class contains the wells present in a particular watershed; e.g.,
WS1 (watershed 1) has three wells, while WS2, WS3 and WS4 have 3, 2 and 2
wells respectively.
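Clipping a point theme against a polygon boundary boils down to a point-in-polygon test. The sketch below uses the standard ray-casting algorithm; the village boundary and well coordinates are hypothetical:

```python
def point_in_polygon(point, polygon):
    """Ray-casting test: a point is inside if a ray from it crosses the
    polygon's edges an odd number of times."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where the edge crosses the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical village boundary (a square) and well locations
village1 = [(0, 0), (10, 0), (10, 10), (0, 10)]
wells = [(1, 1), (5, 5), (9, 2), (12, 3), (4, 8), (7, 7), (-2, 5)]

# Clip: keep only the wells that fall inside village 1
clipped = [w for w in wells if point_in_polygon(w, village1)]
print(len(clipped))  # 5 wells inside village 1
```

A Split operation repeats the same test against each watershed polygon in turn, writing one output class per polygon.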
4.1.2. Overlay Analysis
• Union creates a new theme by overlaying two polygon themes. It is equivalent to the
Boolean ‘or’ operator. The output theme contains the combined polygons and attributes of
both themes. Only polygon themes can be combined using union, as shown in Figure
4.4.
Figure 4.4. Overlay Analysis – Union
• Let us say we are interested in identifying the zones with no potential for urban
development. It is clear that no construction can be done on a water body or on land
covered by agriculture or forest. So, the union of the areas under water, agriculture
and forest gives us the area having no potential for urban development.
• Intersect creates a new theme by overlaying a point, line or polygon theme with an
intersecting polygon theme. It is equivalent to the Boolean ‘and’ operator. The output
theme contains only the features inside the intersecting polygons.
Figure 4.5. Overlay Analysis – Intersect
• From the same example given above, if we try to find the area having potential for
urban development, we need to intersect the polygon themes to get the common area
which is not under water, agriculture or forest, as shown in Figure 4.5.
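Union and intersect, being Boolean ‘or’ and ‘and’ operations, can be sketched in a raster-like fashion by treating each theme as the set of grid-cell IDs it covers; the cell numbering below is hypothetical:

```python
# Hypothetical sets of grid-cell IDs covered by each land-cover theme
water = {1, 2, 3}
agriculture = {3, 4, 5}
forest = {5, 6}
study_area = set(range(1, 11))  # cells 1..10

# Union ('or'): cells with no potential for urban development
no_potential = water | agriculture | forest

# The remaining cells have potential for urban development
potential = study_area - no_potential
print(sorted(potential))  # [7, 8, 9, 10]

# Equivalently, intersecting ('and') the complements of the three
# themes yields the same developable area (De Morgan's law)
assert potential == (study_area - water) & (study_area - agriculture) & (study_area - forest)
```

Real overlay tools also merge the attribute tables of the input themes, which this set-based sketch omits.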
4.1.3. Feature Manipulation
Other common feature manipulation tools include:
• Dissolve
• Append
• Select
• Clip
• Erase
• Split
4.2. DATA ANALYSIS TOOLS
• The Analysis toolbox contains a powerful set of tools that perform the most
fundamental GIS operations. With the tools in this toolbox, you can perform overlays,
create buffers, calculate statistics, perform proximity analysis, and much more.
Whenever you need to solve a spatial or statistical problem, you should always look in
the Analysis toolbox.
• The Analysis toolbox has four toolsets. Each toolset performs specific GIS analysis of
feature data.
The toolsets and their descriptions are:
• Extract – GIS datasets often contain more data than you need. The Extract tools let you
select features and attributes in a feature class or table based on a query (SQL
expression) or spatial extraction. The output features and attributes are stored in a
feature class or table.
• Overlay – The Overlay toolset contains tools to overlay multiple feature classes to combine,
erase, modify, or update spatial features, resulting in a new feature class. New
information is created when overlaying one set of features with another. There are
six types of overlay operations; all involve joining two existing sets of features
into a single set of features to identify spatial relationships between the input
features.
• Proximity – The Proximity toolset contains tools that are used to determine the proximity of
features within one or more feature classes or between two feature classes. These
tools can identify features that are closest to one another or calculate the distances
between or around them.
• Statistics – The Statistics toolset contains tools that perform standard statistical analysis
(such as mean, minimum, maximum, and standard deviation) on attribute data, as
well as tools that calculate area, length, and count statistics for overlapping and
neighboring features.
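The kind of attribute statistics the Statistics toolset computes (mean, minimum, maximum, standard deviation) can be sketched with Python's standard library; the parcel areas below are made-up values:

```python
import statistics

# Hypothetical attribute values: parcel areas in hectares
areas = [12.5, 8.0, 15.2, 9.3, 11.0]

summary = {
    "mean": statistics.mean(areas),
    "min": min(areas),
    "max": max(areas),
    "stdev": statistics.stdev(areas),  # sample standard deviation
}
print(summary["mean"])  # 11.2
```

In a GIS these summaries would typically be grouped by a case field (e.g., per land-use class) rather than computed over the whole table.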
4.3. NETWORK ANALYSIS
It is a type of line analysis which involves a set of interconnected lines. Railways, highways,
transportation routes, rivers, etc. are examples of networks. Network analysis is used to find the
shortest or alternative routes between an origin and a destination. Network Analyst provides
network-based spatial analysis tools for solving complex routing problems.
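Shortest-route problems on a network are classically solved with Dijkstra's algorithm. The sketch below runs it on a small hypothetical road network, with edge weights as travel distances:

```python
import heapq

def dijkstra(graph, origin, destination):
    """Return (distance, path) for the shortest route in a weighted graph."""
    queue = [(0, origin, [origin])]  # (distance so far, node, path taken)
    visited = set()
    while queue:
        dist, node, path = heapq.heappop(queue)
        if node == destination:
            return dist, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, weight in graph.get(node, []):
            if neighbor not in visited:
                heapq.heappush(queue, (dist + weight, neighbor, path + [neighbor]))
    return float("inf"), []

# Hypothetical road network: node -> [(neighbor, distance in km), ...]
roads = {
    "A": [("B", 4), ("C", 2)],
    "B": [("D", 5)],
    "C": [("B", 1), ("D", 8)],
    "D": [],
}

distance, route = dijkstra(roads, "A", "D")
print(distance, route)  # 8 ['A', 'C', 'B', 'D']
```

Production network solvers add turn restrictions, one-way streets and time-dependent costs on top of this basic idea.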
Modeling in GIS – Highway Alignment Studies:
4.3.1. Highway Alignment
The position or the layout of the centre line of the highway on the ground is called the
alignment. The horizontal alignment includes the straight path, the horizontal deviations and
curves. Changes in gradient and vertical curves are covered under the vertical alignment of roads.
A new road should be aligned very carefully as improper alignment would result in one or
more of the following disadvantages:
1) Increase in construction cost
2) Increase in maintenance cost
3) Increase in vehicle operation cost
4) Increase in accident rate.
The basic requirements of an ideal alignment between two terminal stations are that it
should be:
1) Short 2) Easy 3) Safe 4) Economical
4.3.2. Factor affecting Highway Alignment:
The various factors which control the highway alignment in general may be listed as:
1) Obligatory points
2) Traffic
3) Geometric design
4) Economics
5) Other considerations
In hill roads, additional care has to be given to:
6) Stability
7) Drainage
4.3.3. Stages of New Highway Project:
1) Selection of route, finalization of highway alignment and geometric design details.
2) Collection of materials and testing of subgrade soil and other construction material,
mix design of pavement materials and design details of pavement layer.
3) Construction stages including quality control.
4.3.4. Steps Involved in a New Highway Project:
1) Map study
2) Reconnaissance Survey
3) Preliminary survey
4) Location of Final Alignment
5) Detailed survey
6) Material survey
7) Design
8) Earth work
9) Pavement Construction
10) Construction
4.3.5. Need of Study
• The conventional method of highway alignment is a tedious and time-consuming process.
• The conventional highway alignment needs a lot of manual work and is expensive.
• Remote sensing and Geographical Information Systems make highway alignment
easier: it needs less manpower, is less time-consuming and is economical.
4.3.6. Objectives of study
The objectives of the present study are as follows:
• To identify the factors that influence highway alignment studies
• To prepare the thematic layers based on the identified factors
• To analyze the traffic volume and future expansion
• To identify the favorable route for highway alignment
4.3.7. Methodology
• The base (study area) map, Drainage, Slope and Contour maps were prepared with the
help of SOI Topo-sheet (on 1:50,000 scale).
• IRS LISS III satellite data was used, and by applying Digital Image Processing techniques,
thematic maps such as geomorphology and land use/land cover were generated, as
shown in Figure 5.22.
• The DEM is used to understand the terrain conditions, environmental factors
and socio-economic status of the study area.
• The factors considered are mainly related to the land use, geology, land value and soil.
The weights and ranks are assigned to each of the above themes, according to expert
opinions, for GIS analysis. After assigning weights and ranks these themes are
overlaid to get an overlaid map.
• Finally, possible/feasible route was identified based on various physical and cultural
parameters and their inherent properties.
• The cost reduction analysis was also done for substantiating the formation of highway.
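The weight-and-rank overlay described above can be sketched as a weighted sum of thematic scores per candidate cell; the themes, weights and ranks here are hypothetical stand-ins for the expert-assigned values:

```python
# Hypothetical expert-assigned weights for each theme (they sum to 1)
weights = {"land_use": 0.4, "geology": 0.3, "land_value": 0.2, "soil": 0.1}

# Hypothetical ranks (1-5, higher = more favorable) for two candidate cells
cells = {
    "cell_1": {"land_use": 5, "geology": 4, "land_value": 3, "soil": 4},
    "cell_2": {"land_use": 2, "geology": 5, "land_value": 4, "soil": 3},
}

def suitability(ranks, weights):
    """Weighted-overlay score: sum of weight * rank over all themes."""
    return sum(weights[theme] * rank for theme, rank in ranks.items())

scores = {name: suitability(ranks, weights) for name, ranks in cells.items()}
best = max(scores, key=scores.get)
print(best)  # cell_1 is the more favorable cell
```

In the actual study, a least-cost path traced through such per-cell scores would yield the feasible route.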
Figure 5.22. Methodology
• The main purposes of traffic surveys are traffic monitoring, traffic control and
management, traffic enforcement, traffic forecasting, model calibration and validation,
etc.
• The purposes of carrying out traffic volume counts are design, improvement of the
traffic system, planning and management.
• The traffic volume count study is carried out to get the following useful information:
• Magnitudes, classifications and the time and directional split of vehicular flows
• Proportion of vehicles in the traffic stream
• Hourly, daily, yearly and seasonal variation of vehicular flows
• Flow fluctuation on different approaches at a junction or different parts of a road
network system.
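Deriving hourly variation and directional split from raw count records can be sketched with the standard library; the hours and directions below are invented sample data:

```python
from collections import Counter

# Hypothetical count records: (hour of day, direction) per vehicle observed
records = [
    (8, "NB"), (8, "NB"), (8, "SB"), (9, "NB"), (9, "SB"),
    (9, "SB"), (9, "NB"), (17, "SB"), (17, "SB"), (17, "NB"),
]

hourly = Counter(hour for hour, _ in records)          # hourly variation
directional = Counter(direction for _, direction in records)  # directional split

peak_hour = max(hourly, key=hourly.get)
print(peak_hour, hourly[peak_hour])          # 9 4
print(directional["NB"], directional["SB"])  # 5 5
```

Grouping the same records by day or month would yield the daily and seasonal variations in the same way.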
4.3.8. Conclusion
• The purpose of this study was to develop a tool to locate a suitable, less time-consuming,
shortest route between two points.
• The GIS approach, using ground parameters and spatial analysis, provided the means to
achieve this goal. Raster-based map analysis provides a wealth of capabilities for
incorporating terrain information surrounding linear infrastructure.
• Costs derived from terrain, geomorphology, land use, drainage and elevation were used
to determine the shortest routes for the study area.
• Results indicate that the route designed by applying the GIS method avoids
traffic problems, is less time-consuming, more environmentally friendly and cheaper.
• The proposed shortest route provides traffic-free, pollution-free and risk-free operation
for vehicles travelling from Chettikullam to Kottar.
• Time and fuel consumption will also be reduced considerably. The GIS method can also
be used for route determination for irrigation and drainage channels, power lines and
railways.
4.4. TWO MODELS OF DIGITAL EDUCATION
From the introduction of the World Wide Web in 1993 the young of the world have
experienced two models of digital education, that outside the school walls and that within.
Outside the young and the digitally connected families of the world employed – unseen –
the naturally evolving laissez faire model. Within the school the young worked within the
traditional, highly structured model.
It is time the difference is understood, the global success and benefits of the laissez faire
recognised and lauded, and the serious shortcomings of the highly structured understood and
addressed.
For much of the period the two models ran in parallel, with most schools showing little or
no interest in the out of school digital education.
Around 2010 – 2012 the scene began to change when a handful of digitally mature schools
began genuinely collaborating with their families in the 24/7/365 digital education of the children.
Those schools had reached the evolutionary stage where their teaching model and culture closely
mirrored that of the families. They revealed what was possible with collaboration.
That said it took time for that collaboration to take hold more widely and for the most part
the parallel models continue in operation today, with the difference between the in and out of
school teaching growing at pace.
It is surely time for schools and government to question the retention of the parallel modes
and to ask if taxpayers are getting value for the millions upon millions spent solely on schools
when the digitally connected families receive no support.
Might it be time to employ a more collaborative approach where the schools complement
and add value to the contribution of the families?
Without going into detail, it bears reflecting on the distinguishing features of the learning
environment and digital education model, of both the digitally connected family and the school,
and asking what is the best way forward.
4.4.1. The learning environments.
Digitally connected families
That of the families we know well. It has been built around the home’s warmth and
support, and the priority the parents attached to their children having a digital education that
would improve their education and life chances. The focus has always been on the child – the
individual learner – with the children from the outset being provided the current technology by
their family and empowered to use that technology largely unfettered.
Importantly the family, as a small regulating unit with direct responsibility for a small
number of children, could readily trust each child, and monitor, guide and value their learning
from birth onwards, helping to ensure each child had use of the current technology and that the
use was wise and balanced.
The learning occurred within a freewheeling, dynamic, market driven, naturally evolving
environment, anywhere, anytime, just in time and invariably in context. Those interested could
operate at the cutting edge and the depth desired.
Very early on the young’s use of the digital was normalized, with the learning occurring as
a natural part of life, totally integrated, with no regard for boundaries.
The time available to the digitally connected family was – and continues to be – at least
four/five times greater than that in the school.
It was to many seemingly chaotic, but also naturally evolving.
Very quickly the family learning environment became collaborative, socially networked,
global in its outlook, highly enjoyable and creative where the young believed anything was
possible.
By the late 2000s most families had created – largely unwittingly – their own
increasingly integrated and sophisticated digital ecosystem, operating in the main on the personal
mobile devices that connected all in the family to all manner of other ecosystems globally.
4.4.2. Digital learning in the school.
The general feature of the school digital learning environment has been invariably one of
unilateral control, where the ICT experts controlled every facet of the technology and its teaching.
They chose, configured and controlled the use of both the hardware and software,
invariably opting for one device, one operating system and a standard suite of applications.
The students were taught within class groups, using highly structured, sequential, teacher
directed, regularly assessed instructional programs.
The school knew best. The clients – the parents and students – were expected to acquiesce.
There was little or no recognition of the out of school learning or technology or desire to
collaborate with the digitally connected families.
The teaching was insular, inward looking, highly site fixated.
In reflecting on schools’ teaching with the digital between 1993 and 2016, there was an all-
pervasive sense of constancy and continuity, with no real rush to change. There was little sense
that the schools were readying the total student body to thrive within a rapidly evolving digitally
based world.
Significantly by 2016 only a relatively small proportion of schools globally were
operating as mature digital organizations, growing increasingly integrated, powerful higher order
digitally based ecosystems.
The reality was that while the learning environment of the digitally connected families
evolved naturally at pace that of most schools changed only little, with most schools struggling to
accommodate rapid digital evolution and transformation.
4.4.3. The teaching models
With the advantage of hindsight, it is quite remarkable how hidden the laissez faire model
has remained for twenty plus years, bearing in mind it has been employed globally since the
advent of the WWW.
For years, it was seen simply as a different, largely chaotic approach used by the kids –
with the focus being on the technological breakthroughs and the changing practices rather than on
the underlying model of learning that was being employed.
It wasn’t until the authors identified and documented the lead role of the digitally
connected families of the world that we appreciated all were using basically the same learning
approach. The pre-primary developments of the last few years affirmed the global application of
the model.
We saw at play a natural model that was embraced by the diverse families of the world.
All were using the same model – a naturally evolving model where the parents were
‘letting things take their own course’ (OED).
The learning was highly individualized, with no controls other than the occasional parent
nudge. That said the learning was simultaneously highly collegial, with the young calling upon
and collaborating with their siblings, family members, peers and social networks when desired.
Interestingly from early on the young found themselves often knowing more about the
technology in some areas than their elders – experiencing what Tapscott (1998) termed an
‘inverted authority’ – being able to assist them use the technology.
Each child was free to learn how to use and apply those aspects of the desired
technologies they wanted, and to draw upon any resources or people as needed.
In the process the children worldwide – from as young as two – directed their own
learning, opting usually for a discovery based approach, where the learning occurred anytime,
anywhere 24/7/365. Most of the learning was just in time, done in context and was current,
relevant, highly appealing and intrinsically motivating. Invariably it was highly integrated, with
no thought given to old boundaries – like was it educational, entertainment, communication, social
science or history.
In contrast the school digital teaching model has always been highly structured and
focused on what the school or education authority ‘experts’ believed to be appropriate.
Throughout the period the teaching has been unilaterally controlled, directed by the
classroom teacher, with the students disempowered, distrusted and obliged to do as told.
The teaching built upon linear, sequential instructional programs where the digital
education was invariably treated like all other subjects, shoehorned into an already crowded
curriculum and continually assessed. Some authorities made the ‘subject’ compulsory, others
made it optional.
The focus – in keeping with the other ‘subjects’ in the curriculum – was academic. There
was little interest in providing the young the digital understanding for everyday life.
The teaching took place within a cyber walled community, at the time determined by the
teaching program.
Increasingly the course taught and assessed became dated and irrelevant.
In considering why the young and the digitally connected families of the world have
embraced the laissez faire model of digital education, aside from the young’s innate curiosity and
desire to learn, we might do well to examine the model of digital learning we have used over the
last twenty plus years and reflect on how closely it approximates that adopted by the young.
Might they be following that ancient practice of modelling the behavior of their parents?
4.4.4. The way forward
Nearly a quarter of a century on from the introduction of the WWW, and in an era of profound
technological and social change, it is surely time for governments and educators globally to
• Publicly recognise the remarkable success of the digitally connected families and the
laissez faire teaching model in the 24/7/365 digital education of both the children and
the wider family
• Understand the digitally connected families are on trend to play an even greater lead
role
• Identify how best to support the family’s efforts without damaging the very successful
teaching model employed
• Consider how best to enhance the educational contribution of all the digitally
connected families in the nation, including the educationally disadvantaged
• Rethink the existing, somewhat questionable contribution of most schools and the
concept of schools as the sole provider of digital education for the young
• Examine where scarce taxpayer monies can best be used to improve the digital
education in the networked world.
Let us all finally recognize the core qualities and the remarkable global success of the
laissez faire digital education model and build upon its achievements.
4.5. 3D DATA COLLECTION AND UTILIZATION
Geographic Information Science (GIS) offers powerful tools for performing detailed
analysis of spatial information and solving complex problems. Traditional GIS data is based on
mapping in two dimensions, an x- and a y-value, which can be limiting in some applications.
Utilizing 3D GIS software lets users engage with data from a whole new perspective that results in
more nuanced insights and detailed visualizations.
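The limitation of purely two-dimensional mapping can be sketched by comparing the planimetric (2D) distance between two surveyed points with their true (3D) distance once elevation is included; the coordinates below are hypothetical:

```python
import math

# Hypothetical survey points: (x, y, z) in metres, z = elevation
a = (0.0, 0.0, 100.0)
b = (300.0, 400.0, 220.0)

# 2D (planimetric) distance ignores elevation
d2 = math.hypot(b[0] - a[0], b[1] - a[1])

# 3D distance includes the elevation difference
d3 = math.sqrt((b[0] - a[0]) ** 2 + (b[1] - a[1]) ** 2 + (b[2] - a[2]) ** 2)

print(d2, d3)  # 500.0 versus roughly 514.2: the 2D map understates the distance
```

Surface lengths, slopes and visibility analyses all depend on this extra z term in the same way.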
• 3D GIS brings enhanced depth into data collection and analysis by incorporating a
z-value into mapping. Most commonly, that means including elevation data, but users
have many options for adding layers of information. For instance, a map might include
a dimension based on the concentrations of certain chemicals and minerals or which
parcels of land are best suited for development. Working with three dimensions, GIS
professionals can often apply their findings to address real-world issues with greater
accuracy.
• While 3D models are more difficult to create and maintain than 2D ones, there are
myriad 3D GIS applications where this technology is greatly beneficial. These four
examples demonstrate how an investment in 3D GIS modeling can generate added
value:
4.5.1. City Planning
Cities have a way of growing to encompass previously undeveloped or underdeveloped areas in a
process often called urbanization or urban sprawl. There are many reasons behind urban sprawl,
including a desire to build improved infrastructure, affordable land or tax rates, or overcrowding
inside the city. Urban sprawl can have a major impact on people who decide to leave the city as
well as those who remain. For example, as residents move farther away from the city center,
infrastructure such as roads or public transportation systems must accommodate their commutes,
and traffic can lead to higher rates of air pollution.
To minimize the negative impacts of urban sprawl and increased development, it’s
important for city planners to carefully determine the best way to grow urban areas. Urban
development needs to take into consideration today’s requirements, potential changes in demand
and the long-term effects of building upward and outward.
3D GIS software can help city planners visualize what their proposed changes will look
like and predict the outcomes for current residents and future generations. One example was the
2012 revitalization of the Mulheim Sud district in Cologne, Germany, located on the Rhine River.
The project set out to make the district, which included a mix of residential, commercial and
industrial buildings, more environmentally friendly over the course of two decades.
A 3D model spotlighted building information, aerial photos, energy performance, air
pollution, lidar elevation data, noise and traffic. The wide range of integrated information allowed
architects, planning engineers and others to collaborate effectively. As the district develops
further, the 3D model will help future planners with energy and environmental modeling and
guide public participation initiatives.
4.5.2. Building Information Modeling
Building information modeling (BIM) is a technology that generates digital
representations of facilities and relevant processes. BIM has given facilities managers the ability
to closely review structures, beginning with the construction planning phase.
Used in conjunction with 3D GIS data, BIM can help create robust building management
plans and allow for more detailed analysis. For example, before breaking ground on a construction
project, stakeholders can review findings from GIS and BIM to draw conclusions about
environmental impact, sustainability, disaster readiness and how to optimize the use of assets and
space.
BIM and 3D GIS can also come together to support the preservation and restoration of
historical buildings. An effort to digitally record cultural heritage sites in Dublin, Ireland, drew on
Historic Building Information Modeling (HBIM) and 3D GIS to document and analyze selected
locations.
One project focused on restoring buildings along Henrietta Street, which dates its earliest
construction to 1730. By the 21st century, the street was lined with buildings in serious need of
care. HBIM technology made it possible to map the extent of the damage and visualize what the
area looked like when new. Researchers employed GIS tools to note attributes of individual
buildings like the years of construction and address information. In the process, they developed a
store of information that could be used to generate in-depth visualizations or guide tourists.
4.5.3. Coastal Modeling and Analysis
A nation’s coastline is a crucial gateway for imports and exports, and about 40 percent of
the world’s population lives within 60 miles of a coast. But these areas also pose numerous
challenges for development.
It’s critical for planners to understand the factors that affect construction and maintenance
of shipping ports, fisheries, mineral mining operations and wilderness preservation areas.
Responsible coastal development must be informed by underwater topography, local vegetation
and predictions for the long-term environmental impact.
Resource planning systems that draw on GIS can provide insights into the economic,
environmental and cultural results of activities along the coast. The right data makes all the
difference in sustainably performing operations like construction or excavation. When preparing
for the extraction of resources on the coastline, organizations benefit from synthesizing
information like:
• Findings from 3D GIS mapping that suggest the likely outcomes of dredging material
in the water
• Lidar topographical surveys
• Data sets from past extraction activities
• Trends in coastal change
4.5.4. Wind Farm Assessment
Planning a wind farm requires a detailed analysis of an environment and the potential
effects of the structures. By using 3D GIS modeling, planners and other stakeholders can get a
better idea of the impact from wind farm development on wildlife and people.
For example, when assessing possible wind farm locations in two dimensions, a bird’s
migratory path might make a location seem inaccessible. However, reviewing that same space
using 3D GIS data may reveal that the elevation of birds’ flight paths and the height of the wind
farm are compatible.
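The bird-migration example amounts to checking whether two vertical intervals overlap: the turbines' rotor-swept heights and the flight-path elevations. All numbers below are hypothetical:

```python
def intervals_overlap(a, b):
    """True if closed intervals a = (low, high) and b = (low, high) overlap."""
    return a[0] <= b[1] and b[0] <= a[1]

# Hypothetical vertical extents above ground level, in metres
turbine_sweep = (40, 150)  # rotor-swept zone of the wind farm
flight_path = (300, 600)   # observed elevation band of migrating birds

# In 2D the footprints coincide, but in 3D the elevations may not conflict
conflict = intervals_overlap(turbine_sweep, flight_path)
print(conflict)  # False: the site may be compatible after all
```

A real assessment would test this per location along the migratory route, since flight elevations vary with terrain.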
In Switzerland, developers wanted to find a way to accurately determine the noise that
would be generated from the installation of a new wind farm. A team developing a visual-acoustic
simulation tool decided to study Mont Crosin in the Canton of Bern, Switzerland, which is home
to 16 wind turbines. The researchers analyzed recordings taken on days with varying wind and
weather conditions and wind speed measurements taken with a 3D ultrasonic anemometer. They
generated 3D models representing vegetation, infrastructure and the wind turbines themselves.
The data allowed planners to predict the noise and environmental impact that would be produced
by the proposed wind farm.