Thursday, April 4, 2019
The Map Generalization Capabilities of ArcGIS Information Technology Essay
The amount of information associated with Geographical Information Systems is enormous, and the information needed from this data varies for different applications. Specific details can be extracted: for instance, resolution reduced, contours reduced, data redundancy eliminated, or the features on a map that a particular application needs retained. This is all aimed at reducing storage space and fitting details from a map at a larger scale accurately onto another at a much smaller scale. This paper presents a framework for the Map Generalization tools embedded in ArcGIS (a Geographical Information Systems package by ESRI), as well as the algorithm each tool uses. Finally, a review of all the tools indicates which is more efficient after thorough analysis of the algorithm used and the output result produced.

1.0 Introduction

1.1 Definition of Map Generalization

As Goodchild (1991) points out, Map Generalization is the ability to alter and show spatial features, with the locations and relationships attached to them, as they are seen on the earth's surface when modelled into a map. The advantages involved in adopting this process cannot be overemphasized. Some are itemized below (Lima d'Alge, 1998):

It reduces complexity and the rigours Manual Cartographic Generalization goes through.
It conveys information accurately.
It conserves the spatial accuracy drawn from the earth's surface when modelling.

A lot of software vendors came up with solutions to tackle the problem of manual cartography, and this report will be reflecting on the ArcGIS 9.3 Map Generalization tools.

1.2 Reasons for Automated Map Generalization

In times past, to achieve this level of precision, the services of a skilled cartographer were needed. He was faced with the task of modelling the representation of features on the earth's surface from a large-scale map onto a smaller-scale map. This form of manual cartography is very laborious because it consumes a lot of time, and a lot of expertise is needed, since the cartographer must inevitably draw all the features, represent them in a smaller form, and take into consideration the level of precision required so as not to render the data/graphical representation invalid.

These setbacks were the motivating factor for the introduction of automatic cartographic design, now known as Automated Map Generalization. A crucial part of map generalization is information abstraction, not necessarily data compression. A good generalization technique should be intelligent, taking into consideration the characteristics of the shape and not just the ideal geometric properties (Tinghua, 2004). Several algorithms (sets of instructions taken to achieve a programming result) have been developed to enable this, and this report critically explores each of them.

1.3 Process of Automated Map Generalization
As Brassel and Weibel (n.d.) describe, Map Generalization can be grouped into five steps:

Structure Recognition
Process Recognition
Process Modelling
Process Execution
Display

The step that will be elaborated upon for the purpose of this report is Process Recognition - the types of generalization procedures, which involve different manipulations of geometry in order to simplify the shape and represent it on a smaller scale (Shea and McMaster, 1989).

2.0 Generalization Tools in ArcGIS 9.3

2.1 Smooth Polygon

This is a tool used for cartographic design in ArcGIS 9.3. It involves dividing the polygon into several vertices, each vertex being smoothed when the action is performed (FreePatentsOnline, 2004-2010). An experiment is illustrated below to show how Smooth Polygon works.

Add the layer file Polygon, which has the attribute name Huntingdonshire - a district extracted from the England_dt_2001 area shapefile that was downloaded from UKBorders. The next step was to select ArcToolbox on the standard toolbar of ArcMap, go to Generalization Tools under Data Management Tools, and then click on Smooth Polygon. Open Smooth Polygon, select the input feature (the polygon to be smoothed), in this case Polygon; select the output feature class (the file location where the result is to be saved); select the smoothing algorithm (PAEK); and select the smoothing tolerance.

Fig 2.0 Display before Smooth Polygon    Fig 2.1 Display after Smooth Polygon

The table below shows the output when Polynomial Approximation with Exponential Kernel (PAEK) (Bodansky et al., 2002) was used. The other algorithm that can be applied for this procedure is Bezier Interpolation.

Algorithm Type | Smoothing Tolerance (km) | Time Taken (secs)
PAEK | 4 | 1
Bezier Interpolation | - | 112

Observation

PAEK Algorithm: When this technique was used, as the smoothing tolerance value was increased, the weight of each point in the image decreased and the image was smoothed more. Also, the output curves generated do not pass through the input line vertices; however, the endpoints are retained. A major shortcoming of the PAEK algorithm is that, in a bid to smooth some rough edges, it eliminates important boundaries; to refrain from such an occurrence, a buffer of definite width is to be applied to a zone before allowing the PAEK smoothing algorithm to execute (Amelinckx, 2007).

Bezier Interpolation: This is the other algorithm that can be applied to achieve the smoothing technique on polygons. In this case, the parameters are the same as PAEK's, except that the tolerance value is greyed out - no value is to be input - and as a result the output image produced is identical to its source, because the tolerance value is responsible for smoothing rough edges: the higher the value stated, the more the polygon is smoothed. The output curves pass through the input line vertices. When this experiment was performed, it was noticed that the curves were properly aligned around the vertices.

Conclusion: After performing both experiments, it was observed that the PAEK algorithm is better because it allows a tolerance value to be input, which in turn gives a more smoothed image around curves; this will be of more importance to cartographers who want to smooth their image and avoid redundant points.
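The same Smooth Polygon run can also be scripted rather than driven through the ArcToolbox dialog. The sketch below is a minimal example, assuming an arcpy environment from ArcGIS 10 or later (in 9.3 the equivalent calls go through the older geoprocessor scripting interface); the workspace path and output names are hypothetical placeholders.

```python
# Minimal sketch: scripting Smooth Polygon with both algorithms.
# Assumes ArcGIS 10+ with arcpy; workspace path and output names are hypothetical.
import arcpy

arcpy.env.workspace = r"C:\data\generalization.gdb"  # hypothetical workspace
arcpy.env.overwriteOutput = True

# PAEK: the tolerance controls how aggressively curves are smoothed.
arcpy.cartography.SmoothPolygon("Polygon", "Polygon_paek",
                                "PAEK", "4 Kilometers")

# Bezier Interpolation: no tolerance applies, so 0 is passed.
arcpy.cartography.SmoothPolygon("Polygon", "Polygon_bezier",
                                "BEZIER_INTERPOLATION", 0)
```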
2.2 Smooth Line

This is the second tool we will be examining. It is similar to the Smooth Polygon technique, except that the input feature has to be a polyline shapefile. The steps are repeated as illustrated in Smooth Polygon, but under Generalization Tools, Smooth Line is chosen. Under input feature, select gower1, which is a dataset provided for use in this report. Specify the output feature, the smoothing algorithm selected (PAEK) and the smoothing tolerance.

Note: All other fields are left as defaults, i.e. No_check/Flag Error, meaning we do not want it to display all errors if encountered, and Fixed_endpoint/Not_fixed, which preserves the endpoints of a polygon or line and applies to the PAEK algorithm.

Algorithm Type | Smoothing Tolerance (km) | Time Taken (secs)
PAEK | 1000 | 2
Bezier Interpolation | - | 4

Fig 2.2 Display after Smooth Line technique was applied
__________ (Before Smoothing Line)
__________ (After Smoothing Line)

Observation

PAEK Algorithm: The tolerance value used here was very high in order to be able to physically see the changes made. The PAEK algorithm, as applied to gower1, smoothed the curves around edges and eliminated unimportant points around the edges. This results in an image with fewer points as the tolerance value is increased. The output line does not pass through the input line vertices. The algorithm works by taking, for a particular vertex, the average of the surrounding points and substituting the vertex with those average coordinates. This is done sequentially for each vertex, but displacement of the shape is averted by giving a higher weighting to the central point than to its neighbouring vertices.

Bezier Interpolation: Just as in Smooth Polygon, a tolerance value is not required, and when this technique was performed in this illustration, points around edges were partially retained, resulting in smooth curves drawn around the vertices. The output line passes through the input line vertices.

Conclusion: From both illustrations, just as in Smooth Polygon, the PAEK algorithm was considered most effective because it generates smoother curves around the edges as the tolerance value is increased. However, the true shape of the image can be gradually lost as this value is increased, whereas with Bezier Interpolation the curves around the vertices are smoothed but the vertices are maintained as well.

2.3 Simplify Polygon

This method is aimed at removing extraneous bends around vertices while preserving the shape. There are two algorithms involved: Point Remove and Bend Simplify.

The shapefile used for this illustration is the polygon (Huntingdonshire) district of England. Select Simplify Polygon (under Generalization Tools, which is under Data Management Tools), then select the input feature (Polygon), the output feature, the simplification algorithm and the simplification tolerance.

Algorithm Type | Simplification Tolerance (km) | Time Taken (secs)
Point Remove | 2 | 4
Bend Simplify | 2 | 9

Fig 2.3 Display before Simplify Polygon    Fig 2.4 Display after Simplify Polygon

Point Remove Algorithm: This is a variation of the Douglas-Peucker algorithm, and it applies the area/perimeter quotient first used in the Wang algorithm (Wang, 1999, cited in ESRI, 2007). From the above experiment, as the tolerance value was increased, more vertices in the polygon were eliminated. This technique simplifies the polygon by removing many of its vertices, and in doing so it loses the original shape as the tolerance value is gradually increased.
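Since Point Remove is described above as a variation of the Douglas-Peucker algorithm, the short sketch below illustrates the classic Douglas-Peucker reduction in plain Python. It is an illustration of the general idea only, not ESRI's implementation (which adds the area/perimeter criterion from the Wang algorithm); the function and variable names are hypothetical.

```python
# Illustrative Douglas-Peucker line reduction (not ESRI's exact Point Remove code).
import math

def perpendicular_distance(pt, start, end):
    """Distance from pt to the straight line through start and end."""
    (x, y), (x1, y1), (x2, y2) = pt, start, end
    dx, dy = x2 - x1, y2 - y1
    if dx == 0 and dy == 0:                      # degenerate segment
        return math.hypot(x - x1, y - y1)
    return abs(dy * x - dx * y + x2 * y1 - y2 * x1) / math.hypot(dx, dy)

def douglas_peucker(points, tolerance):
    """Keep the endpoints; recursively keep any vertex farther than
    tolerance from the chord joining the current endpoints."""
    if len(points) < 3:
        return points
    start, end = points[0], points[-1]
    index, max_dist = 0, 0.0
    for i in range(1, len(points) - 1):
        d = perpendicular_distance(points[i], start, end)
        if d > max_dist:
            index, max_dist = i, d
    if max_dist > tolerance:
        left = douglas_peucker(points[:index + 1], tolerance)
        right = douglas_peucker(points[index:], tolerance)
        return left[:-1] + right                 # avoid duplicating the split vertex
    return [start, end]                          # everything within tolerance collapses

# Example: a gently bending line collapses to its endpoints at a tolerance of 1.0.
line = [(0, 0), (1, 0.2), (2, -0.1), (3, 0.3), (4, 0)]
print(douglas_peucker(line, 1.0))   # -> [(0, 0), (4, 0)]
```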
Bend Simplify Algorithm: This algorithm was pioneered by Wang and Müller and is aimed at simplifying shapes through the detection of bends. It does this by eliminating insignificant vertices, and the resultant output has better geometry conservation.

Observation: After applying both algorithms to the polygon above, it was seen that for Point Remove the vertices reduced dramatically as the tolerance value was increased in multiples of 2 km. This amounts to about a 95% reduction, while when the same approach was applied with Bend Simplify there was about a 30% reduction in the number of vertices. Bend Simplify also took longer to execute.

Conclusion: It is seen that Bend Simplify is the better option when geometry is to be preserved; however, when the shape is to be represented on a smaller scale, Point Remove will be ideal because the shape is reduced significantly, thereby appearing as a shrunken image of its original.

2.4 Simplify Line

This is a similar procedure to Simplify Polygon, except that here the shapefile to be considered is a line, or a polygon which contains intersected lines. It is a process that involves reducing the number of vertices that represent a line feature. This is achieved by preserving those vertices that are more relevant and removing those that are redundant, such as repeated curves or area partitions, without disrupting the original shape (Alves et al., 2010). Two layers are generated when this technique is performed: a line feature class and a point feature class. The former contains the simplified lines, while the latter contains features that have been simplified to the point that they can no longer be seen as a line and are instead collapsed to a point. This applies to Simplify Polygon too. However, for both exercises no feature was collapsed to a point.

To illustrate this, the process from the previous generalization technique is repeated, but under Data Management Tools select Simplify Line, then select the input feature (gower1), the output feature, the algorithm (Point Remove) and the tolerance. Then accept all other defaults, because we are not interested in the errors.

Algorithm Type | Simplification Tolerance (km) | Time Taken (secs)
Point Remove | 8 | 7
Bend Simplify | 8 | 12

Fig 2.5 Display after Simplify Line
__________ (Before Simplifying Line)
__________ (After Simplifying Line)

Two algorithms are available for performing this operation: Point Remove and Bend Simplify.

Observation

Point Remove Algorithm: This method has been described under Simplify Polygon. It is observed here that when the Point Remove algorithm was used, the lines in gower1 were redrawn such that vertices that occurred redundantly were removed, and this became even more evident as the tolerance value increased, such that the line had sharp angles around curves and its initial geometry was gradually lost.

Bend Simplify Algorithm: This also reduces the number of vertices in a line, and the more the tolerance value was increased, the greater the reduction in the number of vertices. It takes longer to execute than Point Remove; however, the originality of the line feature is preserved.

Conclusion: From the two practical exercises, the Bend Simplify algorithm is more accurate because it preserves the line feature and its original shape is not too distorted. However, if the feature is to be represented on a much smaller scale and data compression is the factor considered, then Point Remove will be the option to embrace.
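To give a rough sense of how the two simplification options compare when scripted rather than run through the dialogs, the sketch below runs Simplify Line with each algorithm and counts vertices before and after. It assumes an arcpy environment (ArcGIS 10.1 or later for the data access cursor); the dataset name gower1 is taken from the text, while the workspace and output names are hypothetical.

```python
# Sketch: compare Point Remove and Bend Simplify vertex reduction on a line layer.
# Assumes ArcGIS 10.1+ with arcpy; workspace and output names are hypothetical.
import arcpy

arcpy.env.workspace = r"C:\data\generalization.gdb"  # hypothetical workspace
arcpy.env.overwriteOutput = True

def vertex_count(feature_class):
    """Total number of vertices across all features in a feature class."""
    total = 0
    with arcpy.da.SearchCursor(feature_class, ["SHAPE@"]) as cursor:
        for (shape,) in cursor:
            total += shape.pointCount
    return total

before = vertex_count("gower1")

arcpy.cartography.SimplifyLine("gower1", "gower1_point_remove",
                               "POINT_REMOVE", "8 Kilometers")
arcpy.cartography.SimplifyLine("gower1", "gower1_bend_simplify",
                               "BEND_SIMPLIFY", "8 Kilometers")

for name in ("gower1_point_remove", "gower1_bend_simplify"):
    after = vertex_count(name)
    print(name, "kept", after, "of", before, "vertices",
          "({:.0%} reduction)".format(1 - float(after) / before))
```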
2.5 Aggregate Polygon

This process involves amalgamating polygons with neighbouring boundaries. It merges separate polygons (both distinct and adjacent ones), and a new perimeter area is obtained which maintains the surface area of all the encompassing polygons that were merged together.

To illustrate this, under Data Management Tools select Aggregate Polygons, select the input feature (a selection of several districts from the England_dt_2001 area shapefile I downloaded), the output feature class and the aggregation distance (the boundary distance between polygons); the other values were left as defaults.

Fig 2.6 Display before Aggregate Polygon    Fig 2.7 Display after Aggregate Polygon

Aggregation distance used: 2 km
Time taken: 48 secs

As seen from both figures, the districts in Fig 2.6 were joined together as seen in Fig 2.7. As the aggregation distance is increased further, the separate districts are over-merged and the resultant image appears like a plain wide surface area, until the hollow parts seen in Fig 2.7 disappear. The algorithm used here, which is built into the ArcGIS software, is the Sort Tile Recursive (STR) tree. This algorithm computes all the nodes of neighbouring polygons by applying the middle transversal method in a logical sequence from left to right. When this computation is complete, the result is stored as a referenced node. The middle transversal node in the tree is then obtained, and thereafter a merge is calculated which spans from the left node to the right node until it gets to the root of the tree (Xie, 2010).

2.6 Simplify Building

This process simplifies polygon shapes in the form of buildings, with the aim of preserving their original structure. To illustrate this, Simplify Building is chosen under Data Management Tools and the appropriate fields are chosen; the input feature here is a building shapefile I extracted from a MasterMap download of postcode area CF37 1TW.

a  b    c  d
Fig 2.8 Display before Simplify Building    Fig 2.9 Display after Simplify Building

As shown above, the buildings (a and b) in Fig 2.8 were simplified to (c and d) in Fig 2.9, where a tolerance value of 10 km was used and the time taken to execute the task was 3 secs. The more the tolerance value is increased, the more simplified the building is and the more it loses its shape. The algorithm behind the scenes is a recursive approach, which was first implemented in the C++ programming language but has evolved into DLL (Dynamic Link Library) applications like ArcGIS 9.3.

The recursive approach follows this sequence of steps:

Determine the angle of rotation of the building, computing nodes around a boundary and then enclosing a small rectangular area which contains a set of points.
The angle of rotation is set.
Determine the vertices around edges as regards the recursion used, and thereafter calculate the splitting rate and a recursive decomposition of the edge with respect to the new edges.

The shortcoming of this algorithm is that L- and Z-shaped buildings are culprits, as they produce irregular shapes, while it works well on U- and L-shaped buildings (Bayer, 2009).
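As with the earlier tools, Aggregate Polygons and Simplify Building can be driven from a script. The sketch below is a minimal example, again assuming arcpy (ArcGIS 10+); the layer names districts_selection and buildings_cf37 are hypothetical stand-ins for the selected districts and the MasterMap building extract described above.

```python
# Sketch: Aggregate Polygons and Simplify Building from a script.
# Assumes ArcGIS 10+ with arcpy; input layer names are hypothetical stand-ins.
import arcpy

arcpy.env.workspace = r"C:\data\generalization.gdb"  # hypothetical workspace
arcpy.env.overwriteOutput = True

# Merge districts whose boundaries lie within 2 km of each other.
arcpy.cartography.AggregatePolygons("districts_selection",
                                    "districts_aggregated",
                                    "2 Kilometers")

# Simplify building footprints with a 10 km tolerance (as in the experiment above);
# a minimum area of 0 keeps every simplified building, however small.
arcpy.cartography.SimplifyBuilding("buildings_cf37",
                                   "buildings_simplified",
                                   "10 Kilometers",
                                   minimum_area=0)
```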
2.7 Eliminate

This technique essentially works on an input layer with a selection, which can take the form of either a Select By Location or a Select By Attributes query. The resultant image chunks off the selection, and the remaining composites of the layer file are then drawn out.

To illustrate this, Eliminate is chosen under Data Management Tools; the input feature here is the England_dt_2001 area shapefile, which has some districts selected, the output feature is specified, and all other fields are left as defaults.

After the Eliminate procedure was run on the polygon in Fig 3.0 (the green highlights being the selected features), the resultant polygon is shown in Fig 3.1. The districts in Fig 3.1 now exclude all those selected in Fig 3.0 - this can be seen visually at labels a and b - and so Fig 3.1 has fewer districts.

a  b
Fig 3.0 Display before Eliminate process    Fig 3.1 Display after Eliminate process

The time taken for this procedure was 44 secs.

2.8 Dissolve

The Dissolve tool works similarly to Aggregate Polygons, except that in Dissolve it is the features of the polygons that are aggregated and not the separate polygons themselves. The features are merged together using different statistic types, more like an alias performed on them.

To illustrate this, click on Dissolve under Data Management Tools, select the input features (the same ones used for Aggregate Polygons - the features to be aggregated), the output field (where the result is to be saved), the dissolve field (the fields you want to aggregate together), the statistic type, multi_part and dissolve_lines. The diagram below shows this.

Observation: For this exercise, the dissolve field was left as default, meaning no field was selected. Also, multi_part was used: instead of merging smaller fields into one large feature - which can become so extensive that displaying it on a map causes a loss of performance - the multi_part option makes sure bigger features are split into separate smaller ones. The dissolve_lines option makes sure lines are dissolved into one feature, while unsplit_lines only dissolves lines when two lines have an end node in common. The algorithm for this technique is simply Boolean (like a true or false situation, yes or no). There are shortcomings with this technique, as low virtual memory on the computer can limit the features that can be dissolved; however, input features can be dissected into parts by an algorithm called adaptive tiling.

Fig 3.2 Display before Dissolve process    Fig 3.3 Display after Dissolve process
Time taken = 10 secs

2.9 Collapse Dual Lines

This is useful when centre lines are to be generated between two or more parallel lines with a specific width. It can be very useful when you have to consider large road networks in a block or casing, as it enables you to visualize them properly. To illustrate this, select Collapse Dual Lines to Centerline under Data Management Tools, select the input feature (gower1), select the output feature and select the maximum width.

The maximum width is the maximum width of the casing allowed that contains the feature to be collapsed (e.g. the width of a road network), while the minimum width is the minimum value allowed from which to derive the centre line.

In this exercise, maximum width = 4 km
Time taken = 4 secs

Fig 3.4 Display after Collapse Dual Line to Centerline
__________ (Before Collapse Dual Line)
__________ (After Collapse Dual Line)

As seen above, when this experiment was performed, the lines in blue are the results of the operation performed on them, because they had a red colour before. The lines in red did not change because they did not have a width within the specified maximum width. However, this will change as the maximum width is increased or a minimum width is set.
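The last three tools can be scripted in the same way; the sketch below is a hedged outline assuming arcpy (ArcGIS 10+). Eliminate works on a layer with an active selection, so a selection is made with Select Layer By Attribute first; the layer names and the where clause are hypothetical.

```python
# Sketch: Eliminate, Dissolve and Collapse Dual Lines to Centerline from a script.
# Assumes ArcGIS 10+ with arcpy; layer names and the where clause are hypothetical.
import arcpy

arcpy.env.workspace = r"C:\data\generalization.gdb"  # hypothetical workspace
arcpy.env.overwriteOutput = True

# Eliminate works on a layer with a selection, so build one first.
arcpy.management.MakeFeatureLayer("England_dt_2001", "districts_lyr")
arcpy.management.SelectLayerByAttribute("districts_lyr", "NEW_SELECTION",
                                        "AREA < 1000000")   # hypothetical query
arcpy.management.Eliminate("districts_lyr", "districts_eliminated")

# Dissolve: no dissolve field, multi-part output, dissolve lines into one feature.
arcpy.management.Dissolve("England_dt_2001", "districts_dissolved",
                          multi_part="MULTI_PART",
                          unsplit_lines="DISSOLVE_LINES")

# Collapse parallel line casings up to 4 km wide into centre lines.
arcpy.cartography.CollapseDualLinesToCenterline("gower1", "gower1_centerlines",
                                                "4 Kilometers")
```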
3.0 Conclusion

From the illustrations shown in this paper, we can see that the various generalization tools serve various purposes, whether shape retention, angular preservation or simply reduction, so that a replica of an image shown at a larger scale can fit properly onto a smaller scale. However, depending on the tool chosen, a compromise will have to be made between these factors, giving preference to what we want represented after performing the operation. Different algorithms were explored, and it is inferred that when polygons or lines are to be simplified, Point Remove is the appropriate option when they are to be represented at a smaller scale; if originality of shape is to be preserved, the Bend Simplify algorithm works best; and for the Smooth technique on polygons and lines, the PAEK algorithm is better.