Higher Accuracy Geospatial Data Is a Double-Edged Sword
There’s no doubt that geospatial data collected today is more accurate than it was five years ago, and that data collected five years from now will be more accurate still. A couple of items had me thinking (once again) about the challenge that higher-accuracy geospatial data poses now and will continue to pose in the future.
The first was an interview I did this week with Dale Lutz, vice president of software development and co-founder of Safe Software. Dale is a great person to talk to about trends in geospatial data because Safe Software produces geospatial data conversion tools. Essentially, the company’s software allows users to seamlessly merge geospatial data sets from different sources. For example, a user may need to merge data sets from AutoCAD, Esri, and Smallworld along with lidar data. Doing so manually can be a terribly laborious task: not only does the user have to deal with different data formats, but also with data of varying accuracy and unknown provenance.
“One thing that is an ongoing issue, we see a lot of files that frankly don’t have the right coordinate systems in them or it’s missing, so then that relies on users to know,” said Lutz. “That kind of lack of metadata is going to pose a challenge for people as time goes on because folks aren’t going to remember and the file is going to get passed around. They are not going to know which datum it was collected with and they may not get exactly the correct answer.”
Dale succinctly summarizes the problem. After 20-plus years in the geospatial industry, working in many parts of the world, and teaching numerous workshops, matching spatial data is the #1 problem people ask me about. It’s fascinating to watch how diligent people are about acquiring the best data collection devices and capturing the most accurate data in the field, only to see that accuracy diluted as the data is integrated into a GIS or passed around without its metadata.
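To make the metadata problem concrete, here is a minimal sketch in Python (the file name and the EPSG code are assumptions for illustration, not from any particular project) of how a data set with no declared coordinate system typically shows up, and how a guess quietly becomes part of the record:

```python
import geopandas as gpd

# Hypothetical download: a shapefile that arrived with no .prj file,
# so the coordinate reference system is simply unknown.
parcels = gpd.read_file("downloaded_parcels.shp")

if parcels.crs is None:
    # At this point all we can do is guess. Assigning NAD83 (EPSG:4269)
    # does NOT transform anything -- it just records our assumption,
    # which now travels with the data, right or wrong.
    parcels = parcels.set_crs(epsg=4269)

print(parcels.crs)
```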
I’m guilty of it as much as anyone. On many mapping projects, I integrate data from several different sources. Often the data is a free download from the web with no metadata and no technical support. If I’m able to reach someone to ask a detailed question about the data, 90% of the time they can only make their best guess as to the datum used and when the data was collected. Was it in the original NAD83 horizontal datum? HARN? NSRS 2007? Or even, ugh, NAD27? The difference can be a meter or much more. It doesn’t take much of an error to negate the value of the high-precision GPS receiver you spent thousands of dollars to acquire.
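How costly can a wrong guess be? A rough illustration using pyproj (the point coordinates are made up): transform the same latitude and longitude from NAD27 to NAD83 and measure how far it moves. In much of the continental U.S. the shift is tens of meters; differences among NAD83 realizations are far smaller, but they still matter at survey-grade accuracy.

```python
from pyproj import Geod, Transformer

# Hypothetical point somewhere in the continental U.S. (made-up coordinates).
lon, lat = -105.0, 40.0

# NAD27 (EPSG:4267) -> NAD83 (EPSG:4269). With the NADCON grid shift files
# installed, pyproj applies the grid; otherwise it may fall back to a less
# accurate transformation, so treat the exact number as illustrative only.
t = Transformer.from_crs("EPSG:4267", "EPSG:4269", always_xy=True)
lon83, lat83 = t.transform(lon, lat)

# Measure how far the "same" coordinates moved, in meters on the ellipsoid.
geod = Geod(ellps="GRS80")
_, _, shift_m = geod.inv(lon, lat, lon83, lat83)
print(f"NAD27 -> NAD83 shift at this point: {shift_m:.1f} m")
```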
Dale knows this all too well. “When we used to deal with a MicroStation file that was accurate to a meter, we didn’t lose too much sleep, but now it’s more of an issue.”
Horizontal datums aren’t the only issue; vertical accuracy is a challenge of a different kind.
“It’s really doing a good job with the Z (elevation) that is the challenge we are working on. That’s been a big focus for us,” said Lutz.
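Part of what makes the Z hard is that GNSS natively measures height above the ellipsoid, while most engineering data uses orthometric heights above a vertical datum such as NAVD 88, which is tied to a geoid model. A back-of-the-envelope sketch of the relationship (the sample values are invented):

```python
# Relationship between the height a GNSS receiver reports and the
# elevation most plans and maps expect:
#
#   H (orthometric, e.g. NAVD 88) = h (ellipsoidal) - N (geoid undulation)
#
# The numbers below are invented for illustration; in practice N comes
# from a geoid model (e.g. GEOID18 in the U.S.) and varies by tens of
# meters across the country.
h_ellipsoidal = 1625.40   # meters above the GRS80 ellipsoid (made up)
N_geoid = -22.15          # geoid undulation at this point (made up)

H_orthometric = h_ellipsoidal - N_geoid
print(f"Orthometric height: {H_orthometric:.2f} m")   # 1647.55 m
```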
The second item about geospatial data accuracy I ran into this week was a thread on an Autodesk discussion forum, an entertaining one about parcel maps and why they don’t reconcile nicely.
The original poster summarizes the problem:
“I am trying to draw a parcel map in AutoCAD, using the distance and bearing info that was added to the original hand-made drawing by the surveyor. The parcels don’t quite close perfectly… Does anyone know what the acceptable tolerances are for parcels of say 1 acre and under, 1-5 acres, and 5-20 acre sites? Will it ever close EXACTLY, or am I a dreamer? Would you send the surveyor back out to take new measurements if, let’s say, he was off by .3″? Or a foot? Or 4 feet on a huge parcel? I am new at this and just getting started. Thanks!”
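Before getting to the replies, it is worth showing how “doesn’t quite close” is usually quantified. The standard approach is to convert each bearing-and-distance call into a coordinate change, sum them around the parcel, and see how far the end point misses the start. Surveyors then express that misclosure as a relative precision (misclosure divided by perimeter); figures like 1:10,000 are often cited for ordinary boundary work, though the required standard varies by jurisdiction. A minimal sketch with made-up calls:

```python
import math

# Made-up traverse legs for a small parcel: (azimuth in decimal degrees
# measured clockwise from north, distance in feet).
legs = [
    (  0.0, 200.00),
    ( 90.0, 150.00),
    (180.0, 199.95),   # deliberately slightly off so the loop won't close
    (270.0, 150.03),
]

# Accumulate northing/easting changes from each bearing-and-distance call.
d_north = sum(dist * math.cos(math.radians(az)) for az, dist in legs)
d_east  = sum(dist * math.sin(math.radians(az)) for az, dist in legs)

misclosure = math.hypot(d_north, d_east)   # how far we miss the start point
perimeter = sum(dist for _, dist in legs)

print(f"Misclosure: {misclosure:.3f} ft")
print(f"Relative precision: 1:{perimeter / misclosure:,.0f}")
```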
An obviously well-informed poster responded: