Higher Accuracy Geospatial Data is a Double-Edged Sword

February 10, 2012
Image: GPS World

There’s no doubt that geospatial data collected today is more accurate than it was five years ago, and will be more accurate five years from now than it is today. A couple of items this week had me thinking (once again) about the challenge that higher-accuracy geospatial data is posing now and will continue to pose in the future.

The first was an interview I did with Dale Lutz this week. Dale is the vice president of software development and co-founder of Safe Software. Dale is a great person to talk to about trends in geospatial data because Safe Software produces geospatial data conversion software tools. Essentially, the company’s software allows users to seamlessly merge geospatial data sets from different sources. For example, a user may have a requirement to merge data sets from AutoCAD, Esri, and Smallworld along with lidar data. Doing so manually can be a terribly laborious task. Not only does the user have to deal with different data formats, but also data of varying accuracy and unknown sources.

“One thing that is an ongoing issue, we see a lot of files that frankly don’t have the right coordinate systems in them or it’s missing, so then that relies on users to know,” said Lutz. “That kind of lack of metadata is going to pose a challenge for people as time goes on because folks aren’t going to remember and the file is going to get passed around. They are not going to know which datum it was collected with and they may not get exactly the correct answer.”

Dale succinctly summarizes the problem. After 20+ years in the geospatial industry, working in many places in the world, and teaching numerous workshops, matching spatial data is the #1 problem people ask me about. It’s fascinating to watch how diligent people are in acquiring the best data collection devices and collecting the most accurate data in the field, only to see it be diluted as it is integrated into a GIS or passed around without the metadata being communicated.

I’m guilty of it as much as anyone. On many mapping projects, I integrate data from several different sources. Often the data is a free download from the web with no metadata provided and no technical support. If I’m able to reach someone to ask a detailed question about the data, 90% of the time they will offer only their best guess as to the datum used and when the data was collected. Was it in the original NAD83 horizontal datum? HARN? NSRS 2007? Or even, ugh, NAD27? The difference can be a meter or much more. It doesn’t take much of an error to negate the value of the high-precision GPS receiver you spent thousands of dollars to acquire.
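To put that in perspective, a datum mismatch expressed in fractions of a degree translates into very real ground distance. Here is a minimal sketch in Python using a spherical-earth approximation; the one-arc-second offset used below is purely illustrative, not an actual published NAD27-to-NAD83 shift for any location:

```python
import math

def datum_shift_meters(lat_deg, dlat_deg, dlon_deg):
    """Approximate ground distance (meters) of a small coordinate
    offset given in degrees, using a spherical-earth approximation."""
    R = 6371000.0  # mean earth radius in meters
    dlat_m = math.radians(dlat_deg) * R
    # east-west arc length shrinks with the cosine of latitude
    dlon_m = math.radians(dlon_deg) * R * math.cos(math.radians(lat_deg))
    return math.hypot(dlat_m, dlon_m)

# Illustrative one-arc-second offset in both axes at 40 degrees north:
shift = datum_shift_meters(40.0, 1 / 3600, 1 / 3600)
print(f"{shift:.1f} m")
```

Even an offset of a single arc-second, roughly 31 meters of latitude, dwarfs the centimeter-level precision of a survey-grade receiver, which is the point: the weakest metadata in the chain sets the real accuracy.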

Dale knows all too well. “When we used to deal with a MicroStation file that was accurate to a meter, we didn’t lose too much sleep, but now it’s more of an issue.”

Horizontal datums are not the only issue; vertical accuracy is a challenge of a different kind.

“It’s really doing a good job with the Z (elevation) that is the challenge we are working on. That’s been a big focus for us,” said Lutz.

Another item about geospatial data accuracy I ran into this week was a thread on an Autodesk discussion forum. It was an entertaining thread about parcel maps and how they don’t reconcile nicely.

The original poster summarizes the problem:

“I am trying to draw a parcel map in AutoCAD, using the distance and bearing info that was added to the original hand-made drawing by the surveyor. The parcels don’t quite close perfectly… Does anyone know what the acceptable tolerances are for parcels of say 1 acre and under, 1-5 acres, and 5-20 acre sites? Will it ever close EXACTLY, or am I a dreamer?? Would you send the surveyor back out to take new measurements if, let’s say, he was off by .3″? Or a foot? Or 4 feet on a huge parcel? I am new at this and just getting started. Thanks!”

An obviously well-informed poster responded:

“That is one major open-ended question…
There are all kinds of things that come into play.  Some of it is the age of the original plat.  There are many places around our country where we have plats created in the 1700’s, using the proverbial “one-eyed goat and a rope”. Those surveys could have major errors, when compared to what we can achieve with today’s technology. But there’s a whole string of law that decides how all of that gets resolved, and it favors the “original survey” whenever possible. But above that, it favors any monuments that are found and recovered. Those typically hold precedence, even if they disagree with the legal record.
There are also standards that you may need to live to now, in our current age, especially if you’re doing something like an ALTA (Land Title) Survey.  You have to make sure to perform within the standards set by the law. With today’s technology, this is often relatively easy, but you still may run into issues when dealing with older neighborhoods, laid out in past times when measurements were not as exact, and especially when original monumentation can’t be found…  It can get worse; sometimes you find inconsistent monumentation, and have to try to sort through different surveys, figuring out which monuments were set when…  It can become quite a puzzle.
Learning all of this stuff is what becoming a professional land surveyor is about. And it takes years to do that. So there’s no real way to explain it all in a forum post.”
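The arithmetic behind the original poster’s question can be sketched in a few lines: sum each leg’s latitude (north component) and departure (east component); for a perfectly closed figure both sums are zero, and whatever remains is the linear misclosure, usually reported as a relative precision against the perimeter. A minimal sketch with hypothetical numbers (the acceptable tolerance itself is set by state statute and survey type, as the reply above explains):

```python
import math

def traverse_closure(legs):
    """Compute linear misclosure and relative precision of a closed
    traverse. `legs` is a list of (bearing_deg, distance) pairs,
    bearings measured clockwise from north."""
    dn = de = perimeter = 0.0
    for bearing_deg, dist in legs:
        b = math.radians(bearing_deg)
        dn += dist * math.cos(b)   # latitude (northing change)
        de += dist * math.sin(b)   # departure (easting change)
        perimeter += dist
    misclosure = math.hypot(dn, de)
    ratio = perimeter / misclosure if misclosure else float("inf")
    return misclosure, ratio

# A hypothetical four-sided parcel that almost, but not quite, closes:
legs = [(0.0, 100.0), (90.0, 100.0), (180.0, 100.05), (270.0, 100.0)]
err, precision = traverse_closure(legs)
print(f"misclosure {err:.3f} ft, precision 1:{precision:.0f}")
```

Here a 0.05-foot bust on a 400-foot perimeter works out to roughly 1:8,000, which is the kind of figure a closure standard would be compared against; whether that passes depends entirely on the jurisdiction and the survey class.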
Finally, in one sentence, the same poster summarizes the colliding worlds of digital cartography, one of the newest digital technologies, and land surveying, one of the oldest professions:

“A jig-saw puzzle made by blind men with dull saws. As I sometimes describe it.”

Thanks, and see you next week.
Follow me on Twitter at http://twitter.com/GPSGIS_Eric