High-Level Perspective on PNT Frontiers

January 1, 2013

New Technology, New Applications, New Science from the Stanford Symposium

By James D. Litton

The sixth annual Stanford PNT Symposium in November brought together a select group of experts to share insights from the latest research, developments, and proposals, GNSS and non-GNSS, that show promise for the international community. Among other noteworthy presentations, we heard Brad Parkinson’s suggested incremental system changes to significantly improve signal availability and accuracy, a comprehensive update on China’s Compass system, and the latest in spoofing and proposed proofs of location.

GNSS in General

The budget realities of U.S. GNSS development, and the need to maintain the systems at the high levels of performance on which so many critical and commercially beneficial applications now depend, were analyzed by two men whose names are household words in this industry, Brad Parkinson and Gaylord Green.

Nibbles. Professor Parkinson gave a sophisticated, nuanced presentation entitled “Nibbles,” in which he outlined feasible and productive technical steps to preserve what he described as “the three As”: availability, affordability, and accuracy. Rather than perform radical surgery on accuracy or availability in order to preserve affordability, he identified so-called nibbles at requirements: incremental improvements enabled by current technology advances, for example, vector (Spilker) receivers, improved power-conversion efficiency, antenna gain and steering modifications, weight reduction for multiple-launch capability, and sensor fusion for more robust receivers with greater jam resistance.

It was a high-level but quantitative system design approach aimed at improving affordability and interference resistance while maintaining and improving availability and accuracy. He made the salient point that affordability with a given level of performance is enhanced by availability, that is, maintaining 30+ satellites on orbit brings multiple benefits that improve affordability. The estimates of gain from the nibbles struck me as conservative, at least for those with which I had some quantitative feel.

Alternative Architectures. Col. Gaylord Green addressed the same subject from a different angle, in a presentation entitled “GPS Alternative Architectures.” His motivation for alternative architectures was to provide the needed PNT capability at an affordable cost. He pointed out that GPS satellites have increased in dry weight from 334 to 2,100 pounds, and that the cost of the IIA, IIF, and III satellites has risen from $100 million on orbit to $400 million on orbit. Colonel Green indicated that starting a new development with the same signals would cost more than continuing with GPS III. (The Congressional Budget Office has recommended considering the use of IIF satellites to maintain the constellation, bypassing GPS III.)

Green called his proposed reduced-capability satellites NavSats. He suggested that a mixed constellation of GPS III satellites and NavSats (with minimal ancillary payloads and frequencies), for example 15 of each, would enable a constellation of 30 satellites, the minimum necessary to assure sky-challenged users of satisfactory coverage. He recommended that satellite power conversion be designed to start-of-life, not end-of-life, goals. Colonel Green identified the signal priorities in terms of their functions (L5, L2, L1C, and four military signals requiring crypto). Like Parkinson, he identified technology changes in antennas and signal architecture to reduce costs, which would necessitate a demonstration program. He also indicated that advantage could be taken of other GNSS constellations for civil-signal purposes, alleviating the demands on GPS satellites. Colonel Green identified constellation arrangements that would be more cost-effective (multiple launch) while providing adequate coverage. He pointed out that such a NavSat program would require a new start and would necessarily constrain GPS modernization funding. In short, this “GPS Alternative Architecture” approach would combine continuation of GPS III as planned with the addition of simpler, lighter satellites carrying a reduced diversity of signals, to replace the aging GPS satellites now on orbit beyond their design life.

Compass. Professor Jingnan Liu of the GNSS Research Center of Wuhan University gave what most observers considered the first comprehensive, data-intensive description of precise-positioning results with the Compass (Beidou) system. He showed that the Beidou regional system, from which he presented copious data, can currently provide a standard positioning service with <10 m horizontal and <20 m vertical accuracy at the 95% confidence level. He also showed that results with Beidou plus GPS are 10–20% better than GPS alone. He provided results for surveying, ground-based augmentation, RTK, PPP, clock stability, orbital statistics, wide-area differential, and many other metrics of PNT. Professor Parkinson noted, in appreciating the presentation, that it was the first detailed release of so much technical data on Compass performance. The results noted above were obtained with 4 GEO + 5 IGSO + 2 MEO satellites. The constellation is expected to grow to 5 GEO + 5 IGSO + 4 MEO by the end of 2012, and to 5 GEO + 3 IGSO + 27 MEO by 2020 for a global service. The amount and diversity (in application and instrumentation) of the data were truly impressive.

GPS Modernization. Dr. Keoki Jackson of Lockheed Martin presented a comprehensive review of GPS modernization, with charts describing the evolution of GPS from Block I to Block III. He depicted the program as on schedule for delivery of the first GPS III vehicle in May 2014, with a 2015 launch. Most of this material was the same as reported from the AFCEA GC-12 program in GPS World earlier this year. A matrix comparing the attributes of GPS III with GPS II, and the beneficial outcomes from “Back-to-Basics Investments,” were key takeaways.

Ground Control. Ray Kolibaba of Raytheon presented a detailed overview of the OCX program, the next-generation Operational Control System. The presentation emphasized improvements in program management, simplification of development practices, and extensive use of commercial development methods, and it predicted on-time delivery with all of the attributes needed for both GPS III and the existing constellation.

Military User Equipment. Col. Bernie Gruber, Director of the GPS Directorate, gave an update on current activities, with emphasis on progress in Military GPS User Equipment (MGUE) development. This material was somewhat further advanced in schedule than in May 2012, when the same subject was presented in much detail at the AFCEA GC-12 meeting at the Directorate. The currently hot topics of jamming and spoofing threats, countermeasures, and affordability were prominent in the presentation. Key achievements listed for 2012 included the release of Broad Agency Announcements (BAAs) for NavSat studies and the completion of a congressional report on “Cost Effective GPS.” The launch of GPS IIF-3 and the delivery of GPS IIF-4, -5, -6, and -7 were also noted. Security certification for MUE cards was a very noteworthy achievement, which will make future MGUE development and utilization much easier in the challenging jamming and spoofing environment that is expected. The themes of affordability and of jamming and spoofing threats were dominant in this review as well.

General PNT

Norvald Kjerstad is a professor of Nautical Science at Aalesund University College and a long-time professional navigator in academic, geophysical, and shipping communities. His paper vividly depicted the risks brought about by climate change, by increased commercial interest in shipping and mineral resource exploration in the Arctic region, and by the very limited navigation infrastructure and limited communications assets.

Arctic Navigation. Both DGPS and SBAS systems are quite limited in the Arctic, and magnetic compass systems are less accurate at very high latitudes (their errors propagate into navigation radar, collision avoidance, and other systems). Auroral effects limit the availability of GNSS at times (Glonass complements GPS because of its higher orbital inclination), and hydrographic charts of the Arctic are frequently quite wrong, owing to changes in water depth and to limited surveying frequency. Increased tourism, shipping, and resource interest intensify the consequences of the increased risk to seafarers.

The advent of Galileo and Compass, integrated with GPS and Glonass, will greatly improve the reliability of GNSS signals. However, navigating through ice, in places thin and navigable and in random places deep and massive (ice ridges), requires much more than knowing where one is with respect to the center of the Earth. Radar helps with detection and avoidance of ice ridges, but the sinking and grounding of icebreakers and commercial vessels demonstrate that much better knowledge of the environment is needed to avoid future disasters. The route over the Pole, a thousand kilometers shorter, can be very expensive and is not necessarily the fastest. However, activity in the Arctic is going to continue increasing, and it is mandatory that safety factors be given greater attention by the International Maritime Organization (satellite compasses are reliable where magnetic ones are not, but the IMO has not approved them) and by the hydrographic services of the affected areas.

From Farm to Front Office. Jim Geringer, former governor of Wyoming, now a director of ESRI and a member of the GPS Excom, gave, as usual, a very entertaining presentation (“GPS/GNSS From the Farm to the Front Office”) with highly interesting examples of the very broad and deep impact of GNSS on society, including financial statistics and object lessons in the misuse or inaccurate use of geospatial data. Geringer was an engineer before he went into politics, and that came through clearly in the presentation, even though he was very self-effacing about his technical credentials. He gave amusing examples, not all from Apple, of the effects of combining current and historical geospatial data, such as airport runways shown in topography layers obtained before the airport areas were leveled, and a road running across the valley filled by Hoover Dam.

Geringer critiqued an attitude on the part of GNSS professionals whose attention is more devoted to the how of obtaining the information than to the effects that future changes might have on users. He discussed the policy challenges presented by the FCC mandate to find 500 MHz of spectrum for high-speed wireless data, by affordability, and by the potential for jamming and spoofing. It was good to be reminded of the awesome realized economic benefits of GNSS, of the manifold applications GNSS systems enable, and of the ease with which this potential can be limited or actually damaged by the pursuit of other worthwhile objectives that are politically favored or that bring short-term revenue into the treasury at the expense of GNSS system requirements in bandwidth. The less obvious but equally or more significant economic benefits of high-accuracy GNSS, and the impact of actual lives lost or resources untapped, were illustrated and quantified in Geringer’s broad presentation. One hopes that this presentation will be or has been seen at high policy levels in the FCC and NTIA.

Geringer’s presentation provided a nice segue into the one that followed.

LightSquared Lessons Learned. Rich Lee of Greenwood Telecommunications Consultants, LLC, and iPosi presented “Lessons Learned from the GPS-LightSquared Proceeding,” an assessment of the opportunities missed and the damage done in the drive to use spectrum adjacent to GNSS frequencies for 4G LTE wholesale service through high-power Auxiliary Terrestrial Components (ATCs), using MSS spectrum reallocated (or repurposed) to that purpose under a conditional waiver by the chairman of the FCC, Julius Genachowski, on a recommendation by the FCC’s International Bureau. According to Lee, Greenwood was called in to solve, “if solutions exist,” the problem of the “spectrum collision” between the LightSquared design and GPS, after the collision occurred. He likened Greenwood’s role to that of a tow-truck operator called in to clear up a collision after the impact. Lee served on the Technical Working Group (TWG) as head of the cellular subgroup and headed the NTIA/Excom cellular tests. The presentation was very good technically, in both its detailed and its more strategic aspects, but both the history described and the lessons learned (see below) were, understandably, from the perspective of a party unable, in this instance, to achieve the goals desired by its sponsors. This failure stemmed from basic spectrum-policy conflicts between GNSS applications and those mooted to become transcendent: mobile high-speed data for consumer and industrial applications.

Lee attributed the historical absence of requirements for regulating receiver standards, as opposed to transmitter standards, to an inability to anticipate today’s crowded spectrum (spectrum was regarded as “free,” he said, and minimizing interference was the key objective, a burden placed on transmitters). Now that spectrum is seen as scarce, underutilized in many U.S. government applications and inadequately conserved in many civil applications, the concept of receiver standards for avoiding interference, and the use of advanced filter and antenna technology in receivers as well as in transmitters, would enable easier, less confrontational, and more lucrative use of this 21st-century El Dorado.

Parenthetically, Pierre de Vries (University of Colorado, and a member of the FCC’s Technical Advisory Committee) and others recently testified before a House of Representatives panel, recommending that harm-claim thresholds be established with which to manage the trade-offs between intrinsic receiver protection requirements and transmitter power distribution, so that a flexible system approach is adopted instead of simply adding specification requirements to receivers. They noted that it is very difficult to anticipate the receiver design needs of all applications. The failure to understand the requirements of precision GNSS receivers, and the simplistic concept of fences, was a large driver of the collision between LightSquared and GNSS.

Lee summarized his lessons learned:

  • Upper 10: candidate for ground augmentation? The upper 10 MHz (1545–1555 MHz) of spectrum was originally allocated to LightSquared through its acquisition of SkyTerra. During the 2012 conflict months, LightSquared publicly abandoned operating in the Upper 10.
  • Question: sound alternatives for this band? (Including as a good GNSS guard band)
  • Consider: sub-microwatt uses for short range augmentation, such as Department of Transportation Intelligent Transport Systems (ITS)-TWG findings. Given very low effective isotropically radiated power (EIRP), ample compatibility with precision GPS nearby.
  • Precision GPS: –82 dBm worst-case Upper 10 susceptibility (–1 dB C/N0)
  • A 1 µW EIRP transmitter is about 13 dB below that at 1 meter
  • Seems suitable for high availability in urban areas; provides urban in-fill and redundancy, such as for ITS
  • At 100-meter range: ~ –135 dBm incident power at an ITS receiver antenna
  • Band continues as a space-to-earth downlink, shared with geostationary Earth orbit-mobile satellite services, including carriage of GPS/GNSS corrections (OmniSTAR, StarFire)

Lee contested the FCC chairman’s assertion that the LightSquared-GPS matter was an anomaly, saying instead that it was “foreseeable.”
Anomalies can indeed be foreseeable, just as singularities appear in scientists’ predictions. I believe this anomaly was clearly foreseeable, and that a hedge-fund mentality, financial engineering, and a long-held attitude toward GPS within the FCC were the drivers of these benighted decisions.

The gold rush is still on for finding underutilized spectrum. Some systems, including GNSS, utilize bandwidth that needs protection for purposes other than the usual communications requirements. It is vital to honor the homesteads of GNSS and protect the noise floors. Receiver standards must be considered very carefully because communications receivers and high precision GNSS receivers are very different systems.

Scientific Subjects

Some presentations grouped under this topic are available in ION publications from GNSS 2012.

Atom Interferometry. Mark Kasevich of Stanford presented his paper on precision navigation sensors based on atom interferometry. While general application of these sensors awaits many highly difficult engineering advances, an evolution comparable to that of chip-scale atomic clocks would be a great boon to navigation.

Andrei Shkel reprised his paper entitled “Precision Navigation, Timing, and Targeting enabled by Microtechnology: Are we there yet?”

Gravity. Tom Murphy of the University of California, San Diego, gave a fascinating paper, of fundamental importance to understanding gravity, on laser ranging to retroreflectors left on the Moon by various Apollo and Russian missions. The project’s highly contrived acronym is APOLLO, for Apache Point Observatory Lunar Laser-ranging Operation. The work is the product of a consortium of seven universities and research centers.

The APOLLO system for measuring the range of the Moon relative to the Earth is a marvel of experimental ingenuity and advanced instrumentation, collecting the few photons that return from the laser shots at the Moon. Laser light is caught by the retroreflectors and returned to the telescope at Apache Point. A very sensitive gravimeter system at the observatory enables compensation for the Earth’s crustal motions, and orbital deviations are likewise compensated. Precisions of a few millimeters in range to these devices on the Moon are achieved, almost good enough to be useful in testing the “strong” equivalence principle of general relativity.
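As a back-of-envelope illustration (my own, not from the talk), millimeter-level range precision implies picosecond-level timing of the photon round trip:

```python
# Sketch: the timing precision implied by millimeter-level range precision
# in lunar laser ranging. Illustrative arithmetic only.

C = 299_792_458.0  # speed of light in vacuum, m/s

def round_trip_timing(range_precision_m: float) -> float:
    """Round-trip timing precision (s) implied by a given range precision."""
    # Range is half the round-trip path, so a range error dr corresponds
    # to a round-trip timing error of 2*dr/c.
    return 2.0 * range_precision_m / C

dt = round_trip_timing(0.001)  # 1 mm of range
# dt is roughly 6.7e-12 s: a few picoseconds of timing per millimeter,
# resolved within a round trip of about 2.5 seconds to the Moon and back.
```

This is what makes the instrument's timing chain so demanding: the detector must time single-photon returns at the picosecond level.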

From an engineering point of view, the timing, motion compensation, detection sensitivity (a few photons per shot), and several other features of the system are truly impressive, and the potential for improving our understanding of general relativity, so-called dark matter and dark energy, and more is an exciting aspect of this work. Placing laser transceivers on the Moon to increase the number of reflected or transponded photons in the samples would be a valuable and relatively simple NASA mission for future work, even though the data may eventually suffice to enable theoretical advances without the added signal-to-noise benefit. This paper was an example of excellent engineering in the service of important science.

Vulnerabilities and Limitations

Charles Schue of UrsaNav gave a very detailed and comprehensive paper on wide-area timing, navigation, and data using low-frequency technology. He provided data for timing, location, and data transmission over distances greater than 125 nautical miles.

eLoran. He made the point, and showed examples to demonstrate, that the technology for these systems exists today, is highly affordable, and can represent a major strengthening of the nation’s critical infrastructure. The systems and hardware he presented are very attractive and seemingly very mature.
Schue was preaching to the choir, as far as I can tell; there is no controversy in the PNT community about the need for eLoran. Further, there is a sense of disappointment and wonder that so little money was saved at the expense of such great risk to our critical PNT infrastructure, particularly in view of the vulnerability of GPS and the other GNSS systems to civil jamming and spoofing, a vulnerability that informed the remaining two papers in this summary report.

Spoofing. Dennis Akos presented data on spoofing tests conducted at Lulea, Sweden, near a low-density commercial airport with limited road traffic and a restricted Swedish Air Force weapons-test area, and in Kaohsiung, Taiwan, near a very busy airport with dense roadway traffic. The incidence of radio-frequency interference (RFI) was great in the latter case and negligible in the former, until the team introduced its jamming and spoofing equipment. In both cases, a simple, computationally efficient automatic gain control (AGC) monitoring design was able to detect and measure the RFI from the jammer-spoofer.

Using all commercial off-the-shelf (COTS) hardware, the jammer was identified and located with time-of-arrival and power-difference-of-arrival techniques. The researchers showed that, using a controlled reception pattern antenna (CRPA) like the Stanford four-element CRPA and all-COTS equipment, jammers could be identified and located efficiently through AGC processing. A large amount of detailed data was presented, with screen shots and plots of the effects of the jamming on the receivers.
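The principle behind AGC-based detection is simple: strong in-band interference raises the front-end input power, so the AGC loop cuts its gain, and a sustained gain drop below the noise-only baseline flags RFI. A minimal toy sketch of the idea (the function names, window length, and threshold are my assumptions, not details from the presentation):

```python
# Toy sketch of AGC-based RFI detection, assuming a receiver front end that
# exposes its AGC gain (in dB) as a periodic sample stream.

from statistics import mean, stdev

def detect_rfi(agc_samples, window=100, sigma=5.0):
    """Return indices where AGC gain drops far below the quiescent baseline.

    The first `window` samples are assumed RFI-free and establish the
    noise-only baseline; interference pulls gain below mu - sigma*sd.
    """
    baseline = agc_samples[:window]
    mu, sd = mean(baseline), stdev(baseline)
    threshold = mu - sigma * sd
    return [i for i, g in enumerate(agc_samples) if g < threshold]

# Example: 200 quiet samples jittering around 40 dB gain, then a jammer
# appears and the AGC loop pulls gain down to 25 dB.
quiet = [40.0 + 0.1 * ((i * 7919) % 11 - 5) for i in range(200)]
jammed = [25.0] * 50
events = detect_rfi(quiet + jammed)  # flags the 50 jammed samples
```

The appeal of the approach, as presented, is that the AGC word is already available in many COTS front ends, so detection costs almost nothing computationally.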

Proof of Location. Logan Scott of LS Consulting gave a paper on proof of location. He projected the need for location proof in several applications, ranging from supervisory control and data acquisition (SCADA) intrusions that would affect industrial control systems to bogus Mayday calls, the response to which is very expensive, and he provided many examples of data-security applications. He also described several schemes, ranging from cryptographic GPS RF signal structures to the use of overlapping systems, such as Galileo and GPS, to enable verification of location.

Scott identified the massive security threat represented by millions of smartphone and tablet users who can store millions of bytes of information, such as maps of sensitive locations. An authorized user of such a GNSS-enabled map on a tablet or smartphone should be able to access the restricted information only while in the right location. A user, authorized or not, outside the restricted area would find that portion of the map blank, a kind of location-based need-to-know control.
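The access-control side of this idea can be sketched in a few lines. This is a toy illustration of my own, not Scott's scheme: restricted content is encrypted under a key derived from a coarse geofence cell, so only a device whose position falls in the authorized cell derives the right key. (A real system would use a proper cipher rather than the XOR stand-in below, and, crucially, would need cryptographic proof that the reported position is genuine, which is the hard part Scott's schemes address.)

```python
# Toy "location need-to-know" sketch: per-cell content keys. All names and
# parameters here are illustrative assumptions.

import hashlib, hmac

SECRET = b"issuer-master-secret"  # hypothetical issuer key

def cell_id(lat, lon, cell_deg=0.1):
    """Quantize a position into a coarse grid cell."""
    return (int(lat // cell_deg), int(lon // cell_deg))

def tile_key(cell):
    """Derive the per-cell content key (HMAC of the cell id)."""
    return hmac.new(SECRET, repr(cell).encode(), hashlib.sha256).digest()

def xor_stream(key, data):
    """Stand-in for a real cipher: XOR with a key-derived byte stream."""
    stream = hashlib.sha256(key).digest()
    return bytes(b ^ stream[i % 32] for i, b in enumerate(data))

# Issuer encrypts a map tile for one authorized cell (hypothetical site).
authorized_cell = cell_id(37.4275, -122.1697)
ciphertext = xor_stream(tile_key(authorized_cell), b"restricted map tile")

# A device inside the cell derives the same key and recovers the tile;
# a device elsewhere derives a different key and sees only garbage.
inside = xor_stream(tile_key(cell_id(37.4301, -122.1650)), ciphertext)
outside = xor_stream(tile_key(cell_id(40.7128, -74.0060)), ciphertext)
```

The sketch shows why position verification matters: without it, a device could simply lie about its coordinates when deriving the key.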

Scott anticipates the use of temporary keys for weapons usage; such keys would require that the user be in a location authorized for such use. He provides block diagram descriptions of systems that would be feasible to achieve these location proofs for high-value and dangerous operations. These block-diagram level descriptions are accompanied by quantitative assessments of the difficulties and benefits of such system modifications.

It was a compelling tour de force on the subject. We do not have the time or space to cover it well here, but the material has gradually been built up in Scott’s earlier publications at ION conferences and in GNSS journals and magazines. Both the need for such systems and the means by which they may practically be achieved are well worth studying by those responsible for policy and programmatic decisions, and by technologists seeking new product ideas and applications.

And More

A few interesting presentations do not fit into the above categories. Stan Honey, founder of the company Sportvision (the creator of the first-down yellow-line overlay in televised American football, and many other broadcast enhancements for sporting events) and considered sailing’s master navigator, gave a wonderful dinner talk about the PNT technology being utilized in the America’s Cup TV graphics, umpiring, and race management. Honey reflected upon how competitive sailing, unlike other professional sports, has fully adopted the use of advanced PNT technology in how the sport is umpired and managed.

Jason Wither of Microsoft presented a paper on spatialized data for mixed reality, which was very informative about how various types and layers of data are combined to create mixed-reality systems.

Ron Fugelseth of Oxygen Productions showed his very entertaining video “A Toy Train in Space.” The video was posted on YouTube a few months ago and immediately went viral. It is a fine example of the use of GPS technology.


James D. Litton heads the Litton Consulting Group and previously played key executive roles at NavCom Technology and Magnavox.