Receiver Design for the Future
How the Internet of Things Now Drives Location Technology
The number of devices connecting to the Internet is growing fast, and the applications running on them require location context to determine the most likely use case. These devices need continuous location — not necessarily noticed or activated by the user, but always on. The specification that becomes important is energy per day: the device must maintain its location without draining its battery, while also increasing location availability indoors. That creates new design requirements for hybrid capability.
By Greg Turetzky
A lot of people have the opinion that the GNSS market is kind of flat. Actually, several market studies indicate that it’s not as flat as you would think. See FIGURE 2, taken from the European GNSS Agency’s (GSA’s) 2015 GNSS Market Report. The growth rate certainly is slowing, but any market that continues to grow at a 9 percent annual rate is a very nice target area. As you can see, the GSA expects that we’re going to have somewhere in the neighborhood of 7 billion devices within the next eight to ten years.
We’re getting to the point where the number of GNSS receivers exceeds the population of the planet, which makes for an interesting thought process as to where GNSS is going to end up, and how it’s going to have to end up in everything that we do. That makes for a nice market opportunity. A big reason for that is that we’ve seen a lot of growth in demand for multi-constellation GNSS. Pretty much everything has GPS in it, which everyone now terms GNSS, but the growth of the other constellations is happening relatively quickly.
FIGURE 3, in my opinion, is already significantly out of date, even though it is less than a year old. Other market estimates indicate that GLONASS penetration into receivers, especially in the mobile phone field, is closer to 70 or 80 percent today, and that is expected to grow. There’s really no technical or economic reason why GNSS receivers can’t support multiple constellations, even at the consumer mobile device level.
Once all those constellations are in place, let’s look at where those receivers are going from a market standpoint. FIGURE 4 is divided by revenue, which is an interesting way to do it because we all know if you divided it by actual units, then the location-based services (LBS) portions in phones would dominate everything; everything else would just be a sliver that wouldn’t be visible. But if you look at it from a revenue standpoint, there are still many revenue opportunities in the phone segment and in the automotive segment.
Another reason to expect continued market growth is, if you examine Figure 4, you’ll notice that the Internet of Things (IoT) category (see SIDEBAR) doesn’t even show up here. We’ll see going forward that there will be a new slice of pie showing a focus on that segment and those types of applications.
Intel and the Internet of Things
Intel’s mission is no longer only to build PCs. We’re about bringing smart, connected devices to everyone. That encompasses a range of products, and we’ve been expanding our portfolio appropriately.
We start with everything from big iron data centers (which are part of smart devices) to mobile clients and all the way down to the Internet of Things (IoT) and wearable devices. All those devices are part of this smart connected world. Our group’s job is to help on the connectivity side, which varies by product.
This whole idea expands beyond mobile phones and into the IoT, a big trend that is transforming business, starting at sensors and going all the way up to big data, to make interesting decisions. The number of devices able to connect to the Internet is growing faster than anybody can keep up with, and that creates a really interesting opportunity. That gives you a bit of a picture as to why Intel is interested in this market and where you’re going to see us playing.
Looking at how we provide this location capability beyond just GNSS, how are people determining their location in these different platforms, and what are the different technologies available? FIGURE 5 shows that in 2014–2015 the most popular technology is still GPS, but there is a fast-growing trend in both Bluetooth-enabled and Wi-Fi-enabled penetration of location technology. Both of these are more suited to indoor operation, where the market is still in its early stages.
Although GNSS continues to grow with market growth, the growth of other technologies and the ability to incorporate them into location solutions is growing pretty quickly, and the radio versions of those are, in general, growing the fastest, followed by the inertial sensors. I think we’re going to see this combination of location technologies, jointly providing a single answer, becoming the norm in mobile products.
These technologies are going to end up, especially for indoors, in different areas. FIGURE 6 shows not only huge growth but segmentation among many different types of venues, all of which seem to be adopting an indoor location methodology. Not all of them will adopt the same one, but all these types of venues are looking at that market and at the different technologies that might serve their needs. A technology like a Bluetooth beacon, geared toward finding a particular item, might be most appropriate in a grocery store but less interesting in an airport, where there’s still a need for navigation from place to place and proximity is not necessarily the right answer.
We see a large growth of a very disparate technology base; at the right of the figure is a pie chart from which I had to remove all the callouts: the list of all the different technology suppliers addressing these particular indoor markets. What you see is a highly fragmented supplier base; that’s very consistent with an early market implementation. There are a lot of different people attempting to get into this market with a lot of different solutions. This is pretty classic for an early-adopter scenario.
The Stack. Changing accuracy requirements will come up a bit later in this article. Once we’ve looked at where those different venues are from a requirements standpoint, we start to look at the types of companies that are trying to participate in the ecosystem required to do that (FIGURE 7). If you start from the bottom, where I live as a chipset manufacturer, and you move up the chain, you see seven different layers of people involved in delivering a location to the end user, especially indoors. And every single person you see in this value chain is trying to make money.
That’s the crux of the issue: a lot of people want a piece of that pie, and all of them have a relevant part to play, but when seven people in the stack are all trying to own the location result in order to monetize it, it becomes difficult to create a unified methodology. I live at the bottom of this complex ecosystem, in the technology implementation layer. Getting dollars to flow from the top to the bottom gets relatively difficult, so we are very driven to bring cost competitiveness into this market.
In summary, from a market standpoint, we see that the market opportunity is very big and still growing. This makes it interesting to a company like Intel, even though we aren’t a major player in the business today, to continue to invest in it. We see a trend going from GPS to GNSS and on to location, and now the big opportunity is indoor location. But this indoor-location market is not a stand-alone device opportunity. Indoor location requires this kind of technology inside other devices, inside phones and tablets and IoT types of things.
Context. Let’s look at indoor location as a feature in a larger portion of product. That idea comes from the requirement for location not just for the location itself, but in order to provide context. That’s critical because now these smart, mobile devices are not just used to make phone calls, but are used all the time. As a result, many applications running on them really require that location context to determine the most likely use case in which the device is currently operating, making the consumer experience easier and more natural. This is evident throughout the entire value chain from phones and tablets to wearables. If you think about that from a requirement standpoint, you see the major places where GNSS has enabled trend changes in the market.
Let’s step back a bit in history to go through FIGURE 1, the opening figure, horizontally. In the early 2000s when I was at SiRF Technology, the main market drivers were personal navigation devices (PNDs). There were all these dashboard-mounted PNDs, and the main thing we were trying to fix was the urban-canyon problem. GPS always worked well in rural areas but always had trouble in urban canyons; to fix that, we had to improve the sensitivity. The solution in that timeframe came from multi-correlator designs and improved RF front ends; we were able to improve the sensitivity of the receivers by a good 5–10 dB, which enabled us to keep the antennas inside the car so that there was no need for roof-mounted antennas. The PND could be mounted on the dash and work just fine. That was a big factor in improving the user experience. The secondary specification that enabled that market to grow quickly was time-to-first-fix; those devices had to power up and work fast to prevent user frustration.
Within about five years, however, the PND market was overtaken by growth in the feature phone market. The reason for that was the FCC E911 mandate; everyone had to figure out a way to make sure that phones sold in the United States had the ability to meet that 911 mandate. GPS was one of the major methodologies in meeting that, and the main driver there was not around sensitivity, it was improving first-fix times. The mandate required a 30-second TTFF implementation in a very challenged environment to support emergency-services dispatch. This led us to the development of assisted GPS (AGPS) and further integration into phones. We had a secondary requirement of continuing to improve the sensitivity, because now we had to deal with an even worse antenna in a handset.
Once that was taken care of in the mid-2000s, the next thing we saw coming — and what’s here now — is the change in GPS requirements for smartphone navigation. This comes from the huge growth of higher-end smartphones that are running multiple applications driving the use cases around LBS. How will the location be used to provide services, now that we can provide applications on that platform? Now the most important specification has become active power. Every time a GPS receiver is turned on for use in an LBS mode, you have to make sure that the power consumption is kept to a minimum, or no one will use those services. So the active power of the device became a very important specification that we were all trying to improve.
The secondary specification we had to improve was availability. This is where the advantage of multi-GNSS started to show up — using handsets for car navigation in Google Maps types of implementations. So the performance of smartphone navigation in the urban canyon became a big driver recently as the main use case.
Impacts of New Requirements on Silicon Design
- Standby power reduction impacts
- SRAM is the leakiest component of a typical design
- Needs to be reduced or ideally eliminated
- Non-continuous fix methods
- Ability to quickly save and restore state information
- Hybrid location solutions
- Support measurements from multiple radios
- Need to share radios, not duplicate chains
- Increased integration of multiple radios on a single die
- Need more interference rejection capability
- Ability to support concurrent radio operation on single die
Next! What’s coming next is the idea that these wearables and IoT platforms are not just doing LBS on demand because of the currently active application. They are going to need continuous location. The device needs to provide location capability all the time, but it’s not necessarily going to be noticed or activated by the user, so the specification that becomes important is energy per day. You want to make sure your device can maintain its location without draining its battery. Then we are also going to have to extend the availability of location indoors to really fix this whole problem. And that will really move us into hybrid capability.
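To see why energy per day, rather than instantaneous power, becomes the governing specification, it helps to sketch a simple duty-cycling budget. All of the numbers below (active power, fix duration, fix rate, standby power) are illustrative assumptions, not measured values from any real receiver:

```python
# Sketch of an energy-per-day budget for a duty-cycled GNSS receiver.
# All numbers are illustrative assumptions, not measurements.

def energy_per_day_mwh(active_mw, fix_seconds, fixes_per_day, standby_mw):
    """Daily energy in mWh: active fixes plus always-on standby leakage."""
    active_s = fix_seconds * fixes_per_day
    standby_s = 24 * 3600 - active_s
    return (active_mw * active_s + standby_mw * standby_s) / 3600.0

# Assumed receiver: 30 mW while fixing, 2-second hot starts, one fix per minute.
baseline = energy_per_day_mwh(active_mw=30, fix_seconds=2,
                              fixes_per_day=1440, standby_mw=0.5)

# A leakier process with 5x the standby power dominates the daily budget
# even though the active power is unchanged.
leaky = energy_per_day_mwh(active_mw=30, fix_seconds=2,
                           fixes_per_day=1440, standby_mw=2.5)

print(f"baseline: {baseline:.1f} mWh/day, leaky standby: {leaky:.1f} mWh/day")
```

With these assumed numbers, the receiver is active less than 4 percent of the day, yet raising only the standby power more than doubles the daily energy, which is why leakage dominates the design discussion later in this article.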
If we look at those changes in the market and at how they’re going to impact the GNSS architecture, the first thing we want to look at is: Where is GNSS? FIGURE 8 is a plot that I’m sure everybody has seen and that is hard to keep up to date. It looks at the satellites coming from the different satellite constellations. The important thing here is that we are approaching a timeframe where a significant uptick in satellite launches can send the numbers over 100. That can really have an impact on receiver design, if you’re building a multi-GNSS receiver and you have to deal with a hundred satellites. How are you going to do that?
FIGURE 9 shows the relationship between the coherent period and the number of correlators required to search for one satellite in each constellation. We looked at particular scenarios — in this case, let’s say we are trying to do an outdoor location, so a –130 dBm cold-start test (FIGURE 10) with an initial frequency certainty of around 1 part per million (ppm). We wanted to look at the impact of the different constellations on doing that, and what it takes inside of the receiver to implement it. I’m not going to go into great detail here. But looking at those impacts in correlator counts, you can see the difference between building a GPS receiver that can do this and building a Galileo receiver that can do this. Between the simplest one, GLONASS, and the most difficult one, Galileo, you see a 75x difference in the number of correlators required, based on signal structure. This would indicate that, maybe from a cold-start fix point of view, you might prefer a GLONASS implementation, and do GPS or Galileo later.
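A back-of-the-envelope version of that comparison can be sketched from the signal structures alone. The code lengths and primary-code periods below are the published signal parameters, and the 1 ppm frequency uncertainty at L1 matches the scenario above, but the bin-width model is a deliberate simplification, so the ratios it produces will not match the figure’s exact numbers:

```python
# Rough acquisition search-space comparison across constellations.
# Simplified model: delay bins at half-chip spacing; frequency bin
# width ~ 1/(2*T_coh). This is an illustrative sketch, not the
# analysis behind Figures 9-10.

L1_HZ = 1575.42e6            # GPS/Galileo L1 carrier (GLONASS differs slightly)
FREQ_UNC_HZ = 1e-6 * L1_HZ   # ~1 ppm initial frequency uncertainty, about +/-1575 Hz

SIGNALS = {
    # name: (primary code length in chips, coherent period in seconds)
    "GLONASS C/A": (511, 1e-3),
    "GPS C/A":     (1023, 1e-3),
    "Galileo E1":  (4092, 4e-3),
}

def search_bins(chips, t_coh):
    """Total (delay x frequency) hypotheses to test for one satellite."""
    delay_bins = 2 * chips                 # half-chip spacing over the full code
    freq_bin_hz = 1.0 / (2.0 * t_coh)      # bin width shrinks as T_coh grows
    freq_bins = int(2 * FREQ_UNC_HZ / freq_bin_hz) + 1
    return delay_bins * freq_bins

for name, (chips, t_coh) in SIGNALS.items():
    print(f"{name:12s}: {search_bins(chips, t_coh):>9,} hypotheses")
```

Even in this crude model, the longer Galileo primary code and longer coherent period multiply both dimensions of the search at once, which is the effect the figure quantifies more rigorously.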
If that specification were your primary concern, then you would look at how those requirements got implemented into those devices. In addition, you are trying to come down to these low levels of power consumption, maintain sufficient accuracy to support these applications, and move this into a very small form factor. If we look at the relationship between the number of correlators required to search for each satellite and the amount of silicon area that requires, we see a big difference in growth depending on which constellation you look at. But if you look at a hot-start scenario (FIGURE 11) rather than a cold start, and at a weaker signal level, which is the more common implementation in devices today, you see a different result. With an improved starting condition (better information on the oscillators and reduced uncertainties producing a smaller search space), the silicon-area impact is greatly reduced. Then we really have to look at reducing standby power. That means looking at static random-access memory (SRAM), because SRAMs are a horribly leaky component and create very large standby power, but they are what we’ve been using for years in the standalone GPS world.
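The hot-start effect can be sketched with the same kind of bin counting. Purely for illustration, assume aiding shrinks the frequency uncertainty by a factor of ten and bounds the code-phase uncertainty to a small fraction of the code; these aiding bounds are assumptions, not figures from the paper:

```python
# Illustration of how aiding shrinks the acquisition search space.
# The cold- and hot-start uncertainty bounds are assumed for illustration.

def hypotheses(delay_chips, freq_unc_hz, t_coh):
    """Delay-frequency hypotheses given the uncertainty bounds."""
    delay_bins = 2 * delay_chips           # half-chip spacing
    freq_bin_hz = 1.0 / (2.0 * t_coh)
    freq_bins = int(2 * freq_unc_hz / freq_bin_hz) + 1
    return delay_bins * freq_bins

T_COH = 1e-3  # 1 ms coherent period (GPS C/A primary code)

# Cold start: whole 1023-chip code, ~1 ppm (~1575 Hz) frequency uncertainty.
cold = hypotheses(delay_chips=1023, freq_unc_hz=1575.0, t_coh=T_COH)
# Hot start: assumed +/-50 chips of code phase and 0.1 ppm after aiding.
hot = hypotheses(delay_chips=50, freq_unc_hz=157.5, t_coh=T_COH)

print(f"cold: {cold:,}  hot: {hot:,}  reduction: {cold / hot:.0f}x")
```

Under these assumed bounds the search collapses by more than two orders of magnitude, which is why the hot-start silicon-area curves in Figure 11 look so much flatter than the cold-start ones.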
We also have to look at non-continuous fix methodologies: this idea of turning things on and off to save power, which relates back to the standby power issues. We also have to look at hybrids: How are we going to support measurements from multiple radios like Wi-Fi and Bluetooth that are becoming important for indoor location? How are we going to share those radios without just pasting them together? That involves integration onto single die, and looking at what happens on the silicon level, and at what happens when you try to run radios at the same time.
What we have to work with, especially here at Intel, the home of Gordon Moore, is Moore’s Law. It is still working 50 years after it was proposed, and we are still tracking this progression of constantly shrinking device sizes. The dates in FIGURE 12 are the process technology nodes associated with a classical digital process. We are not at the 22-nanometer level today on GPS receivers, but we are moving down that curve.
Obviously, when you move down that curve, you greatly increase your ability to add more gates to improve TTFF and sensitivity. More correlators help you search out more uncertainty faster. The other thing this does is allow us to run faster, to raise the central processing unit (CPU) clock speed. This allows more software capability to do things like process more advanced navigation algorithms, bring in more satellites from multiple GNSS, run very expansive Kalman filters, and look at hybrid technologies. It has also driven down the power, so the reduction in active power we needed was kind of coming along with Moore’s Law without a whole lot of effort.
But now we’ve run into a problem: the parameter that we care more about, standby power, is actually going up. Although we are getting speed and active-power benefits out of Moore’s Law, leakage at these smaller geometries is increasing our standby power, which makes it difficult to go to these lower fix rates with faster restarts.
You see a trend here. As you move down in technology nodes, you find that the more advanced technology nodes are less applicable to the smaller multi-purpose devices. This is part of the reason why you don’t see the mobile phone devices coming down as fast as you see the desktop devices coming towards those new technology nodes.
This means some really significant silicon design challenges. We need to figure out how to take advantage of Moore’s Law and keep the benefits of smaller geometry: higher clock speeds and more memory for multi-constellation operation, which bring lower active power and smaller size.
But we have to figure out a way to not give up our standby power when we start moving down into these very small geometries. That will require some new methodologies, both at the chip level in terms of how we build silicon, and at the system design level, in terms of how we put these things together inside a mobile phone.
What Intel Is Doing
I can’t tell you what we haven’t done yet, but we look at location as an opportunity where the strength of Intel comes into play. We have very advanced silicon processors and we are bringing those to bear on the location technology problem — just starting in the last few years. Our goal is to provide a GNSS and location silicon solution with best-in-class performance based on Intel technology. Once we’ve done that at the silicon level, we’ll look at bringing the platform-level integration capability together.
We have the ability to merge multiple location technologies. We have a platform-level capability to integrate hardware and software to solve the indoor location problem on a variety of platforms. To execute to Intel’s vision, we’re going to push this into a ubiquitous technology present in all these devices, so that we can improve these mobile products across their many variants.
Multiple Radios. That’s part of what’s driving the whole industry towards the kind of consolidation that we’ve seen: stand-alone chipsets are not the only (or even the preferred) way to solve this problem. Without some access to the system design level, we’re not able to solve this problem for mobile phones and IoT type devices. We’re going to see this trend — that we all see coming — of putting multiple radios onto a single die, because that does reduce cost and size as we try to get into watches.
The 2015 Consumer Electronics Show brought out the new stuff. They’re talking about IoT buttons. We still have a ways to go; bringing that capability down to that size in a GNSS radio is a difficult problem. Once we start incorporating these different radios, such as Wi-Fi and Bluetooth, into this solution, we run back into the problem of the value chain: How do we get everyone aligned so that a device with these capabilities delivers a single unified solution?
One of the problems a lot of us see with these mobile products is that they have a lot of applications and they require a lot of interaction. We’d all like these devices to become smarter and present the information that we want, when we want it. A big part of that is the location context, and so that’s what we’re planning on doing: integrating that location context into all these platforms so that these smart connected devices can be even smarter and provide a better user experience.
GREG TURETZKY is a principal engineer at Intel responsible for strategic business development in Intel’s Wireless Communication Group focusing on location. He has more than 25 years of experience in the GNSS industry at JHU-APL, Stanford Telecom, Trimble, SiRF and CSR. He is a member of GPS World’s Editorial Advisory Board.
The statements, views, and opinions presented in this article are those of the author and are not endorsed by, nor do they necessarily reflect, the opinions of the author’s present and/or former employers or any other organization with whom the author may be associated.
This article is based on a GPS World webinar, which sprang from a presentation at the Stanford PNT Symposium. Listener questions and Greg Turetzky’s answers during the webinar can be read here.
The author would like to acknowledge the contribution of Figures 9, 10 and 11 from the paper “Optimal search strategy in a multi-constellation environment” by Intel colleagues Anyaegbu et al., from ION GNSS+ 2015.