PNT Roundup: Measuring range of motion, new app offers urban guiding
Measuring recovering patients’ range of motion
Positioning on a micro-scale is the task of a new sensor that reports on range of motion (ROM) achieved in stretching exercises and other post-operative activities by at-home patients after discharge from hospital. Telit, a company active in sensors for the Internet of Things (IoT), announced that U.K.-based 270 Vision Limited has selected Telit’s BlueMod+SR Bluetooth module for its BPMpro Mark 2 sensor for remote, precision measurement of patient ROM.
The BPM (Body Performance Measurement) wearable sensor is a medical device that measures patient ROM before and during rehabilitation. Post-surgery, patients are discharged to be remotely monitored at home as they go about their daily routine wearing a BPMpro sensor. The captured sensor output displays on a patient tablet running BPMpathway software and streams live to the clinician, who can use the data to assess the patient’s progress. With the recovery data collected by BPMpathway, clinicians can tailor an orthopedic patient’s post-operative support to individual needs without waiting for a face-to-face consultation.
The BlueMod+SR is a small-form-factor, dual-mode Bluetooth 4.0 module measuring 17 x 10 x 2.6 mm, with a line-of-sight range of about 100 meters. Dual mode means it supports classic Bluetooth basic rate (BR) and enhanced data rate (EDR) operation as well as Bluetooth Low Energy (LE).
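The sensor-to-tablet link described above is, in Bluetooth terms, a low-energy notification stream. As a rough, hypothetical illustration (270 Vision’s actual GATT profile is not public, and the service and characteristic UUIDs below are placeholders), a tablet app could subscribe to a ROM characteristic with Apple’s CoreBluetooth along these lines:

```swift
import CoreBluetooth

// Placeholder UUIDs: the real BPMpro GATT profile is not public.
let romServiceUUID = CBUUID(string: "F0000000-0000-0000-0000-000000000001")
let romCharacteristicUUID = CBUUID(string: "F0000000-0000-0000-0000-000000000002")

final class ROMSensorClient: NSObject, CBCentralManagerDelegate, CBPeripheralDelegate {
    private var central: CBCentralManager!
    private var sensor: CBPeripheral?

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    // Start scanning once Bluetooth is powered on.
    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: [romServiceUUID], options: nil)
    }

    // Connect to the first matching sensor found.
    func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any], rssi RSSI: NSNumber) {
        sensor = peripheral
        peripheral.delegate = self
        central.stopScan()
        central.connect(peripheral, options: nil)
    }

    func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
        peripheral.discoverServices([romServiceUUID])
    }

    func peripheral(_ peripheral: CBPeripheral, didDiscoverServices error: Error?) {
        guard let service = peripheral.services?.first(where: { $0.uuid == romServiceUUID }) else { return }
        peripheral.discoverCharacteristics([romCharacteristicUUID], for: service)
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didDiscoverCharacteristicsFor service: CBService, error: Error?) {
        guard let romChar = service.characteristics?.first(where: { $0.uuid == romCharacteristicUUID }) else { return }
        // Ask the sensor to stream ROM samples as notifications.
        peripheral.setNotifyValue(true, for: romChar)
    }

    // Each notification carries one ROM sample; decoding is sensor-specific.
    func peripheral(_ peripheral: CBPeripheral,
                    didUpdateValueFor characteristic: CBCharacteristic, error: Error?) {
        guard let data = characteristic.value else { return }
        print("ROM sample bytes: \(data as NSData)")
    }
}
```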
New app promises better-than-GPS urban guiding
Attention, GPS World readers living in or visiting Central London; Mountain View, California; or San Francisco. A public beta of a new app (iOS only) will take you by the hand (er, phone) and lead you around those cities, and those cities only, at present. More intriguing, it claims to deliver “better than GPS accuracy.”
This augmented reality (AR) navigation app is built on an AR-native framework, Apple’s ARKit. Blippar touts its AR City app for something it calls “urban visual positioning…which localizes users with higher accuracy than GPS, thanks to computer vision.” The app uses visual inertial odometry (VIO), interpreting movement seen through the camera, to minimize position errors, in effect correcting GPS.
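To make the VIO idea concrete: ARKit’s world-tracking session fuses camera imagery with the phone’s inertial sensors and reports a full camera pose on every frame, so local movement in meters can be read straight off the transform. The sketch below uses only public ARKit APIs and illustrates the general technique; it is not Blippar’s code.

```swift
import ARKit
import Foundation
import simd

// Reads local device movement, in meters, from ARKit's visual inertial
// odometry (camera + IMU fusion). Public world-tracking API only; not
// Blippar's implementation.
final class VIOTracker: NSObject, ARSessionDelegate {
    private let session = ARSession()
    private var startPosition: simd_float3?

    func start() {
        session.delegate = self
        session.run(ARWorldTrackingConfiguration())
    }

    // Called on every frame (roughly 60 Hz) with the latest VIO camera pose.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Translation column of the 4x4 camera transform, in ARKit's world frame.
        let t = frame.camera.transform.columns.3
        let position = simd_float3(t.x, t.y, t.z)

        if startPosition == nil { startPosition = position }
        let d = position - startPosition!

        // Horizontal displacement since the session started (y is up).
        let horizontal = simd_length(simd_float2(d.x, d.z))
        print(String(format: "moved %.2f m from start", horizontal))
    }
}
```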
Holding the phone in front of one’s face in tourist fashion, the user receives nearby points of interest based on what he or she can actually see. The app offers three layers of information:
AR Basic Navigation. A visualization of walking routes through augmented reality.
Enhanced Map Content. Overlays of information and content related to user location in AR — for example, streets and points of interest.
Urban Visual Positioning. Recognition, positioning and directional information via computer vision.
AR basic navigation, available everywhere Apple Maps is supported, visualizes routes with arrows shown in augmented reality. It uses GPS to estimate the absolute position of the user and VIO to track their local movement. Blippar integrates GPS and VIO by building on the ARCL library, which uses Apple’s ARKit for VIO and Core Location for GPS.
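Under stated assumptions, that GPS-plus-VIO pattern can be sketched as follows: anchor the most recent Core Location fix to the current ARKit camera position, then convert subsequent VIO displacement back into latitude and longitude. This is a simplified illustration of the approach, not the ARCL library’s or Blippar’s implementation; among other things it assumes the AR world frame is compass-aligned, whereas a real system must estimate and refine that heading.

```swift
import ARKit
import CoreLocation
import Foundation
import simd

// Simplified GPS + VIO fusion: take an absolute fix from Core Location, then
// dead-reckon between fixes with ARKit's camera pose. Assumes the AR world
// frame is compass-aligned (worldAlignment = .gravityAndHeading); production
// systems refine that alignment and weight the two sources more carefully.
final class FusedLocator: NSObject, CLLocationManagerDelegate, ARSessionDelegate {
    private let locationManager = CLLocationManager()
    private let session = ARSession()

    private var anchorFix: CLLocation?           // last GPS fix
    private var anchorARPosition: simd_float3?   // AR camera position at that fix
    private var currentARPosition = simd_float3(repeating: 0)

    func start() {
        locationManager.delegate = self
        locationManager.desiredAccuracy = kCLLocationAccuracyBest
        locationManager.requestWhenInUseAuthorization()
        locationManager.startUpdatingLocation()

        let config = ARWorldTrackingConfiguration()
        config.worldAlignment = .gravityAndHeading   // -z points true north, +x east
        session.delegate = self
        session.run(config)
    }

    // New GPS fix: re-anchor the local VIO frame to this absolute position.
    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let fix = locations.last else { return }
        anchorFix = fix
        anchorARPosition = currentARPosition
    }

    // Every ARKit frame: update the local pose and report a fused position.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let t = frame.camera.transform.columns.3
        currentARPosition = simd_float3(t.x, t.y, t.z)

        guard let fix = anchorFix, let origin = anchorARPosition else { return }
        let d = currentARPosition - origin
        let eastMeters = Double(d.x)
        let northMeters = Double(-d.z)   // +z points south under .gravityAndHeading

        // Small-displacement conversion from meters to degrees.
        let metersPerDegreeLat = 111_320.0
        let lat = fix.coordinate.latitude + northMeters / metersPerDegreeLat
        let lon = fix.coordinate.longitude +
            eastMeters / (metersPerDegreeLat * cos(fix.coordinate.latitude * .pi / 180))

        print(String(format: "fused position: %.6f, %.6f", lat, lon))
    }
}
```

In a scheme like this, the VIO term keeps the estimate smooth at walking speed between fixes, while each new fix bounds the drift that accumulates in the dead-reckoned displacement.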
GPS alone does not provide map-level accuracy and has an average error of 16 meters in cities, according to Blippar, which claims its urban visual positioning is more than twice as accurate, implying an average error of under 8 meters.