Visual Light Landmarks for Mobile Devices and Tags

This project aims to develop a communication system in which LED light bulbs transmit data to mobile smartphones with cameras and to tags with photosensors.

Lights as location landmarks

A motivating application for communication from the lights to mobile devices is indoor location estimation with the lights as landmarks. If the locations of the lights are known, a mobile device or a tag can estimate its own location from the signal strengths of the lights in its proximity.

Why lights as landmarks?

  • Lights are omnipresent indoors and are typically well positioned for sensing applications.
  • The spatial confinement of lights makes them ideal for semantic localization.
  • Solid-state lighting (SSL), i.e., LED lighting, is phasing out incandescent and CFL lights, driven by mandates and regulations in recent years. LED drivers already use Pulse-Width Modulation (PWM) for efficiency and thermal reasons; we modulate data onto this high-frequency PWM signal.

Challenges

For any visual light landmark system to be practical and easily adoptable, there are a few challenges:
  • Multiple lights should be supported in a single space without collision
  • The system should be able to scale to thousands of lights in a building
  • When a mobile device is the receiver, the user should not be required to point the camera directly at the light, for the sake of convenience. However, detecting the image bands in non-line-of-sight images is harder than in images taken of the light directly.
  • Any additional hardware on the lighting side or the receiver side is undesirable.

Detecting a pulsing light with a camera

Here are some images of surfaces illuminated by LED lights that are pulsing at very high frequencies (high enough that you cannot perceive the flicker). The image banding is due to the rolling-shutter sensors present in mobile cameras. Our system exploits this characteristic of cameras to detect lights. If you would like to see how the rolling shutter works and how such images are created, have a look at this video we made: http://youtu.be/NHNSTil34xA
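To make the banding concrete, here is a minimal simulation sketch in Python (the row-readout time, flicker frequency, and duty cycle below are made-up illustrative values, not measured camera parameters): because a rolling-shutter sensor exposes each row at a slightly later time, each row effectively samples the light's on/off waveform at a different instant, which produces the horizontal bands.

```python
import numpy as np

# Assumed (illustrative) parameters -- not measured from any particular camera.
ROWS = 720             # image height in pixels
ROW_READOUT_S = 30e-6  # time offset between consecutive row exposures (seconds)
FLICKER_HZ = 2000      # PWM / flicker frequency of the light
DUTY = 0.5             # PWM duty cycle

def rolling_shutter_bands(rows=ROWS, row_dt=ROW_READOUT_S, f=FLICKER_HZ, duty=DUTY):
    """Return per-row brightness of a surface lit by a square-wave PWM light.

    Row r is exposed at time r * row_dt; its brightness is the light's
    instantaneous state at that time (the exposure is assumed short relative
    to the flicker period, so each row sees the light mostly on or mostly off).
    """
    t = np.arange(rows) * row_dt            # exposure start time of each row
    phase = (t * f) % 1.0                   # position within the PWM period
    return (phase < duty).astype(float)     # 1 = bright band, 0 = dark band

bands = rolling_shutter_bands()
# One bright/dark band pair corresponds to one flicker period:
print("rows per band pair:", 1.0 / (FLICKER_HZ * ROW_READOUT_S))   # ~16.7 rows
```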

A simple communication scheme

A simple scheme to identify lights uniquely is shown below. Each light operates at a unique frequency, which produces corresponding bands in an image. By detecting the frequency of the image bands along the vertical dimension, the light can be identified. If a tag with a photosensor is used as the receiver, the frequency of the light-intensity signal can be analyzed directly. If multiple lights are present, their frequencies can still be distinguished as long as they are not harmonics of one another.
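A minimal detection sketch under the same assumptions (the function name and parameters below are illustrative, not our actual decoder): collapse each frame's columns into a one-value-per-row band signal, take its FFT along the vertical dimension, and map the peak bin back to a flicker frequency using the row-readout time.

```python
import numpy as np

def detect_light_frequency(image, row_readout_s):
    """Estimate the flicker frequency of the light illuminating `image`.

    image         : 2-D array (rows x cols) of pixel intensities
    row_readout_s : time between consecutive row exposures (seconds)
    """
    band_signal = image.mean(axis=1)                 # one brightness value per row
    band_signal = band_signal - band_signal.mean()   # remove DC so the peak isn't at 0 Hz
    spectrum = np.abs(np.fft.rfft(band_signal))
    # Rows are samples spaced row_readout_s apart, so the FFT bins map to Hz:
    freqs = np.fft.rfftfreq(len(band_signal), d=row_readout_s)
    return freqs[np.argmax(spectrum)]

# Usage with the simulated bands from the previous sketch:
# image = np.tile(bands[:, None], (1, 1280))
# print(detect_light_frequency(image, ROW_READOUT_S))   # ~= FLICKER_HZ
```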

The limitation of this simple scheme is its limited bandwidth. The figure below explains the bandwidth constraints when a camera is the receiver. However, if a highly sensitive photodiode with a wide-bandwidth frequency response is used, the bandwidth constraint is much less severe.
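As a rough back-of-the-envelope illustration of those constraints (the numbers below are assumptions, not the values in the figure): the row sampling rate bounds the highest detectable flicker frequency, the frame height bounds the lowest frequency that still yields a few full bands per frame, and avoiding harmonic collisions shrinks the usable set further.

```python
# Illustrative camera parameters (assumptions, not measured values).
ROW_READOUT_S = 30e-6   # time between row exposures
ROWS = 720              # rows per frame

row_rate_hz = 1.0 / ROW_READOUT_S          # rows sampled per second
f_max = row_rate_hz / 2.0                  # Nyquist limit on detectable flicker
min_bands_per_frame = 4                    # want at least a few band periods visible
frame_time = ROWS * ROW_READOUT_S
f_min = min_bands_per_frame / frame_time   # lowest usable flicker frequency

print(f"usable flicker band: {f_min:.0f} Hz .. {f_max:.0f} Hz")
# With these numbers: roughly 185 Hz .. 16.7 kHz. Assigning one fixed frequency
# per light, spaced widely enough to resolve within one frame and avoiding
# harmonics, leaves far fewer channels than lights in a building -- hence the
# need for a modulation scheme rather than one frequency per light.
```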

Our communication system

Modulation

The modulation scheme we arrived at is Binary Frequency Shift Keying (BFSK). Each light uses distinct frequencies for the preamble (start-of-data), bit 1, and bit 0. In our implementation, the lights were synchronized wirelessly, which enabled us to reuse the preamble frequency across all lights. All lights can also use the same frequency for bit 0, since our scheme decodes bits by detecting the presence or absence of the bit-1 frequency.
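A minimal transmit-side sketch of this scheme (the frequencies, symbol duration, and sample rate below are illustrative; the actual driver firmware and per-light frequency assignments differ): the preamble and each data bit are sent as fixed-duration square waves at the corresponding frequency.

```python
import numpy as np

# Illustrative parameters -- not the actual per-light assignments.
F_PREAMBLE = 500.0   # Hz, shared by all lights (lights are synchronized)
F_BIT1 = 2000.0      # Hz, unique per light
F_BIT0 = 1000.0      # Hz, shared by all lights
SYMBOL_S = 0.05      # seconds per symbol
SAMPLE_HZ = 100_000  # update rate of the on/off drive waveform

def bfsk_waveform(bits):
    """Return the on/off drive waveform for a preamble followed by `bits`."""
    def tone(freq):
        t = np.arange(int(SYMBOL_S * SAMPLE_HZ)) / SAMPLE_HZ
        return ((t * freq) % 1.0 < 0.5).astype(np.uint8)   # 50% duty square wave

    symbols = [tone(F_PREAMBLE)]                            # start-of-data marker
    symbols += [tone(F_BIT1 if b else F_BIT0) for b in bits]
    return np.concatenate(symbols)

waveform = bfsk_waveform([1, 0, 1, 1])   # e.g. transmit the ID bits 1011
```

Because bit 0 is decoded as the absence of the bit-1 frequency, only the bit-1 frequency needs to be unique per light, which keeps the number of frequencies the receiver must resolve small.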

Here is a video of what the modulated light looks like in reality: http://youtu.be/ZjvfugLtOc4. Note that this is imperceptible to the human eye, which notices no change in illumination, since the light is operating at a frequency well above the human flicker-perception threshold. However, our camera sees this light very differently, and this video shows how the camera captures it.

Demodulation

The demodulation process involves extracting the time-varying light signal from the sequence of spatially varying images. Since the camera frames and the light's data symbols are not synchronized, we use a sliding-window approach: we concatenate the band signals from consecutive images and analyze how their frequency content changes over time. The figure below explains our demodulation process.
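A minimal sketch of that sliding-window idea (an assumed NumPy implementation with hypothetical window sizes and threshold, not our actual decoder): collapse each frame into its per-row band signal, concatenate the frames into one long time series, then slide a window across it and decide bit 1 or bit 0 by whether the light's bit-1 frequency is present in that window. For simplicity, this sketch ignores the blanking gap between frames and the preamble search.

```python
import numpy as np

def demodulate(frames, row_readout_s, f_bit1, window_rows=256, step_rows=64,
               threshold=3.0):
    """Recover a bit sequence from consecutive rolling-shutter frames.

    frames        : list of 2-D images (rows x cols), in capture order
    row_readout_s : time between consecutive row exposures (seconds)
    f_bit1        : the bit-1 frequency of the light being decoded
    """
    # One brightness value per row, all frames concatenated into one time series.
    signal = np.concatenate([f.mean(axis=1) for f in frames])
    signal = signal - signal.mean()

    freqs = np.fft.rfftfreq(window_rows, d=row_readout_s)
    bit1_bin = np.argmin(np.abs(freqs - f_bit1))   # FFT bin closest to f_bit1

    bits = []
    for start in range(0, len(signal) - window_rows + 1, step_rows):
        window = signal[start:start + window_rows] * np.hanning(window_rows)
        spectrum = np.abs(np.fft.rfft(window))
        # Bit 1 if the energy at f_bit1 stands out against the window's average energy.
        bits.append(1 if spectrum[bit1_bin] > threshold * spectrum.mean() else 0)
    return bits
```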

This process is better explained by this video: http://youtu.be/SFCzH4CotPo, which shows an animation of the demodulation process applied to images acquired from our prototype setup.

Prototype of our demo

Here is a photograph of our prototype demonstration setup, built with Cree 9W LED lights. We replaced the built-in LED drivers with driver boards that we designed. Each light is controlled by a FireFly microcontroller. The user can turn the lights on and off and point the mobile device (iPad or iPhone) anywhere in the vicinity of a light. A GUI on the PC shows the signals received from the lights and the lights that were detected.

Additional Information

This work was presented at IPSN 2014. Here is the paper.

Hybrid VLC

Details coming soon.
Meanwhile, here is the paper that was presented at the VLCS Workshop.
Here is the two-page write-up for our poster, "Is there a place for VLC in Wireless Sensor Networks?"

VLCoverview.png (347 KB) Niranjini Rajagopal, 05/06/2014 03:35 pm

RollingShutterImg1.png (189 KB) Niranjini Rajagopal, 05/08/2014 01:13 am

RollingShutterImg2.png (449 KB) Niranjini Rajagopal, 05/08/2014 01:13 am

RollingShutterImg3.png (178 KB) Niranjini Rajagopal, 05/08/2014 01:13 am

SimpleComm1.png (481 KB) Niranjini Rajagopal, 05/08/2014 02:46 am

SimpleComm2.png (476 KB) Niranjini Rajagopal, 05/08/2014 02:46 am

Challenges.png (946 KB) Niranjini Rajagopal, 05/08/2014 03:15 am

Modulation.png (201 KB) Niranjini Rajagopal, 05/08/2014 03:31 am

CameraFreqResp.png (121 KB) Niranjini Rajagopal, 05/08/2014 03:42 am

ChannelSpace.png (65.2 KB) Niranjini Rajagopal, 05/08/2014 03:42 am

DemoSetup.png (560 KB) Niranjini Rajagopal, 05/08/2014 03:51 am

Demodualtion.png (346 KB) Niranjini Rajagopal, 05/08/2014 04:37 am

vlc-workshop.pdf - VLCS workshop paper (Hybrid VLC) (1.04 MB) Niranjini Rajagopal, 12/23/2014 03:24 pm

vlcs-poster.pdf - VLCS workshop poster (106 KB) Niranjini Rajagopal, 12/26/2014 03:59 pm