Synthetic Sensors use AI for the Smart Home

I just came across a fascinating Carnegie Mellon University project. Its sensor boards (CMU calls them supersensors) combine a plethora of environmental sensors, such as:

  • Radio interference
  • Electromagnetic noise (probably the same sensor as above)
  • Magnetism
  • Motion (X/Y/Z)
  • Light color
  • Illumination
  • Air pressure
  • Humidity
  • Non-contact temperature
  • Ambient temperature
  • Acoustics
  • Vibration

The sensor data is fed into an AI that is trained to recognise events by their sensor signature, such as turning on a faucet, operating a microwave oven, or even counting the paper towels taken from a dispenser (facility managers, listen up!).
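To make the idea more concrete, here is a minimal, hypothetical sketch of how event recognition from sensor signatures could work: high-rate channels such as acoustics and vibration are reduced to spectral features, low-rate channels are used as-is, and a standard classifier maps the combined feature vector to an event label. The sensor names, window size, synthetic training data and choice of classifier are my own assumptions, not the actual Synthetic Sensors pipeline.

```python
# Hypothetical sketch: recognize events from multi-sensor "signatures".
# Sensor names, window size, and the classifier are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

RATE = 1000    # assumed sample rate for the high-rate channels
WINDOW = 256   # samples per analysis window

def synth_window(label, rng):
    """Generate a fake sensor window whose acoustic/vibration spectrum depends on the event."""
    t = np.arange(WINDOW) / RATE
    tone = {"idle": 0.0, "faucet_on": 120.0, "microwave_running": 300.0}[label]
    acoustics = np.sin(2 * np.pi * tone * t) + 0.3 * rng.standard_normal(WINDOW)
    vibration = 0.5 * np.sin(2 * np.pi * tone * t) + 0.3 * rng.standard_normal(WINDOW)
    return {"acoustics": acoustics, "vibration": vibration,
            "ambient_temp": 21.0 + rng.normal(0, 0.2),
            "humidity": 40.0 + rng.normal(0, 1.0)}

def featurize(window):
    """High-rate channels -> magnitude spectrum; low-rate channels -> raw value."""
    feats = []
    for ch in ("acoustics", "vibration"):
        feats.extend(np.abs(np.fft.rfft(window[ch]))[:64])   # low-frequency bins as signature
    for ch in ("ambient_temp", "humidity"):
        feats.append(window[ch])
    return np.asarray(feats)

rng = np.random.default_rng(0)
labels = ["idle", "faucet_on", "microwave_running"] * 50
X = np.stack([featurize(synth_window(lbl, rng)) for lbl in labels])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)

# Classify a new, unseen window into an event.
test = synth_window("faucet_on", rng)
print("detected event:", clf.predict(featurize(test).reshape(1, -1))[0])
```

In the real system the training data would of course come from recorded sensor windows labeled with the events that produced them, rather than synthetic signals.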

While the project claims that the AI runs locally, my prediction is that - with the exception of large facility management (FM) companies - most of these supersensors will feed their event data into a cloud-based AI that is pre-trained on thousands of event types and continually learns from the new signatures it receives.
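Purely to illustrate that speculation, here is a minimal sketch of what such a cloud feed could look like if the device uploads only the detected event label and an anonymized feature signature rather than raw sensor streams. The endpoint URL, payload fields and helper name are invented for this example, not a published CMU API.

```python
# Hypothetical sketch of the speculated cloud feed: only an event label and a
# compact, anonymized feature signature leave the device, which a cloud-side
# model could use for continual retraining. Endpoint and fields are assumptions.
import json
import urllib.request

def upload_event(label, signature, endpoint="https://example.invalid/api/events"):
    """POST one detected event to a (hypothetical) cloud learning service."""
    payload = {
        "event": label,                                        # e.g. "microwave_running"
        "signature": [round(float(x), 3) for x in signature],  # compact feature vector
        # deliberately no household ID, address, or raw audio/vibration samples
    }
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.status

# Example call (requires a real endpoint):
# upload_event("faucet_on", signature=[0.12, 3.4, 0.0, 1.7])
```

Keeping raw audio and any household identifier out of the payload is exactly the kind of design choice the privacy concern below calls for.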

While smart home automation is a great field for sensors like these, I see big advantages for healthcare as well. Attach one over each intensive care bed and doctors and nurses - and most of all the patient - will benefit from key events such as shivering or shifting in bed being registered automatically. Care-at-home patients will benefit just as much.

As with any data going into the cloud, I hope the Carnegie Mellon team is taking care that nothing is sent out that can be directly attributed to a household or an individual. Hack into one of these sensors and you'll figure out very quickly whether someone is at home or not!

Where can you get your hands on one?

Well, the concept was just presented as a paper at CHI 2017, so we're not talking about ready-for-market devices at this time.

Have a look at the project homepage; there are more details on the work done here. You can also download the paper from the acm.org website.
