IoT Mashup #004: Action at a Distance
A deeper dive into how the IoT is transcending physical boundaries with mashups
March 21, 2018
One of the more interesting aspects of the Internet of Things is its ability to build digital bridges between real-world locations. When the people, objects and spaces in two distant places can be brought together in a virtual space, it opens up new ways of working and interacting.
Several companies, projects and artists have started to explore how global networks and virtual reality can mediate between sensors on one end and actuators on the other — making it truly possible to “be” in two places at once.
One of the simpler forms of this concept is the telepresence robot, with which I’ve had first-hand experience. Made by companies like Double Robotics and Suitable Technologies, these ‘bots combine the now-familiar video chat experience with a rolling, remote-controlled tablet stand. Though currently somewhat clunky, they point toward a future in which more maneuverable (and possibly more humanoid) robot avatars allow us to attend meetings, conferences, and other distant events in a fairly natural way.
More intricate robots also make it possible to actually get stuff done on the other end of the virtual link. Artist Alex Kiessling experimented with this in 2013 with the project LongDistanceArt, which involved two industrial robot arms that mirrored his movements to create three simultaneous and near-identical artworks — with the artist and his mechanical helpers each in a different European city.
Meanwhile, the communications technology company Ericsson is looking beyond art and into the workplace. At the 2014 Mobile World Congress, Ericsson Research unveiled a system for controlling heavy machinery through a VR headset equipped with video feeds from a distant worksite. While it was only a sandbox demonstration — literally, with a remote-controlled toy excavator — it suggests that telepresence ‘bots may open up new global markets for skilled labor.
Of course, for certain tasks, no robot can yet beat the human hand. Just last month, the good folks at PubNub posted an open-source project that uses a Leap Motion gesture controller and a Raspberry Pi to make a robot that can “follow” a distant user’s hands. The motion-tracking is pretty basic: the little robot appendages follow the whole hand from side to side or up and down. However, the Leap device can track fairly precise movements of all ten fingers at once — so it’s not much of a leap to imagine a project in which a human’s hands, waggling in empty space, could become the deft maneuverings of a full-fingered robotic counterpart on the other side of the planet.
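The core of a project like this is just a coordinate mapping: palm positions reported by the gesture controller are scaled into servo angles on the robot side. Here’s a minimal Python sketch of that idea — the function names, coordinate ranges, and thresholds are my own assumptions for illustration, not taken from the PubNub code:

```python
def map_range(value, in_min, in_max, out_min, out_max):
    """Linearly map a value from one range to another, clamping at the edges."""
    value = max(in_min, min(in_max, value))
    return out_min + (value - in_min) * (out_max - out_min) / (in_max - in_min)


def hand_to_servo_angles(palm_x, palm_y):
    """Convert a tracked palm position (in mm) into pan/tilt servo angles.

    The input ranges below are assumed: roughly -200..200 mm side to side
    and 50..350 mm of height above the sensor. A standard hobby servo
    sweeps 0..180 degrees.
    """
    pan = map_range(palm_x, -200, 200, 0, 180)   # side-to-side -> pan servo
    tilt = map_range(palm_y, 50, 350, 0, 180)    # up-and-down -> tilt servo
    return pan, tilt
```

On the Raspberry Pi end, the resulting angles would typically be converted to PWM duty cycles and written to the servo pins; the mapping itself is the part that generalizes to finer-grained, per-finger control.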
But perhaps the most mind-bending physical-digital-physical interactions are those that happen entirely through an abstract virtual world. The company Space-Time Insight is working on applying VR as a management tool for remotely connected industrial equipment. Earlier this year at the DistribuTECH conference, the company showed how an Oculus Rift VR headset could be a portal into a simulated industrial site: Instead of direct video feeds of the actual space and equipment, users would move through a computer-generated space in which each piece of machinery is represented according to its real-time diagnostic data and can be controlled from within the simulation. The idea is that virtual gauges, warning lights, levers and switches could be a more intuitive way to check that things are running smoothly than staring at graphs, spreadsheets and drop-down menus in a web browser.
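The translation layer in a system like this is conceptually simple: live diagnostic readings get mapped onto the states of virtual objects. A hedged Python sketch of that mapping — the thresholds, field names, and colour scheme here are invented for illustration, not Space-Time Insight’s actual implementation:

```python
def gauge_state(reading, warn_at, critical_at):
    """Map a live sensor reading onto a virtual warning-light colour."""
    if reading >= critical_at:
        return "red"
    if reading >= warn_at:
        return "amber"
    return "green"


def build_scene(telemetry):
    """Turn a machine's real-time telemetry into renderable gauge states.

    `telemetry` is a dict of sensor readings; the (warn, critical)
    thresholds below are assumed example values.
    """
    thresholds = {
        "temp_c": (80, 100),          # bearing temperature, Celsius
        "vibration_mm_s": (4.5, 7.1), # vibration velocity, mm/s
    }
    return {
        name: gauge_state(value, *thresholds[name])
        for name, value in telemetry.items()
        if name in thresholds
    }
```

A VR front end would then render each machine’s gauges and warning lights from this state dictionary, refreshing as new telemetry arrives — the same data a dashboard would chart, just embodied in a walkable space.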
With continuing improvements in connectivity, robotics, VR and the Internet of Things, the boundaries between the digital and physical realms are going to look thinner and thinner.
Related: Interview with PubNub’s Tom Greene