A group of researchers at New York University has launched AnySense, an iOS app for gathering multisensory training data for robotics models.
The app is part of the Robot Utility Models (RUM) project, which aims to train robot policies that generalize to new, unseen environments.
AnySense integrates the iPhone's built-in sensors with external multisensory inputs, interfaces with the AnySkin tactile sensor, and is fully open source and available to the robotics community.
The app is now available for download from the Apple App Store.