News

App creates time-lapse videos with a smartphone

02.12.2022 - The freely available app has three capture modes that cover a range of scenarios.

An app developed by Cornell researchers uses augmented reality to help users repeatedly capture images from the same location with a phone or tablet to make time-lapse videos – without leaving a camera on site. Time-lapse photography, which involves combining photos taken over long periods of time, provides a powerful way to visualize phenomena such as changing seasons or the movement of the sun. Traditionally, photographers would leave a camera on a tripod for the duration of the event, but researchers working with Abe Davis in the Cornell Ann S. Bowers College of Computing and Information Science have developed a more convenient method. Their iOS app, ReCapture, is now freely available in the Apple App Store.

The app has three capture modes that cover a range of scenarios. One works best for landscapes, one helps capture close-up scenes and a third collects a range of images that can be used to reconstruct the scene in 3D offline. Each capture mode uses different information about the scene. The simplest mode uses an overlay of previous shots to help the user line up new photos. For close-up scenes, which tend to be more difficult to capture, the application tries to figure out where the camera is in 3D space and uses arrows to tell the user how to move and tilt their phone toward the correct location.
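The close-up guidance can be pictured as a comparison between the camera pose saved for the reference shot and the pose reported for the live frame. The sketch below illustrates that idea in Swift; it is an assumption-laden illustration, not ReCapture's actual code: the PoseGuidance type, the thresholds and the hint wording are invented here, and on iOS the camera transforms would come from ARKit's ARFrame.camera.transform.

```swift
import simd

// Rough sketch of the pose-comparison idea behind the close-up guidance,
// NOT the ReCapture implementation. Both poses are plain simd_float4x4
// values here, so the snippet compiles with the simd module alone; in an
// AR session the current transform would come from ARFrame.camera.transform.
struct PoseGuidance {
    /// Camera transform saved when the reference photo was taken.
    let referenceTransform: simd_float4x4

    /// Residual translation (metres) and rotation (radians) between the
    /// current camera pose and the saved reference pose.
    func offset(from currentTransform: simd_float4x4) -> (translation: SIMD3<Float>, rotation: Float) {
        // Relative transform of the current pose expressed in the reference frame.
        let delta = referenceTransform.inverse * currentTransform
        let t = SIMD3<Float>(delta.columns.3.x, delta.columns.3.y, delta.columns.3.z)
        let angle = simd_quatf(delta).angle   // magnitude of the leftover rotation
        return (t, angle)
    }

    /// Turns the residual offset into a coarse textual hint, standing in for
    /// the arrows the app draws on screen. Thresholds and axis conventions
    /// are simplified for illustration only.
    func hint(for currentTransform: simd_float4x4) -> String {
        let (t, angle) = offset(from: currentTransform)
        if simd_length(t) < 0.05 && angle < 0.05 {
            return "Hold still - pose matched"
        }
        let horizontal = t.x > 0 ? "move left" : "move right"
        let vertical = t.y > 0 ? "move down" : "move up"
        return "\(horizontal), \(vertical), rotate by about \(Int(angle * 180 / .pi)) degrees"
    }
}
```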

Abe Davis had been envisioning a project that would help field researchers repeatedly find and re-photograph precise locations at their field sites to track any changes. Together with Ruyu Yan he came up with the idea of “geocaching with pictures,” which ultimately evolved into ReCapture. “Geocaching may be something that people are doing for fun, but if you’re a scientist and you're doing field work, then there’s a similar kind of problem at play,” Davis said.

Yan said the hardest part was developing the app interface to guide users through the process, because “what works intuitively for me may not work intuitively for others.” She sought feedback from 20 beta testers and also worked with the XR Collaboratory at Cornell Tech, which advises researchers on augmented reality, virtual reality and mixed reality applications. Additionally, she had to figure out how to manage the mountains of data associated with the photos. “The app used to crash a lot,” she said. This was a problem because if the app was too slow or crashed constantly, people wouldn’t collect enough footage, leading to jerky, poor-quality videos.

In future versions of the app, Davis thinks the team may be able to smooth out gaps and abrupt transitions in the footage using recent machine learning techniques, which would yield higher-quality videos. Besides making GIFs and videos, the app may also have valuable scientific applications, as Davis had envisioned. The team has shared the app with field researchers in other departments at Cornell, and colleagues in the School of Integrative Plant Science in the College of Agriculture and Life Sciences have already begun using it to collect data. (Source: Cornell U.)

Reference: R. Yan et al.: ReCapture: AR-Guided Time-lapse Photography, UIST '22: Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology, 36, 1 (2022), DOI: 10.1145/3526113.3545641

Link: Project ReCapture, Dept. of Computer Science, Cornell University, Ithaca, USA

Digital tools and software can ease your life as a photonics professional, whether in system design, during manufacturing, or when purchasing components. Check out our compilation:

Proceed to our dossier
