Researchers from the Robotics, Computer Vision and Artificial Intelligence (RoPeRT) group at the Aragon Institute for Engineering Research (I3A), together with researchers from Stanford University, have created CineMPC, a cinematographic system that takes drone filming to new heights. Beyond controlling the position and orientation of the drone, it also controls the intrinsic parameters of the camera - focal length, focus distance and aperture - opening the door to a wide range of applications and making it easier for directors (or amateur users) to incorporate the drone into their toolkit.
Until now, cinematic drone solutions have controlled only how the drone is positioned relative to the subject, for example filming from three metres away, at a 45° angle or from the right side. The system created by this group of researchers is the first to also control the camera's intrinsic parameters autonomously.
"This system not only follows directions to move, but also adjusts focus, depth of field and zoom to capture images according to different artistic preferences," says Pablo Pueyo, a researcher at I3A. He also points out how CineMPC re-optimises the trajectory at every step in response to changing scene conditions in real time. "The system's perception tracks multiple targets, even in challenging situations, such as fast movements or difficult lighting conditions."
Pablo Pueyo acknowledges that "in robotics there are a large number of solutions that are created and are functional in simulation. The real challenge is to implement these solutions on real robots, with all the problems that these systems, particularly in the field of drones, can entail." The researcher adds that simulation experiments "let you take recordings in more challenging environments; on the other hand, by implementing the solution on a real robotic platform, we have demonstrated that it is capable of running on a drone and is therefore viable".
Recreating iconic film footage is one of the applications that, as Pablo Pueyo explains, CineMPC can bring to content creators and filmmakers: "An example we always mention is the Dolly Zoom or Vertigo Effect, introduced by Alfred Hitchcock in the 1958 film 'Vertigo'. This effect is achieved by controlling the intrinsics of the camera - specifically the focal length - while the drone moves. Without this control, the effect is not reproducible on a cinematic drone platform." The system can also imitate existing footage by extracting its camera parameters and applying them to a new recording.
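The relation behind the effect is the pinhole camera model: a subject's size in the image is roughly proportional to focal length divided by distance. Keeping that ratio constant while the drone pulls back is precisely the dolly zoom. A minimal sketch, with illustrative numbers:

```python
def dolly_zoom_focal_length(f0_mm, d0_m, d_m):
    """Focal length that keeps the subject's image size constant (pinhole:
    image size ~ focal_length * subject_size / distance) as distance changes."""
    return f0_mm * (d_m / d0_m)

# Starting 2 m from the subject with a 24 mm lens and pulling back:
for d in (2.0, 4.0, 6.0, 8.0):
    f = dolly_zoom_focal_length(24.0, 2.0, d)
    print(f"distance {d:.1f} m -> focal length {f:.0f} mm")
```

Doubling the distance requires doubling the focal length, so the subject stays the same size while the background appears to stretch away.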
Beyond cinema
The researchers are also considering using this control of the camera internals in fields beyond cinema, for example animal monitoring in natural areas. "It is difficult to use drones for wildlife monitoring from close-up views. Drones are usually very noisy, so if the drone gets too close to the animals, it scares them. Thanks to the camera's automatic zoom control, we can record them from far away, obtaining close-up points of view without having to get physically close to them," says Pablo Pueyo.
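The same pinhole relation tells you how long a lens is needed for a given standoff distance. A small illustrative calculation, assuming a full-frame sensor and made-up subject sizes:

```python
def standoff_focal_length_mm(subject_h_m, distance_m, frame_fraction,
                             sensor_h_mm=24.0):
    """Pinhole estimate of the focal length needed for a subject of height
    subject_h_m, filmed from distance_m away, to fill frame_fraction of a
    sensor sensor_h_mm tall (24 mm = full frame). Figures are illustrative."""
    return frame_fraction * sensor_h_mm * distance_m / subject_h_m

# A 1.2 m animal filling half the frame, filmed from increasing standoffs:
for d in (10, 30, 60):
    f = standoff_focal_length_mm(1.2, d, 0.5)
    print(f"from {d} m away -> {f:.0f} mm lens")
```

At 30 m this already calls for a 300 mm equivalent lens, which is why automatic zoom control matters when the drone must keep its distance.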
Looking to the future, the main challenge the researchers face is to democratise the use of these drones through more user-friendly interfaces. In fact, one of their latest projects [1] adds an interface to the platform that can extract styles and characteristics from a video provided by the user and send the corresponding instructions to the drone. "If you are watching a film and you like a certain cut, you can enter that recording into the system, which extracts the relevant information and sends the instructions to the drone, which moves and controls itself to recreate these effects," describes Pablo Pueyo.
Another challenge on the researchers' list is using several drones at the same time, filming a person from different perspectives while the drones coordinate with each other to decide automatically which shot is best. "That would be an extension to multi-robot that we are considering."
AirSim and ROS (Robot Operating System) support
The CineMPC code can be run on the AirSim simulator, offering a photorealistic environment in which to test and refine filming techniques before taking them into the real world. The researcher highlights the flexibility this provides: "the simulation allows you to try more things, and in scenarios that are more difficult to achieve with a real drone".
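As an idea of what working against AirSim looks like, here is a minimal sketch using AirSim's public Python API. The flight commands and the camera name are generic examples, not CineMPC's actual pipeline:

```python
import airsim

# Connect to a running AirSim instance and take API control of the drone.
client = airsim.MultirotorClient()
client.confirmConnection()
client.enableApiControl(True)
client.armDisarm(True)
client.takeoffAsync().join()

# Fly to a vantage point (NED coordinates in metres, at 2 m/s).
client.moveToPositionAsync(10, 0, -5, 2).join()

# Grab one photorealistic frame from the front camera ("0" by default).
responses = client.simGetImages([
    airsim.ImageRequest("0", airsim.ImageType.Scene, False, False)
])
frame = responses[0]
print(f"captured {frame.width}x{frame.height} image, "
      f"{len(frame.image_data_uint8)} bytes")
```

This kind of loop, fly, capture, evaluate, is what lets filming strategies be rehearsed photorealistically before a real drone ever takes off.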
As for ROS, the middleware that underpins most robotics software, Pablo Pueyo explains: "most of the robots we see, at least in research, use ROS. The fact that CineMPC is compatible with it makes the system more versatile, as it doesn't have to be adapted to a particular platform or drone".
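To illustrate the kind of decoupling ROS provides, here is a hypothetical bridge node sketched with rospy. The topic name and message layout are invented for the example and are not CineMPC's real interface:

```python
#!/usr/bin/env python
import rospy
from std_msgs.msg import Float64MultiArray

# Hypothetical bridge node: listens for intrinsics commands from a controller
# and would forward them to whatever camera driver the platform uses.

def on_intrinsics(msg):
    # Assumed layout: [focal length (mm), focus distance (m), aperture (f-stop)]
    focal_mm, focus_m, aperture = msg.data
    rospy.loginfo("set intrinsics: f=%.1f mm, focus=%.1f m, f/%.1f",
                  focal_mm, focus_m, aperture)
    # ... call the platform-specific camera driver here ...

if __name__ == "__main__":
    rospy.init_node("camera_intrinsics_bridge")
    rospy.Subscriber("/cinempc/intrinsics_cmd", Float64MultiArray, on_intrinsics)
    rospy.spin()
```

Because the controller only publishes messages on a topic, swapping in a different drone or camera means replacing this one bridge node, not the controller itself.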
The cinematographic system was developed by Pablo Pueyo together with Juan Dendarieta, Eduardo Montijano, Ana Cristina Murillo and Mac Schwager.
[1] Article: CineTransfer: Controlling a Robot to Imitate Cinematographic Style from a Single Example. https://ieeexplore.ieee.org/abstract/document/10342280
Article in Tech Xplore: A fully autonomous drone system for cinematography and wildlife monitoring: https://techxplore.com/news/2024-01-fully-autonomous-drone-cinematography-wildlife.html
Video with experiments: CineMPC: Experiments with real drones and simulations
Photo: Eduardo Montijano, Pablo Pueyo and Juan Dendarieta.