A journal of IEEE and CAA, publishing high-quality papers in English on original theoretical/experimental research and development in all areas of automation.

【Featured Article】Image processing algorithm allows indoor drones to fly autonomously

  • Date: 2020-08-19

A research team from Japan has developed a single-camera machine vision algorithm, making it possible for lightweight hovering indoor robots to guide themselves by identifying and interpreting reference points on a tiled floor. The technology opens the door to a new breed of functional, low-cost drones with potentially wide-ranging uses.


Paper Information

C. Premachandra, D. N. H. Thanh, T. Kimura and H. Kawanaka, "A Study on Hovering Control of Small Aerial Robot by Sensing Existing Floor Features," IEEE/CAA J. Autom. Sinica, vol. 7, no. 4, pp. 1016-1025, July 2020.





Since GPS signals are too weak to penetrate most structures, indoor drones must rely on environmental cues, which are typically visual. A drone designed for indoor use is likely to be smaller and lighter than an outdoor drone, according to Premachandra. “We considered different hardware options, including laser rangefinders,” he said. “But rangefinders are too heavy, and infrared and ultrasonic sensors suffer from low precision. So that led us to using a camera as the robot’s visual sensor. If you think of the camera in your cell phone, that gives you an idea of just how small and light they can be.”

Premachandra’s research team aimed to keep the guidance algorithm as simple as possible, allowing for a small, inexpensive microprocessor. The team used the Raspberry Pi 3, an open-source computing platform that weighs approximately 45 grams.

Their study prototype had a single downward-facing camera with intentionally low resolution – only 80 by 80 pixels. “Our robot only needed to distinguish its direction of motion and identify corners. From there, our algorithm allows it to extrapolate its position in the room, helping it avoid contacting the walls,” Premachandra said.

The team’s program worked by running each 80×80 image through a series of simple processing steps ending in a black-and-white grid, making it easier to quickly identify motion along the X and Y axes.
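The paper's exact pipeline is not reproduced here, but the threshold-then-compare idea can be sketched as follows. This is a minimal illustration, assuming NumPy; the function names, the threshold value, and the projection-matching approach are this sketch's own, not the authors':

```python
import numpy as np

def to_binary_grid(frame, threshold=128):
    # frame: 80x80 grayscale image (uint8). Thresholding yields a
    # black-and-white grid in which dark tile grout lines become 1s.
    return (frame < threshold).astype(np.uint8)

def estimate_shift(prev_grid, curr_grid, max_shift=5):
    # Project each binary grid onto its columns and rows, then find the
    # X and Y offsets that best align the two projections -- a crude
    # stand-in for the motion-detection step described in the article.
    def best_shift(a, b):
        best, best_score = 0, -1
        for s in range(-max_shift, max_shift + 1):
            score = int(np.sum(np.roll(a, s) == b))
            if score > best_score:
                best, best_score = s, score
        return best
    dx = best_shift(prev_grid.sum(axis=0), curr_grid.sum(axis=0))
    dy = best_shift(prev_grid.sum(axis=1), curr_grid.sum(axis=1))
    return dx, dy
```

On a floor with a regular tile pattern, the grout lines produce sharp peaks in the row and column projections, so even this simple alignment recovers small inter-frame displacements at 80×80 resolution.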

The guidance method is limited because it relies on a room with a tile floor and predictable patterns. Premachandra said next steps in research into lightweight, autonomous-hovering, indoor robots might include adapting the technology for infrared cameras so they could function in the dark, as well as adding a second camera so the robot could visually determine both its X,Y position and its altitude in the room.

“There are many potential applications,” Premachandra said. “Hovering indoor robots may be useful in warehouses, distribution centers, and industrial applications to remotely monitor safety.”

IEEE/CAA Journal of Automatica Sinica

  • JCR Impact Factor 2019: 5.129
    Rank: Top 17% (11/63) in Automation & Control Systems
    Quantile: Q1 (SCI)
    CiteScore 2019: 8.3
    Rank: Top 9% in Computer Science: Information Systems, Top 11% in Control and Systems Engineering, Top 12% in Artificial Intelligence
    Quantile: Q1