A journal of IEEE and CAA, publishing high-quality papers in English on original theoretical/experimental research and development in all areas of automation

‘Data clouds fusion’ helps robots work as a team in hazardous situations

Date: 2020-05-26

A group of researchers and engineers has created a new way for robots to pool data gathered in real time, allowing them to ‘think’ collectively and navigate their way through difficult, previously unmapped obstacles as a team.

Paper Information

M. Ivanov, O. Sergyienko, V. Tyrsa, L. Lindner, W.  Flores-Fuentes, J. C.  Rodríguez-Quiñonez, W. Hernandez and P. Mercorelli, "Influence of Data Clouds Fusion From 3D Real-Time Vision System on Robotic Group Dead Reckoning in Unknown Terrain," IEEE/CAA J. Autom. Sinica, vol. 7, no. 2, pp. 368-385, Mar. 2020.



Not long ago, the concept of artificially intelligent robots mimicking animal herd-like intuition and cooperation to accomplish a mutual task was limited to the realm of science fiction and our imaginations. But a recent joint effort among international researchers demonstrated that robots working together can navigate unknown terrain faster than they would as individuals.

At the heart of the researchers’ new group robotic navigation system is a centralized data cloud. Each cloud-linked robot draws on data gathered in real time from all of the other robots, applying logic algorithms to help them steer clear of paths more likely to contain obstacles.
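The idea of a centralized obstacle store that every robot both feeds and reads can be sketched in a few lines. This is an illustrative toy, not the paper's implementation: the `DataCloud` class, the grid-cell representation, and `best_heading` are all assumed names, and the real system fuses richer 3D point-cloud data.

```python
# Hypothetical sketch of a shared "data cloud": each robot reports obstacle
# cells it has sensed, and every robot plans against the pooled map.
# All names here are illustrative, not taken from the paper.

class DataCloud:
    """Central store of obstacle positions reported by all robots."""
    def __init__(self):
        self.obstacles = set()          # (x, y) grid cells flagged as blocked

    def report(self, cells):
        self.obstacles.update(cells)    # fuse one robot's observations

    def is_blocked(self, cell):
        return cell in self.obstacles

def best_heading(cloud, position, candidates):
    """Pick the first candidate move whose target cell is not a known obstacle."""
    x, y = position
    for dx, dy in candidates:
        if not cloud.is_blocked((x + dx, y + dy)):
            return (dx, dy)
    return None                         # all candidate moves are blocked

cloud = DataCloud()
cloud.report({(1, 0)})                  # robot A saw an obstacle ahead
# Robot B, which never sensed cell (1, 0) itself, still steers around it:
move = best_heading(cloud, (0, 0), [(1, 0), (0, 1)])
```

The point of the sketch is the asymmetry in the last line: robot B avoids an obstacle it never observed, because robot A's observation already lives in the shared store.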

The group took a three-pronged approach to building out the decision-making capacity of their group of robots. “We presented a solution of combined dead reckoning, data transferring, and machine vision, based on our research group's original laser-based real-time technical vision system,” said Mykhailo Ivanov, an engineer and one of the lead authors of the study at Universidad Autónoma de Baja California in Mexico. Previous efforts, he added, focused on each problem separately.
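Dead reckoning, the first prong mentioned above, means integrating wheel motion into a running pose estimate. A minimal textbook version for a differential-drive robot might look like the following; the variable names and the `wheel_base` parameter are assumptions, not the paper's exact formulation.

```python
import math

# Minimal dead-reckoning update for a differential-drive robot: integrate
# left/right wheel displacements into a pose (x, y, theta). A standard
# textbook sketch, not the paper's implementation.

def dead_reckon(pose, d_left, d_right, wheel_base):
    """Update (x, y, theta) from wheel displacements, all lengths in metres."""
    x, y, theta = pose
    d_center = (d_left + d_right) / 2.0        # forward travel of the body
    d_theta = (d_right - d_left) / wheel_base  # change in heading (radians)
    # Integrate at the midpoint heading for a slightly better small-arc estimate
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return (x, y, theta + d_theta)

pose = (0.0, 0.0, 0.0)
pose = dead_reckon(pose, 0.10, 0.10, 0.30)     # straight: both wheels move 10 cm
```

Because wheel slip makes such estimates drift, combining dead reckoning with vision and shared data, as the study does, is what keeps the group's positions usable over time.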

The study deployed simple four-wheeled robots onto obstacle courses designed to have unique blind spots for each unit. The robots’ ‘eyes’ were based on a simple laser and a pair of sensors that evaluated the reflected light to estimate position and distance. The team chose laser vision rather than digital video cameras because the laser system can operate in total darkness and is less expensive.
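Recovering distance from a laser spot seen by offset sensors rests on triangulation: with a known baseline between emitter and receiver and the two measured angles to the lit point, the law of sines gives the range. The formula below is the generic triangulation relation; the scanner geometry in the paper's technical vision system may differ in detail.

```python
import math

# Generic laser-triangulation sketch: distance from the baseline to the
# laser spot, given the two base angles of the emitter-spot-receiver
# triangle. Standard geometry, not the paper's exact scanner model.

def triangulate_distance(baseline, angle_emit, angle_recv):
    """Height of the triangle over the baseline, angles in radians."""
    return (baseline * math.sin(angle_emit) * math.sin(angle_recv)
            / math.sin(angle_emit + angle_recv))

# Example: 0.5 m baseline with both base angles at 60 degrees forms an
# equilateral triangle, so the distance is the triangle's height.
d = triangulate_distance(0.5, math.radians(60), math.radians(60))
```

Nothing in this computation needs ambient light, which is consistent with the article's point that the laser system works in total darkness.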

The ability to move through difficult terrain may be helpful in a variety of settings where there is a need to collect data, yet the environment is too small or too dangerous for humans. “Here we have earthquakes very frequently,” Ivanov said of California. “So such robotic groups, equipped with our technical vision system, could monitor buildings’ structural integrity as well as speed rescue efforts following a catastrophic event.”

The next step in this research, Mr. Ivanov said, will be to improve upon the robotic vision, which would make their cloud-based navigation teamwork system potentially useful across a wider array of industries and applications.

IEEE/CAA Journal of Automatica Sinica

  • JCR Impact Factor 2019: 5.129
    Rank: Top 17% (11/63), Category of Automation & Control Systems
    Quantile: Q1 (SCI)
  • CiteScore 2019: 8.3
    Rank: Top 9% (Category of Computer Science: Information Systems), Top 11% (Category of Control and Systems Engineering), Top 12% (Category of Artificial Intelligence)
    Quantile: Q1