DETECTION OF OBJECTS ON THE IMAGES OBTAINED FROM THE SPATIAL RANGEFINDER

Abdulkadhim Hussein Abdulameer,
University of Diyala – College of Engineering, Diyala, Iraq

DOI: 10.36724/2664-066X-2023-9-3-9-13

SYNCHROINFO JOURNAL. Volume 9, Number 3 (2023). P. 9-13.

Abstract

Over the past few years, entire families of new recording devices have emerged that allow three-dimensional object tracking, localization, and surface reconstruction. This has led to a growing variety of practical applications. The most typical such application to date is the construction of digital three-dimensional models of the environment using a variety of sensor devices such as ultrasonic, laser, and infrared rangefinders. The analysis performed showed that the most effective procedure for identifying objects in spatial-rangefinder images is one based on partial line-by-line histograms of the image. The essence of this method is to compute, for each line of the image, the brightness values and the number of occurrences of each value across the entire line. The procedure also rests on the results of visual analysis of images obtained from a spatial rangefinder; this analysis shows that the boundary of an object is usually quite clearly distinguished in its upper part, which conventionally "rises" above the horizon. Detecting objects in such an image is a difficult task because of the proximity of the color values of neighboring pixels, and no single method guarantees absolutely precise results when processing the image sequence. In this paper, it is proposed to use the histogram-analysis method to detect objects in a sequence of images obtained with the help of spatial rangefinders.
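The line-by-line histogram procedure described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the image is assumed to be a grid of integer brightness values, the background brightness is assumed known, and the function and parameter names (`detect_object_rows`, `bg_value`, `min_count`) are illustrative.

```python
def row_histograms(image, n_bins=256):
    # image: list of rows, each a list of integer brightness values in [0, n_bins).
    # For every image line, count how many times each brightness value occurs.
    hists = []
    for row in image:
        h = [0] * n_bins
        for v in row:
            h[v] += 1
        hists.append(h)
    return hists

def detect_object_rows(image, bg_value=0, min_count=5, n_bins=256):
    # Flag rows likely to contain an object "rising" above the horizon:
    # a row is flagged when it holds more than min_count pixels whose
    # brightness differs from the assumed background value.
    flagged = []
    for r, h in enumerate(row_histograms(image, n_bins)):
        non_background = sum(h) - h[bg_value]
        if non_background > min_count:
            flagged.append(r)
    return flagged
```

On a dark (all-zero) image with a bright rectangular object occupying a few rows, only those rows are flagged, which is how the partial histograms localize the upper boundary of the object.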

Keywords: Spatial Range Finder, Histogram Analysis Method, Object Detection
