
Edge detection based mobile robot indoor localization

© 2019 M. P. Abramov, O. S. Shipitko, A. S. Lukoyanov, E. I. Panfilova, I. A. Kunina, A. S. Grigoryev

Institute for Information Transmission Problems “Kharkevich Institute” RAS, Moscow, Russia
Institute for Systems Analysis, Federal Research Center “Computer Science and Control” of the Russian Academy of Sciences, Moscow, Russia
Moscow Institute of Physics and Technology (State University), Dolgoprudny, Russia

Received 17 Sep 2018

In this paper, we present a precise indoor positioning system for mobile robot pose estimation based on visual edge detection. A set of onboard motion sensors (a wheel speed sensor and a yaw rate sensor) is used for pose prediction. A schematic plan of the building, stored as a multichannel raster image, serves as prior information. The pose likelihood is estimated by matching the edges detected in the optical image against the map. The proposed method therefore requires no deliberate changes to the building infrastructure and exploits features inherent to man-made structures: the edges between walls and floor. A particle filter is applied to integrate the heterogeneous localization data (motion sensor readings and detected visual features). Since the particle filter relies on probabilistic sensor models for state estimation, precise modeling of the measurement noise is key to improving positioning quality. A probabilistic noise model of the edge detector, combining geometric detection noise and false-positive edge detection noise, is proposed in this work. The developed localization system was experimentally evaluated on a car-like mobile robot in a challenging environment. Experimental results demonstrate that the proposed system estimates the robot pose with a mean error not exceeding 0.1 m in each of 100 test runs.
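To make the described pipeline concrete, below is a minimal illustrative sketch (Python/NumPy), not the authors' implementation, of a particle filter that fuses wheel-speed and yaw-rate odometry with an edge-based likelihood computed against a rasterized floor-plan edge map. All names and parameters here (EdgeParticleFilter, sigma_d, p_false, the noise magnitudes) are assumptions chosen for readability; the paper's actual sensor noise model is more detailed.

```python
import numpy as np

class EdgeParticleFilter:
    """Illustrative sketch only: particle filter with an edge-matching likelihood."""

    def __init__(self, n_particles, edge_map, resolution, init_pose):
        # edge_map: 2D boolean raster, True where the floor plan contains an edge
        # resolution: meters per map cell; init_pose: (x, y, theta) initial guess
        self.edge_map = edge_map
        self.resolution = resolution
        self.particles = np.tile(np.asarray(init_pose, float), (n_particles, 1))
        self.particles += np.random.normal(0.0, [0.1, 0.1, 0.05], self.particles.shape)
        self.weights = np.full(n_particles, 1.0 / n_particles)

    def predict(self, v, omega, dt, sigma_v=0.05, sigma_w=0.02):
        # Propagate each particle with a unicycle motion model plus Gaussian noise
        n = len(self.particles)
        v_s = v + np.random.normal(0.0, sigma_v, n)
        w_s = omega + np.random.normal(0.0, sigma_w, n)
        self.particles[:, 0] += v_s * dt * np.cos(self.particles[:, 2])
        self.particles[:, 1] += v_s * dt * np.sin(self.particles[:, 2])
        self.particles[:, 2] += w_s * dt

    def update(self, edge_points, sigma_d=0.1, p_false=0.2):
        # edge_points: Nx2 detected edge points in the robot frame (meters).
        # Likelihood mixes a Gaussian on the distance to the nearest map edge
        # with a constant term standing in for false-positive detections.
        map_edges_xy = np.argwhere(self.edge_map)[:, ::-1] * self.resolution
        for i, (x, y, th) in enumerate(self.particles):
            rot = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
            pts_world = edge_points @ rot.T + np.array([x, y])
            # Brute-force distance from each detected point to the closest map edge cell
            d = np.min(np.linalg.norm(map_edges_xy[None] - pts_world[:, None], axis=2), axis=1)
            like = (1.0 - p_false) * np.exp(-0.5 * (d / sigma_d) ** 2) + p_false
            self.weights[i] *= np.prod(like)
        self.weights += 1e-300
        self.weights /= self.weights.sum()
        self.resample()

    def resample(self):
        idx = np.random.choice(len(self.particles), len(self.particles), p=self.weights)
        self.particles = self.particles[idx]
        self.weights.fill(1.0 / len(self.particles))

    def estimate(self):
        # Weighted mean pose (circular mean for the heading)
        x, y = np.average(self.particles[:, :2], weights=self.weights, axis=0)
        th = np.arctan2(np.average(np.sin(self.particles[:, 2]), weights=self.weights),
                        np.average(np.cos(self.particles[:, 2]), weights=self.weights))
        return np.array([x, y, th])
```

In practice the brute-force nearest-edge search would be replaced by a precomputed distance transform of the map, and the mixture weight p_false and spread sigma_d would come from a calibrated edge-detector noise model rather than fixed constants.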

Key words: indoor localization, positioning system, edge detection, noise model, sensor model, particle filter, mobile robot

DOI: 10.1134/S0235009219010025

Cite: Abramov M. P., Shipitko O. S., Lukoyanov A. S., Panfilova E. I., Kunina I. A., Grigoryev A. S. Sistema pozitsionirovaniya vnutri zdanii mobilnoi robototekhnicheskoi platformy na osnove detektsii kraev [Edge detection based mobile robot indoor localization]. Sensornye sistemy [Sensory systems]. 2019. V. 33(1). P. 30-43 (in Russian). doi: 10.1134/S0235009219010025
