Comprehensive study and investigation of ROS for computer vision applications using Raspberry Pi

  • Authors

    • Jignesh Patoliya, Charotar University of Science & Technology
    • Hiren Mewada, Charotar University of Science & Technology
    2019-08-25
    https://doi.org/10.14419/ijet.v8i3.29694
  • Object tracking, Robot Operating System, Raspberry Pi
  • Machines in the form of robots have a very large community that has made impressive progress in recent years. Representative examples of such robots are land-based mobile robots, quadcopters, humanoids, etc. Motion tracking and object recognition are the basic processes in most robotic applications. For better flexibility and integration of robots with video processing applications, the ROS framework is widely used. The major issues with ROS are its latency and integrity. This paper investigates the integration of the ROS framework with OpenCV libraries on the Raspberry Pi processor for video processing applications. In the proposed experimental setup, a camera node interfaced with the Raspberry Pi captures images and publishes them as ROS messages on a specific topic. The subscriber node converts each ROS message back into an image using cv_bridge. The converted image is then processed using the OpenCV library on the Raspberry Pi board, and the extracted information can be used to actuate peripheral devices interfaced with the Raspberry Pi. An investigation of the Raspberry Pi based implementation reveals that ROS introduces 0.63% overhead, that an optimized implementation on the Raspberry Pi can avoid the need for a highly configured computer, and that the Raspberry Pi can process video at up to 13 frames per second.
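
    A minimal sketch of the subscriber side of this pipeline (ROS 1 with rospy, cv_bridge, and OpenCV) is given below. The topic name, node name, and the placeholder edge-detection step are illustrative assumptions, not the exact code used in the paper.

```python
#!/usr/bin/env python
# Sketch of the subscriber node described in the abstract: it receives ROS
# Image messages from the camera topic, converts them to OpenCV images with
# cv_bridge, and applies a simple OpenCV processing step.
# The topic "/camera/image_raw" and the node name are assumptions, not taken
# from the paper.

import rospy
import cv2
from cv_bridge import CvBridge, CvBridgeError
from sensor_msgs.msg import Image

bridge = CvBridge()

def image_callback(msg):
    try:
        # Convert the ROS Image message to a BGR OpenCV image.
        frame = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
    except CvBridgeError as err:
        rospy.logwarn("cv_bridge conversion failed: %s", err)
        return
    # Placeholder processing step: grayscale conversion and Canny edge detection.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    # The extracted information could drive peripherals interfaced with the
    # Raspberry Pi (e.g. GPIO-controlled actuators).
    rospy.loginfo("Processed frame with %d edge pixels", int((edges > 0).sum()))

if __name__ == "__main__":
    rospy.init_node("pi_vision_subscriber")
    rospy.Subscriber("/camera/image_raw", Image, image_callback, queue_size=1)
    rospy.spin()
```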

  • References

      [1] Kramer, J., Scheutz, M., "Development environments for autonomous mobile robots: A survey", Autonomous Robots, Vol.22, No.2, (2007), pp.101-132.

      [2] Weintrop, D., Afzal, A., Salac, J., Francis, P., Li, B., Shepherd, D. C., Franklin, D., "Evaluating CoBlox: A comparative study of robotics programming environments for adult novices", In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, (2018), pp.366-377.

      [3] Mobile Robot Programming Toolkit available at https://www.mrpt.org/

      [4] Microsoft Robotics Developer Studio 4 available at www.microsoft.com/en-in/download/details.aspx?id=29081

      [5] S. Carpin, M. Lewis, J. Wang, S. Balakirsky and C. Scrapper, "USARSim: a robot simulator for research and education", Proceedings 2007 IEEE International Conference on Robotics and Automation, Roma, 2007, pp.1400-1405.

      [6] SARGE, available at http://sarge.sourceforge.net/page11/page11.html

      [7] Robot operating system available at http://www.ros.org/about-ros/

      [8] ROS conceptual flow, available at http://gtms1318.wordpress.com/

      [9] Mishra, R., Javed, A., "ROS based service robot platform", In 2018 4th International Conference on Control, Automation and Robotics (ICCAR), (2018), pp.55-59.

      [10] Simoens, Pieter, Mauro Dragone, and Alessandro Saffiotti, "The Internet of Robotic Things: A review of concept, added value and applications", International Journal of Advanced Robotic Systems, Vol.15, No.1, (2018), 1729881418759424.

      [11] Greenberg, Rebecca A., "Investigating the feasibility of conducting human tracking and following in an indoor environment using a Microsoft Kinect and the Robot Operating System", Naval Postgraduate School, Monterey, United States, (2017).

      [12] Goldhoorn, A., Garrell, A., Alquézar, R., Sanfeliu, A., "Searching and tracking people with cooperative mobile robots", Autonomous Robots, Vol.42, No.4, (2018), pp.739-759.

      [13] Fang, F., Qian, K., Zhou, B., Ma, X., "Research and Implementation of Person Tracking Method Based on Multi-feature Fusion", In International Conference on Intelligent Robotics and Applications, (2017), pp.141-153, Springer, Cham.

      [14] Bisi, S., De Luca, L., Shrestha, B., Yang, Z. and Gandhi, V., "Development of an EMG-Controlled Mobile Robot", Robotics, Vol.7, No.3, (2018), pp.36-42.

      [15] Krishna, B. V. Santhosh, J. Oviya, S. Gowri, and M. Varshini, "Cloud robotics in industry using Raspberry Pi", In 2016 Second International Conference on Science Technology Engineering and Management (ICONSTEM), (2016), pp.543-547.

      [16] Gautham Ponnu and Jacob George, "Real-time ROSberryPi SLAM Robot", Master of Engineering thesis, School of Electrical and Computer Engineering, Cornell University, (2016).

      [17] Peel, H., Luo, S., Cohn, A. and Fuentes, R., "An improved robot for bridge inspection", In Proceedings of the 34th ISARC, (2017), pp.663-670.

      [18] Lentin Joseph, "Mastering ROS for Robotics Programming", Packt Publishing, 2015, chapter 8.

      [19] Adrian Rosebrock, "Pedestrian Detection OpenCV code", https://www.pyimagesearch.com/2015/11/09/pedestrian-detection-OpenCV/, 2015.

      [20] James Bowman, "Face detection code", available at http://docs.ros.org/jade/api/OpenCV_tests/html/rosfacedetect_8py_source.html

      [21] "Ball tracking code", available at http://www.robot-home.it/blog/en/software/ball-tracker-con-filtro-di-kalman/

      [22] "Face recognition using Local Binary Pattern", available at https://www.pyimagesearch.com/2018/06/25/raspberry-pi-face-recognition/

      [23] Campmany, V., Silva, S., Espinosa, A., Moure, J. C., Vázquez, D., and López, A. M., "GPU-based pedestrian detection for autonomous driving", Procedia Computer Science, Vol.80, (2016), pp.2377-2381.

  • How to Cite

    Patoliya, J., & Mewada, H. (2019). Comprehensive study and investigation of ROS for computer vision applications using Raspberry Pi. International Journal of Engineering & Technology, 8(3), 261-269. https://doi.org/10.14419/ijet.v8i3.29694