Non-parametric methods of disparity computation

  • Abstract

    Disparity is inversely proportional to depth. Information about depth is a key input to many real-time applications such as computer vision, medical diagnosis, and precision modelling. Disparity is therefore measured first in order to compute the depth required by these applications. Two broad approaches exist, viz., active and passive methods; owing to its cost effectiveness, the passive approach is the more popular. Its measurements, however, are degraded by occlusion, large numbers of objects, and low-texture areas, so effective and efficient stereo depth estimation algorithms have attracted considerable research attention. The principal goal of a stereo vision algorithm is the computation of a disparity map between two images of the same scene captured at the same time by two cameras. We have implemented the non-parametric stereo algorithms, viz., the Rank and Census transforms, on both a single processor and a multicore processor; the results show the multicore implementation to be about 1500 times more time efficient.
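    As a rough illustration of the non-parametric transforms the abstract names, the following is a minimal Python/NumPy sketch of the Rank and Census transforms together with a Hamming-distance disparity search. The function names, window sizes, and search range here are illustrative choices of ours, not details taken from the paper.

```python
import numpy as np

def rank_transform(img, window=3):
    # Rank transform: replace each pixel with the number of
    # neighbours in the window that are darker than it.
    h, w = img.shape
    r = window // 2
    out = np.zeros((h, w), dtype=np.uint8)
    for y in range(r, h - r):
        for x in range(r, w - r):
            out[y, x] = np.sum(img[y - r:y + r + 1, x - r:x + r + 1] < img[y, x])
    return out

def census_transform(img, window=5):
    # Census transform: encode, as a bit string, whether each
    # neighbour is darker than the centre pixel.  Like the rank
    # transform it depends only on local intensity ordering, which
    # is what makes it robust to illumination changes.
    h, w = img.shape
    r = window // 2
    out = np.zeros((h, w), dtype=np.uint32)
    for y in range(r, h - r):
        for x in range(r, w - r):
            code = 0
            for dy in range(-r, r + 1):
                for dx in range(-r, r + 1):
                    if dy == 0 and dx == 0:
                        continue
                    code = (code << 1) | (1 if img[y + dy, x + dx] < img[y, x] else 0)
            out[y, x] = code
    return out

def disparity_map(left, right, max_disp=8, window=5):
    # For every left-image pixel, slide along the same scan line of
    # the right image and pick the shift whose census codes differ
    # in the fewest bits (minimum Hamming distance).
    cl, cr = census_transform(left, window), census_transform(right, window)
    h, w = left.shape
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(h):
        for x in range(w):
            best_cost, best_d = None, 0
            for d in range(min(max_disp, x) + 1):
                cost = bin(int(cl[y, x]) ^ int(cr[y, x - d])).count("1")
                if best_cost is None or cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

    These loops are written for clarity only; practical implementations vectorise or parallelise them across cores, which is the direction of the speed-up the abstract reports.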

  • Keywords

    Stereo Image; Disparity; Depth; Non-Parametric.





Article ID: 10062
DOI: 10.14419/ijet.v7i2.6.10062

Copyright © 2012-2015 Science Publishing Corporation Inc. All rights reserved.