Abstract

A specifically designed imaging system based on an acousto-optic tunable filter (AOTF) can integrate hyperspectral imaging and 3D reconstruction. Because of its complicated optical structure, the AOTF imaging system deviates from the traditional pinhole model and lens-distortion form, making precise camera calibration difficult. The factors responsible for this deviation are discussed, and a multiplane model (MPM) is proposed that uses phase fringes to produce dense mark points and a back-propagation neural network to achieve subpixel calibration. Experiments show that, compared with the pinhole model, MPM efficiently reduces the back-projection error. A 3D reconstruction based on the calibration result verifies the feasibility of the proposed method.
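The back-projection error used here as the accuracy metric can be sketched for the ideal pinhole case. This is an illustrative sketch only; the function names and the single-focal-length simplification are assumptions, not the paper's implementation.

```python
import numpy as np

def project_pinhole(points_cam, f):
    """Ideal pinhole projection: x = f*Xc/Zc, y = f*Yc/Zc (camera-frame points, N x 3)."""
    Xc, Yc, Zc = points_cam[:, 0], points_cam[:, 1], points_cam[:, 2]
    return np.stack([f * Xc / Zc, f * Yc / Zc], axis=1)

def back_projection_error(observed_xy, points_cam, f):
    """RMS distance between observed image points and their pinhole projections."""
    residual = observed_xy - project_pinhole(points_cam, f)
    return float(np.sqrt((residual ** 2).sum(axis=1).mean()))

# Two synthetic camera-frame points; observations taken from the model itself,
# so the back-projection error is exactly zero.
pts = np.array([[1.0, 2.0, 10.0], [0.5, -1.0, 5.0]])
obs = project_pinhole(pts, f=5.0)
print(back_projection_error(obs, pts, 5.0))  # 0.0 for an error-free model
```

A real AOTF system deviates from this model, which is why the paper replaces it with the multiplane model.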

© 2017 Optical Society of America





Figures (21)

Fig. 1 Photo (a) and schematic (b) of the AOTF-based 3D measurement system.
Fig. 2 Schematic (a) and mathematical expression (b) of the pinhole model.
Fig. 3 Simulation result of the traditional image distortion.
Fig. 4 Sketch map of the AOTF imaging system.
Fig. 5 (a) Beam of converging light rays. (b) After reflection by the planar mirror, the rays still intersect at one point and maintain axial symmetry. (c) After reflection by the parabolic mirror, the axial symmetry is destroyed and the rays no longer intersect at one point.
Fig. 6 Asymmetric refraction in the wedge.
Fig. 7 Simulation result of the image distortion.
Fig. 8 Multiplane imaging model.
Fig. 9 (a) Displayed fringe patterns in the vertical direction with three fringe pitches of 16, 17, and 18 pixels; (b) displayed fringe patterns in the horizontal direction with the three fringe pitches; (c) fringe images in the vertical direction acquired by the AOTF imaging system with the three fringe pitches; (d) fringe images in the horizontal direction acquired by the AOTF imaging system with the three fringe pitches; (e) result of the absolute phase calculation in the horizontal and vertical directions.
Fig. 10 Lookup table between the pixel coordinate and the world coordinate.
Fig. 11 Structure of the BP neural network.
Fig. 12 Linear equation of the space line acquired with the mapping set Γ.
Fig. 13 Laser planes produced by the laser and optical scanner.
Fig. 14 Calibration for the laser plane Km.
Fig. 15 Calibration result of the AOTF imaging system (a) and Basler camera (b) under the pinhole model.
Fig. 16 Calibration result of the AOTF imaging system (a) and Basler camera (b) under MPM.
Fig. 17 3D structure measurement with (a) a plane and (b) a seedling.
Fig. 18 Residuals of the plane fitting using the pinhole model (a) and MPM (b).
Fig. 19 The standard stepped sample (a) and the calculation of the distance between the working surfaces (b).
Fig. 20 3D point cloud of the two working surfaces using the pinhole model (a) and MPM (b).
Fig. 21 Photo (a) and 3D point cloud after triangulation (b) of a maize seedling.

Tables (2)

Table 1 Back projection errors of the calibration
Table 2 Numerical performance of the BP neural network

Equations (14)


\[ \begin{cases} x = f X_c / Z_c \\ y = f Y_c / Z_c \end{cases} \tag{1} \]

\[ \frac{x - x_0}{\sum_{i=0}^{n} \lambda_i X_i} = \frac{y - y_0}{\sum_{i=0}^{n} \lambda_i Y_i} = \frac{z - z_0}{\sum_{i=0}^{n} \lambda_i Z_i}, \tag{2} \]

\[ \frac{x - x_0}{X_k} = \frac{y - y_0}{Y_k} = \frac{z - z_0}{Z_k}, \tag{3} \]

\[ \begin{aligned} \delta_x(x,y) &= \delta_{x1} + \delta_{x2} + \delta_{x3} = k_1 x r^2 + 2 p_1 x y + p_2 (r^2 + 2 x^2) + s_1 r^2, \\ \delta_y(x,y) &= \delta_{y1} + \delta_{y2} + \delta_{y3} = k_2 y r^2 + 2 p_2 x y + p_1 (r^2 + 2 y^2) + s_2 r^2, \\ \delta &= \sqrt{\delta_x^2(x,y) + \delta_y^2(x,y)} \end{aligned} \tag{4} \]

\[ L: \begin{cases} x = m t + x_0 \\ y = n t + y_0 \\ z = p t + z_0 \end{cases} \tag{5} \]

\[ \Gamma_0: P_0(u_0, v_0) \to L, \qquad \Gamma: \begin{cases} \Gamma_1: P_0(u_0, v_0) \to P_1(x_1, y_1) \\ \Gamma_2: P_0(u_0, v_0) \to P_2(x_2, y_2) \\ \cdots \\ \Gamma_n: P_0(u_0, v_0) \to P_n(x_n, y_n) \end{cases} \tag{6} \]
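Eqs. (5) and (6) encode the multiplane idea: each pixel maps to one point on every calibration plane, and those points determine a space line. A minimal least-squares sketch of that line fit follows; the function names and the SVD-based approach are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def fit_ray(points):
    """Least-squares 3D line through the per-plane points of one pixel.

    Returns (p0, d): a point on the line (the centroid) and a unit direction,
    so the line is x = p0 + t*d, matching L: x = mt+x0, y = nt+y0, z = pt+z0.
    """
    points = np.asarray(points, dtype=float)
    p0 = points.mean(axis=0)
    # Principal direction of the centred points via SVD (first right singular vector).
    _, _, vt = np.linalg.svd(points - p0)
    return p0, vt[0]

# Points of one pixel on planes z = 0, 10, 20 (here an exactly straight ray).
pts = np.array([[0.0, 0.0, 0.0], [1.0, 2.0, 10.0], [2.0, 4.0, 20.0]])
p0, d = fit_ray(pts)
```

With noisy per-plane points the same fit returns the line minimizing the summed squared perpendicular distances.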
\[ I_i(u,v) = I_B(u,v) + I_A(u,v) \cos\!\left[ \varphi(u,v) + i \pi / 2 \right], \tag{7} \]

\[ \varphi(u,v) = \arctan \frac{I_3(u,v) - I_1(u,v)}{I_0(u,v) - I_2(u,v)}, \tag{8} \]

\[ \frac{\lambda_1 \lambda_2 \lambda_3}{\lambda_1 \lambda_2 - 2 \lambda_1 \lambda_3 + \lambda_2 \lambda_3} \ge \max(R_x, R_y), \tag{9} \]

\[ \begin{cases} x = k_x \, \phi_x(u,v) \, \lambda / 2\pi \\ y = k_y \, \phi_y(u,v) \, \lambda / 2\pi \end{cases} \tag{10} \]
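Eqs. (7) and (8) are standard four-step phase shifting. A minimal sketch, assuming four fringe images shifted by π/2 and using `arctan2` to resolve the quadrant of the wrapped phase:

```python
import numpy as np

def wrapped_phase(I0, I1, I2, I3):
    """Four-step phase shifting: phi = arctan[(I3 - I1) / (I0 - I2)], wrapped to (-pi, pi]."""
    # For I_i = B + A*cos(phi + i*pi/2): I3 - I1 = 2A*sin(phi), I0 - I2 = 2A*cos(phi).
    return np.arctan2(I3 - I1, I0 - I2)

# Synthetic fringes with background B = 1.0 and modulation A = 0.5 recover phi exactly.
phi = np.linspace(-np.pi + 0.1, np.pi - 0.1, 50)
frames = [1.0 + 0.5 * np.cos(phi + i * np.pi / 2) for i in range(4)]
recovered = wrapped_phase(*frames)
```

The wrapped phase still needs temporal unwrapping (the three fringe pitches of Fig. 9) before it yields the absolute phase of Eq. (10).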
\[ F_n: F_n(u,v) = (x,y), \qquad (u,v) \in \Pi_0, \; (x,y) \in \Pi_n, \tag{11} \]

\[ \begin{cases} (x_1, y_1) = F_1(u,v); & z_1 = 0 \\ (x_2, y_2) = F_2(u,v); & z_2 = d \\ \cdots \\ (x_n, y_n) = F_n(u,v); & z_n = (n-1) d \end{cases} \tag{12} \]

\[ A_1 = \begin{bmatrix} 5987.23 & 0 & 234.35 \\ 0 & 6092.35 & 625.34 \\ 0 & 0 & 1 \end{bmatrix}, \quad k_1 = 0.0675, \; k_2 = 158.9546, \; p_1 = 0.0374, \; p_2 = 0.0235 \tag{13} \]

\[ A_2 = \begin{bmatrix} 6479.26 & 0 & 401.46 \\ 0 & 6501.38 & 272.44 \\ 0 & 0 & 1 \end{bmatrix}, \quad k_1 = 0.0259, \; k_2 = 8.7796, \; p_1 = 0.0057, \; p_2 = 0.0231 \tag{14} \]
