
Near Infrared Star Centroid Detection by Area Analysis of Multi-Scale Super Pixel Saliency Fusion Map

Xiaohu Yuan, Shaojun Guo, Chunwen Li, Bin Lu, Shuli Lou
Department of Automation, Tsinghua University, Beijing 100084, China.
National Innovation Institute of Defense Technology, Beijing 100091, China.
Naval Aeronautical University, Yantai 264001, China.
Yantai University, Yantai 264005, China.

Abstract

The centroid location of a near infrared star always deviates from the real center due to the effects of surrounding radiation. To determine a more accurate center of a near infrared star, this paper proposes a method that detects the star's saliency area and calculates the star's centroid using only the pixels in this area, which greatly reduces the effect of the radiation. During saliency area detection, we calculated the boundary connectivity and gray similarity of every pixel to estimate how likely it was to be a background pixel. To simplify and speed up the calculation, we first divided the near infrared starry sky image into super pixel maps at multiple scales by Simple Linear Iterative Clustering (SLIC). Second, we detected a saliency map for every super pixel map of the image. Finally, we fused the saliency maps according to weighting coefficients determined by the least squares method. For the images used in our experiment, we set the multi-scale super pixel numbers to 100, 150, and 200. The results show that our method can obtain an offset variance of less than 0.27 for the center coordinates compared to the labelled centers.
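The pipeline in the abstract can be summarized in a minimal sketch, assuming Python with NumPy and scikit-image's SLIC implementation. This is not the authors' code: the boundary-connectivity measure is reduced to a crude gray-level dissimilarity against border superpixels, the fusion weights default to equal values rather than the least-squares weights described in the paper, and the 0.5 saliency threshold and the helper names `saliency_map` and `star_centroid` are illustrative assumptions.

```python
# Sketch of multi-scale SLIC saliency fusion and salient-area centroid extraction.
# Assumes `img` is a 2-D float array (grayscale near infrared frame) scaled to [0, 1].
import numpy as np
from skimage.segmentation import slic

def saliency_map(img, n_segments):
    """Per-superpixel saliency from a simple background prior: superpixels whose
    gray level matches the image border are treated as likely background."""
    labels = slic(img, n_segments=n_segments, compactness=0.1,
                  channel_axis=None, start_label=0)
    sal = np.zeros_like(img, dtype=float)
    border_labels = np.unique(np.concatenate(
        [labels[0], labels[-1], labels[:, 0], labels[:, -1]]))
    bg_gray = img[np.isin(labels, border_labels)].mean()
    for lab in np.unique(labels):
        mask = labels == lab
        # Gray dissimilarity to the border region as a stand-in for
        # (1 - boundary connectivity); background superpixels score near 0.
        sal[mask] = abs(img[mask].mean() - bg_gray)
    return sal / (sal.max() + 1e-12)

def star_centroid(img, scales=(100, 150, 200), w=None):
    """Fuse the multi-scale saliency maps and return the intensity-weighted
    centroid computed over the salient area only."""
    maps = [saliency_map(img, n) for n in scales]
    w = np.ones(len(maps)) / len(maps) if w is None else np.asarray(w)
    fused = sum(wi * m for wi, m in zip(w, maps))
    area = fused > 0.5 * fused.max()   # hypothetical threshold for the saliency area
    ys, xs = np.nonzero(area)
    weights = img[ys, xs].astype(float)
    return (xs * weights).sum() / weights.sum(), (ys * weights).sum() / weights.sum()
```

The scales (100, 150, 200) match the superpixel numbers reported in the experiment; in the paper the fusion weights would be fitted by least squares against labelled centers rather than fixed to equal values.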

Keywords: saliency, near infrared, starry sky, Simple Linear Iterative Clustering (SLIC)


Publication history

Received: 28 September 2017
Revised: 24 December 2017
Accepted: 25 December 2017
Published: 24 January 2019
Issue date: June 2019

Copyright

© The author(s) 2019

Acknowledgements

This research was supported in part by the National Natural Science Foundation of China (No. 61303192).
