Facial pore segmentation results can provide reliable evidence for simulating post-product pore conditions and making product recommendations. However, accurately segmenting pores is challenging due to their small size, weak boundaries, and dense distribution, and precise annotations are difficult to acquire. We therefore formulate pore segmentation as a two-stage, weakly supervised task that combines traditional and deep learning methods without human annotation. We propose a novel method called the pore segmentation network (PS-Net). Specifically, it comprises pore feature extraction with coarse labels generated by a traditional method, followed by fine segmentation with progressively updated pseudo labels. Since pores convey high-frequency information about faces, we propose a high-frequency attention module that emphasizes low-level features. Moreover, we design a Bayesian module to identify pore shapes in high-level features. We establish a large-scale facial pore dataset with coarse labels generated via the difference of Gaussian (DoG) Pore method. PS-Net achieves the best performance on this dataset, demonstrating its superiority over existing state-of-the-art segmentation methods.
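To illustrate the kind of traditional coarse-labeling step described above, the sketch below generates a binary pore mask from a grayscale face crop with a difference-of-Gaussians response. This is a minimal illustration, not the paper's DoG Pore method: the function name, the sigma pair, and the threshold are all assumed values chosen for demonstration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def dog_coarse_pore_labels(gray, sigma_small=1.0, sigma_large=2.0, thresh=0.02):
    """Generate a coarse binary pore mask from a grayscale image.

    Pores appear as small dark blobs, so the DoG response
    (large-sigma blur minus small-sigma blur) is positive at blob
    centers. All parameter values here are illustrative defaults,
    not the parameters used in the paper.
    """
    img = gray.astype(np.float32) / 255.0
    dog = gaussian_filter(img, sigma_large) - gaussian_filter(img, sigma_small)
    return (dog > thresh).astype(np.uint8)  # 1 = candidate pore pixel

# Toy usage: a bright patch with one small dark "pore"
img = np.full((64, 64), 200, dtype=np.uint8)
img[30:34, 30:34] = 50
mask = dog_coarse_pore_labels(img)
```

Masks produced this way are noisy, which is precisely why the paper treats them as coarse labels to be refined by progressively updated pseudo labels rather than as ground truth.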
This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.