Acquiring high-resolution light fields (LFs) is expensive. LF angular superresolution aims to synthesize the required number of views from a given sparse set of spatially high-resolution images. Existing methods struggle with sparsely sampled LFs captured with large baselines. Some methods rely on depth estimation and view reprojection, and are sensitive to textureless and occluded regions. Other non-depth-based methods suffer from aliasing or blurring effects due to the large disparity. In addition, most methods require specific models for different interpolation rates, which reduces their flexibility in practice. In this paper, we propose a learning framework that overcomes these challenges by exploiting the global and local structures of LFs. Our framework includes aggregation across both the angular and spatial dimensions to fully exploit the input data, and a novel bilateral upsampling module that upsamples each epipolar plane image while better preserving its local parallax structure. Furthermore, our method predicts the weights of the interpolation filters based on both subpixel offset and range difference, allowing angular superresolution at different rates with a single model. We show that our non-depth-based method outperforms state-of-the-art methods in terms of handling large disparities and flexibility, on both real-world and synthetic LF images.
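To make the bilateral weighting idea concrete: in classical bilateral filtering, each interpolation weight combines a spatial term (here, the subpixel offset to the query position) with a range term (the intensity difference to a reference value), so that samples across a depth or intensity edge contribute little. The sketch below is a minimal 1D illustration of this principle using fixed Gaussian kernels; it is not the authors' learned model, which predicts these weights with a network, and the function names and `sigma` parameters are hypothetical.

```python
import numpy as np

def bilateral_weights(offsets, center_val, neighbor_vals, sigma_s=1.0, sigma_r=0.1):
    """Combine a spatial kernel over subpixel offsets with a range kernel
    over value differences, then normalize. Illustrative only."""
    spatial = np.exp(-(offsets ** 2) / (2 * sigma_s ** 2))
    rng = np.exp(-((neighbor_vals - center_val) ** 2) / (2 * sigma_r ** 2))
    w = spatial * rng
    return w / w.sum()

def bilateral_interpolate(positions, values, query):
    """Interpolate a value at `query` from 1D samples, using the nearest
    sample as the range-term reference."""
    offsets = positions - query
    ref = values[np.argmin(np.abs(offsets))]  # reference value at the query
    w = bilateral_weights(offsets, ref, values)
    return float(np.sum(w * values))
```

For a step edge (values `[0, 0, 1, 1]` at positions `[0, 1, 2, 3]`), querying at 1.4 returns a value near 0 rather than the 0.4 that plain linear interpolation would give, because the range term suppresses samples from the far side of the edge. On an epipolar plane image, the analogous behavior keeps the sharp line structure that encodes parallax from being blurred across disparity discontinuities.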
This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.