Cas-FNE: Cascaded Face Normal Estimation
Author affiliations: Tianjin Key Laboratory of Autonomous Intelligence Technology and Systems, School of Computer Science and Technology, Tiangong University; College of Intelligence and Computing, Tianjin University; Electronic Information School, Wuhan University
Publication: 《IEEE/CAA Journal of Automatica Sinica》 (自动化学报(英文版))
Year/Volume/Issue: 2024, Vol. 11, No. 12
Pages: 2423-2434
Subject classification: 08 [Engineering]; 080203 [Engineering - Mechanical Design and Theory]; 0802 [Engineering - Mechanical Engineering]
Funding: supported by the National Natural Science Foundation of China (62072327)
Keywords: Training; Graphics; Computational modeling; Neural networks; Training data; Predictive models; Network architecture; Data models; Faces; Optimization
Abstract: Capturing high-fidelity normals from single face images plays a core role in numerous computer vision and graphics applications. Though significant progress has been made in recent years, how to effectively and efficiently exploit normal priors remains challenging. Most existing approaches depend on intricate network architectures and complex calculations for in-the-wild face images. To overcome this issue, we propose a simple yet effective cascaded neural network, called Cas-FNE, which progressively boosts the quality of predicted normals with marginal model parameters and computational cost. Meanwhile, its progressive refinement mechanism mitigates the imbalance between training data and real-world face images, thus boosting the generalization ability of the model. Specifically, in the training phase, our model relies solely on a small amount of labeled data. The earlier prediction serves as guidance for the following refinement. In addition, our shared-parameter cascaded block employs a recurrent mechanism, allowing it to be applied multiple times for optimization without increasing network parameters. Quantitative and qualitative evaluations on benchmark datasets show that our Cas-FNE faithfully maintains facial details and is superior to state-of-the-art methods. The code is available at https://***/AutoHDR/***.
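The abstract's central idea, a weight-tied block applied repeatedly so that refinement depth grows without adding parameters, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the toy residual update, the weight matrix `W`, and the step counts are assumptions made purely for illustration:

```python
import numpy as np

# Hypothetical illustration of a shared-parameter cascaded block:
# one weight matrix W is reused at every refinement stage, so the
# parameter count is independent of how many cascade steps are run,
# and each stage's prediction conditions the next (coarse-to-fine).

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 8)) * 0.1  # the single shared parameter block

def cascaded_refine(x, steps=3):
    """Apply the same refinement block `steps` times via a residual update."""
    pred = x
    for _ in range(steps):
        pred = pred + np.tanh(pred @ W)  # shared W: no new parameters per step
    return pred

x0 = rng.standard_normal(8)       # stand-in for an initial normal estimate
coarse = cascaded_refine(x0, steps=1)
fine = cascaded_refine(x0, steps=3)
print(coarse.shape, fine.shape)   # both (8,)
```

Running the same block for more steps only costs extra compute, which matches the paper's claim of boosting quality "without increasing network parameters"; here `W.size` stays at 64 whether one step or three are applied.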