Illumination Invariant Skin Texture Generation Using CGAN from a Single Image for Haptic Augmented Palpation

Published: 01 Jan 2019, Last Modified: 17 May 2023. IRC 2019.
Abstract: Illumination normalization is a long-studied problem in computer vision, yet strong illumination has not been handled adequately to date. In this paper, we present an illumination normalization method based on a deep learning model, the conditional generative adversarial network (CGAN), to reconstruct accurate 3D skin textures from a single image for efficient haptic palpation. After the illumination is normalized by the CGAN, in which the input image is conditioned to produce the desired output image, bilateral filtering, which removes noise while preserving boundaries, enhances fine wrinkles. As a refinement step, intrinsic image decomposition is performed to obtain a shading-layer image; after histogram equalization, this layer is merged into the normalized image to enhance skin tactile properties (wrinkles and roughness). These steps yield a depth image with normalized illumination and enhanced wrinkle texture, from which the depth of the skin surface texture is restored precisely in three dimensions. The superiority of the proposed illumination normalization method over three other illumination normalization methods (CIDRE, LDCT, and TT) is verified through comparison, and its performance is further confirmed through restoration of the three-dimensional skin surface.
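The paper does not include code, and the CGAN itself is beyond a short example, but the two classical refinement steps named in the abstract (bilateral filtering and histogram equalization of the shading layer) can be sketched in plain NumPy. This is an illustrative sketch under my own assumptions (grayscale `uint8` input, brute-force bilateral filter, hypothetical parameter values), not the authors' implementation:

```python
import numpy as np

def equalize_histogram(img):
    """Global histogram equalization of a non-constant uint8 grayscale image."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = np.cumsum(hist).astype(np.float64)
    cdf_min = cdf[cdf > 0][0]
    # Map the cumulative distribution onto the full [0, 255] range.
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255.0).astype(np.uint8)
    return lut[img]

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=25.0):
    """Brute-force bilateral filter: Gaussian weight in both space and range,
    so smoothing stops at strong intensity edges (boundary-preserving)."""
    img_f = img.astype(np.float64)
    h, w = img_f.shape
    pad = np.pad(img_f, radius, mode="edge")
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2.0 * sigma_s**2))  # spatial kernel
    out = np.empty_like(img_f)
    for i in range(h):
        for j in range(w):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # Range kernel: down-weight neighbors that differ in intensity.
            rng = np.exp(-((patch - img_f[i, j])**2) / (2.0 * sigma_r**2))
            wgt = spatial * rng
            out[i, j] = np.sum(wgt * patch) / np.sum(wgt)
    return np.clip(np.round(out), 0, 255).astype(np.uint8)
```

In the pipeline described above, the bilateral filter would be applied to the CGAN-normalized image, and the equalized shading layer would then be merged back in; the merge rule and parameter choices are the paper's own and are not reproduced here.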
