Abstract: Recent advancements in virtual reality (VR) and augmented reality (AR) have popularised emerging panoramic content for immersive visual experiences. The difficulty of acquiring and displaying the 360° format further highlights the necessity of unconditional panoramic image generation. Existing methods essentially generate planar images mapped from panoramic ones, and fail to address the deformation and closed-loop characteristics when inverted back to the panoramic domain, thus producing pseudo-panoramic content. This paper aims to directly generate spherical content in a patch-by-patch style; besides being computation-friendly, this guarantees continuity everywhere on the panoramic image and proper accommodation of panoramic deformation. More specifically, we first propose a novel spherical patch convolution (SPConv) that operates on a local spherical patch and thus naturally addresses the deformation of panoramic content. We then propose our spherical patch generative adversarial network (SP-GAN), consisting of spherical local embedding (SLE) and spherical content synthesiser (SCS) modules, which seamlessly incorporate SPConv so as to generate continuous panoramic patches. To the best of our knowledge, the proposed SP-GAN is the first successful attempt to accommodate spherical distortion for closed-loop panoramic image generation in a patch-by-patch manner. Experimental results, including human-rated evaluations, verify consistently superior performance for unconditional panoramic image generation in terms of generation quality, computational memory, and generalisation to various resolutions. Codes are publicly available at https://github.com/chronos123/SP-GAN
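The two properties the abstract emphasises, deformation-aware sampling on the sphere and closed-loop continuity in longitude, can be illustrated with a minimal sketch. The code below is NOT the authors' SPConv implementation; it is a hedged NumPy illustration of the general idea of a spherical patch convolution: kernel taps are laid out on the tangent plane at each pixel's latitude/longitude, projected back to the sphere via the inverse gnomonic projection, and sampled from an equirectangular image with wrap-around in longitude. The kernel spacing `step`, nearest-neighbour sampling, and pole clipping are all assumptions for brevity.

```python
import numpy as np

def tangent_grid(lat0, lon0, kernel=3, step=0.02):
    """Lay a kernel x kernel grid on the tangent plane at (lat0, lon0) and
    project it back to the sphere (inverse gnomonic projection).
    Returns flattened latitude/longitude arrays in radians.
    `step` (tangent-plane tap spacing) is an assumed hyperparameter."""
    r = (kernel - 1) / 2
    xs, ys = np.meshgrid((np.arange(kernel) - r) * step,
                         (np.arange(kernel) - r) * step)
    x, y = xs.ravel(), ys.ravel()
    rho = np.hypot(x, y)
    c = np.arctan(rho)
    safe = np.where(rho == 0.0, 1.0, rho)  # avoid 0/0 at the centre tap
    lat = np.arcsin(np.cos(c) * np.sin(lat0)
                    + y * np.sin(c) * np.cos(lat0) / safe)
    lon = lon0 + np.arctan2(
        x * np.sin(c),
        safe * np.cos(lat0) * np.cos(c) - y * np.sin(lat0) * np.sin(c))
    lat[rho == 0.0] = lat0  # centre tap maps to itself
    lon[rho == 0.0] = lon0
    return lat, lon

def sp_conv(img, weights, step=0.02):
    """Toy distortion-aware convolution over an equirectangular image (H, W):
    nearest-neighbour sampling, modulo wrap in longitude (closed loop),
    simple clipping at the poles (a real kernel would wrap over them)."""
    H, W = img.shape
    k = int(round(np.sqrt(weights.size)))
    out = np.zeros_like(img, dtype=float)
    for i in range(H):
        lat0 = (0.5 - (i + 0.5) / H) * np.pi           # +pi/2 .. -pi/2
        for j in range(W):
            lon0 = ((j + 0.5) / W - 0.5) * 2 * np.pi   # -pi .. +pi
            lat, lon = tangent_grid(lat0, lon0, k, step)
            ii = np.clip(np.round((0.5 - lat / np.pi) * H - 0.5),
                         0, H - 1).astype(int)
            jj = np.round((lon / (2 * np.pi) + 0.5) * W - 0.5).astype(int) % W
            out[i, j] = np.sum(img[ii, jj] * weights.ravel())
    return out
```

Near the equator the projected taps reduce to an ordinary regular grid, while towards the poles the same tangent-plane footprint covers progressively more longitude samples, which is precisely the equirectangular deformation that planar convolutions ignore; the `% W` wrap is what keeps the output continuous across the left/right seam.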