Conditional 360-degree Image Synthesis for Immersive Indoor Scene Decoration

¹Hong Kong University of Science and Technology, ²Trinity College Dublin, ³VinAI Research, Vietnam, ⁴Deakin University
ICCV 2023

Undecorated 360° indoor scenes

Decorated 360° indoor scenes produced by our method

Abstract

In this paper, we address the problem of conditional scene decoration for 360-degree images. Our method takes a 360-degree background photograph of an indoor scene and generates decorated images of the same scene in the panorama view. To do this, we develop a 360-aware object layout generator that learns latent object vectors in the 360-degree view to enable a variety of furniture arrangements for an input 360-degree background image. We use this object layout to condition a generative adversarial network to synthesize images of the input scene. To further reinforce the generation capability of our model, we develop a simple yet effective scene emptier that removes the generated furniture and produces an emptied scene, on which our model learns a cyclic constraint. We train the model on the Structure3D dataset and show that it can generate diverse decorations with controllable object layouts. Our method achieves state-of-the-art performance on the Structure3D dataset and generalizes well to the Zillow indoor scene dataset. Our user study confirms the immersive experience provided by the realistic image quality and furniture layouts of our results.
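The abstract describes a decorate-then-empty cycle: a 360-aware layout generator samples an object layout, a layout-conditioned generator decorates the empty panorama, and a scene emptier removes the furniture again so the result can be compared against the input background. The sketch below illustrates this cycle in PyTorch with hypothetical module names (LayoutGenerator, UNetLike stand-ins for the decorator and the scene emptier) and toy tensor shapes; it omits the adversarial losses and is an illustrative assumption, not the authors' implementation.

# Minimal sketch of the decorate-then-empty cycle, assuming hypothetical
# modules and shapes; this is NOT the paper's actual architecture.
import torch
import torch.nn as nn

class LayoutGenerator(nn.Module):
    # Maps a noise vector to a coarse object layout map over the panorama grid.
    def __init__(self, z_dim=128, layout_channels=8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(z_dim, layout_channels * 16 * 32),
            nn.Unflatten(1, (layout_channels, 16, 32)),
            nn.Upsample(scale_factor=8, mode="nearest"),  # -> 128 x 256 equirectangular grid
        )
    def forward(self, z):
        return self.net(z)

class UNetLike(nn.Module):
    # Stand-in encoder-decoder used for both the decorator and the scene emptier.
    def __init__(self, in_ch, out_ch=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, out_ch, 3, padding=1),
        )
    def forward(self, x):
        return self.net(x)

def cyclic_step(empty_pano, layout_gen, decorator, emptier):
    # One simplified training step: sample a layout, decorate, re-empty,
    # and penalize the difference to the original empty panorama.
    z = torch.randn(empty_pano.size(0), 128)
    layout = layout_gen(z)                                      # 360-aware object layout
    decorated = decorator(torch.cat([empty_pano, layout], 1))   # layout-conditioned synthesis
    re_emptied = emptier(decorated)                             # scene emptier
    cycle_loss = nn.functional.l1_loss(re_emptied, empty_pano)  # cyclic constraint
    return decorated, cycle_loss

if __name__ == "__main__":
    layout_gen = LayoutGenerator()
    decorator = UNetLike(in_ch=3 + 8)   # RGB background + layout channels
    emptier = UNetLike(in_ch=3)
    pano = torch.rand(2, 3, 128, 256)   # toy equirectangular backgrounds
    decorated, loss = cyclic_step(pano, layout_gen, decorator, emptier)
    print(decorated.shape, loss.item())

In this toy setup, sampling a different noise vector z yields a different layout and hence a different decoration of the same background, which is the mechanism behind the diverse, controllable arrangements mentioned above.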

Video

The ICCV 2023 presentation video will be available later.

Network Structure




User Control

BibTeX

@article{shum2023conditional,
  title={Conditional 360-degree Image Synthesis for Immersive Indoor Scene Decoration},
  author={Shum, Ka Chun and Pang, Hong-Wing and Hua, Binh-Son and Nguyen, Duc Thanh and Yeung, Sai-Kit},
  journal={arXiv preprint arXiv:2307.09621},
  year={2023}
}