Yao, Li and Chyau, Ang
Lee, Jehee and Theobalt, Christian and Wetzstein, Gordon (editors)
2019-10-14
2019
1467-8659
https://doi.org/10.1111/cgf.13852
https://diglib.eg.org:443/handle/10.1111/cgf13852
In this paper, we propose a unified neural network for panoptic segmentation, a task aiming at more fine-grained segmentation. Following existing methods that combine semantic and instance segmentation, our method relies on a triple-branch neural network to tackle the unified task. In the first stage, we adopt a ResNet50 with a feature pyramid network (FPN) as a shared backbone to extract features. Each branch then leverages the shared feature maps and serves as the stuff, things, or mask branch. Lastly, the three outputs are fused following a carefully designed strategy. Extensive experimental results on the MS-COCO dataset demonstrate that our approach achieves a Panoptic Quality (PQ) score competitive with the state of the art.
Computing methodologies; Image segmentation; Neural networks
A Unified Neural Network for Panoptic Segmentation
10.1111/cgf.13852
461-468
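
The abstract's description of the architecture can be illustrated with a minimal sketch: a shared ResNet50+FPN backbone whose pyramid features feed three heads (stuff, things, mask). This is not the authors' implementation; the head designs, channel widths, choice of pyramid level, and the fusion step are assumptions made only to show the triple-branch layout, and the class counts follow the standard COCO panoptic split (80 thing, 53 stuff classes).

```python
# Illustrative sketch of a triple-branch panoptic network with a shared
# ResNet50+FPN backbone, as described in the abstract. Head structure and
# hyperparameters are assumptions, not the paper's actual design.
from collections import OrderedDict

import torch.nn as nn
from torchvision.models import resnet50
from torchvision.ops import FeaturePyramidNetwork


class SharedBackbone(nn.Module):
    """ResNet50 stages C2-C5 followed by an FPN, shared by all three branches."""
    def __init__(self, fpn_channels=256):
        super().__init__()
        r = resnet50(weights=None)
        self.stem = nn.Sequential(r.conv1, r.bn1, r.relu, r.maxpool)
        self.stages = nn.ModuleList([r.layer1, r.layer2, r.layer3, r.layer4])
        self.fpn = FeaturePyramidNetwork([256, 512, 1024, 2048], fpn_channels)

    def forward(self, x):
        x = self.stem(x)
        feats = OrderedDict()
        for i, stage in enumerate(self.stages):
            x = stage(x)
            feats[f"p{i + 2}"] = x          # C2..C5 features
        return self.fpn(feats)              # pyramid maps, all fpn_channels wide


class Branch(nn.Module):
    """Generic prediction head; the real stuff/things/mask branches would differ."""
    def __init__(self, in_channels, out_channels):
        super().__init__()
        self.head = nn.Sequential(
            nn.Conv2d(in_channels, in_channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels, out_channels, 1),
        )

    def forward(self, feat):
        return self.head(feat)


class TripleBranchPanoptic(nn.Module):
    """Shared backbone plus stuff, things, and mask branches; fusion is omitted."""
    def __init__(self, num_stuff=53, num_things=80, fpn_channels=256):
        super().__init__()
        self.backbone = SharedBackbone(fpn_channels)
        self.stuff_branch = Branch(fpn_channels, num_stuff)    # semantic logits
        self.things_branch = Branch(fpn_channels, num_things)  # instance-class logits
        self.mask_branch = Branch(fpn_channels, 1)              # instance-mask logits

    def forward(self, images):
        feats = self.backbone(images)
        p2 = feats["p2"]  # highest-resolution level, used here for simplicity
        return {
            "stuff": self.stuff_branch(p2),
            "things": self.things_branch(p2),
            "mask": self.mask_branch(p2),
        }
```

In the paper, the three outputs would then be combined by the fusion strategy mentioned in the abstract to produce the final panoptic labeling; that step is not reproduced here.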