PCCNet: A Few-Shot Patch-wise Contrastive Colorization Network

Xiaying Liu, Ping Yang, Alex Telea, Jiri Kosinka, Zizhao Wu*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

Abstract

Few-shot colorization aims to learn a colorization model from only a small amount of training data. Existing models, however, often fail to maintain color consistency because they ignore patch correlations in the images. In this paper, we propose PCCNet, a novel Patch-wise Contrastive Colorization Network that learns color synthesis by measuring the similarities and variations of image patches at two levels: inter-image and intra-image. At the inter-image level, we investigate a patch-wise contrastive learning mechanism with a positive-and-negative-sample constraint to distinguish color features between patches across images. At the intra-image level, we explore a new intra-image correlation loss that measures the similarity distribution revealing structural relations between patches within an image. Furthermore, we propose a novel color memory loss that improves the accuracy with which the memory module stores and retrieves data. Experiments show that our method allows correctly saturated colors to spread naturally over objects, and that it achieves higher scores than related methods in quantitative comparisons.
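
To make the two patch-level mechanisms concrete, the sketch below gives a minimal PyTorch rendering of an inter-image patch contrastive (InfoNCE-style) loss and an intra-image similarity-distribution loss. It is an illustrative reconstruction based only on the abstract, not the authors' implementation: the function names, the temperature tau, the patch-feature shapes, and the choice of KL divergence for the intra-image term are all assumptions.

    # Hedged sketch (PyTorch) of the two patch-level losses described above.
    # Illustrative reconstruction, NOT the authors' released code: feature
    # extraction, patch sampling, tau, and the KL formulation are assumptions.

    import torch
    import torch.nn.functional as F

    def inter_image_patch_nce(query: torch.Tensor, key: torch.Tensor,
                              tau: float = 0.07) -> torch.Tensor:
        # InfoNCE over patch features from two images: row i of `key` is the
        # positive for row i of `query`; all other rows act as negatives.
        q = F.normalize(query, dim=1)
        k = F.normalize(key, dim=1)
        logits = q @ k.t() / tau                        # (N, N) similarities
        targets = torch.arange(q.size(0), device=q.device)
        return F.cross_entropy(logits, targets)

    def intra_image_correlation(gray_feats: torch.Tensor,
                                color_feats: torch.Tensor,
                                tau: float = 0.07) -> torch.Tensor:
        # Match the patch-to-patch similarity distribution of the colorized
        # output to that of the grayscale input, so structurally related
        # patches keep related colors; KL divergence is one plausible choice.
        def sim_dist(feats):
            f = F.normalize(feats, dim=1)
            return F.softmax(f @ f.t() / tau, dim=1)    # row-wise distributions
        p = sim_dist(gray_feats)                         # target: input structure
        q = sim_dist(color_feats)                        # prediction
        return F.kl_div(q.log(), p, reduction="batchmean")

    # Toy usage: 64 patches with 128-dim features.
    q_feats, k_feats = torch.randn(64, 128), torch.randn(64, 128)
    loss = (inter_image_patch_nce(q_feats, k_feats)
            + intra_image_correlation(torch.randn(64, 128), torch.randn(64, 128)))
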
Original language: English
Title of host publication: Advances in Computer Graphics
Subtitle of host publication: 40th Computer Graphics International Conference, CGI 2023, Shanghai, China, August 28–September 1, 2023, Proceedings, Part II
Publisher: Springer Nature
Pages: 349-361
ISBN (Electronic): 978-3-031-50072-5
ISBN (Print): 978-3-031-50071-8
DOIs
Publication status: Published - 29 Dec 2023

Publication series

Name: Lecture Notes in Computer Science
Volume: 14496
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349
