Texture Synthesis and Image Colorization
Friday, March 6, 2020, 2:00 pm - 4:00 pm
In this talk I will discuss two problems: exemplar-based texture synthesis and image colorization.
Exemplar-based texture synthesis is the process of generating, from an input texture sample, new texture images that are perceptually equivalent to the input. We proposed to generate new textures by modeling self-similarities with conditional multivariate Gaussian distributions in the image patch space. This approach is embedded in a multi-scale framework, which reduces the method's dependency on the patch size. I will discuss the use of this method, as well as some recent neural network methods, for the case of high-resolution texture images.
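The patch model above relies on standard Gaussian conditioning: if the known and unknown pixels of a patch are jointly Gaussian, the unknown pixels given the known ones are again Gaussian, with a closed-form mean and covariance. A minimal NumPy sketch of that conditioning step (the function name and index layout are illustrative, not the authors' code):

```python
import numpy as np

def conditional_gaussian(mu, Sigma, known_idx, unknown_idx, x_known):
    """Condition N(mu, Sigma) on observed coordinates.

    Returns the mean and covariance of the unknown coordinates given the
    known ones, via the standard Gaussian conditioning formulas:
        mu_u|k    = mu_u + S_uk S_kk^{-1} (x_k - mu_k)
        Sigma_u|k = S_uu - S_uk S_kk^{-1} S_ku
    """
    mu_k, mu_u = mu[known_idx], mu[unknown_idx]
    S_kk = Sigma[np.ix_(known_idx, known_idx)]
    S_uk = Sigma[np.ix_(unknown_idx, known_idx)]
    S_uu = Sigma[np.ix_(unknown_idx, unknown_idx)]
    # Solve a linear system instead of forming S_kk^{-1} explicitly.
    W = np.linalg.solve(S_kk, S_uk.T).T
    cond_mu = mu_u + W @ (x_known - mu_k)
    cond_Sigma = S_uu - W @ S_uk.T
    return cond_mu, cond_Sigma
```

In a synthesis pass, the known coordinates would be the already-synthesized portion of a patch and a sample from the conditional law fills in the rest; here mu and Sigma stand for patch statistics estimated from the exemplar.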
Colorization is the process of adding plausible color information to monochrome photographs or videos, used for example in advertising and the film industry. Although important progress has been achieved, automatic image colorization remains a challenge. We proposed an adversarial learning approach that incorporates semantic information and is trained with a fully self-supervised strategy. The result is a generative network that infers the chromaticity of a grayscale image conditioned on semantic cues. Experimental results show that our method produces photo-realistic color images.
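The "fully self-supervised" part means the training pairs come for free from color images: the luminance channel is the network input, the chrominance channels are the regression target, and at test time the predicted chrominance is recombined with the input luminance. The paper works with CIE chromaticity; as a dependency-free illustration, the sketch below uses the BT.601 YCbCr decomposition instead (an assumption, not the paper's exact color space):

```python
import numpy as np

def make_training_pair(rgb):
    """Split an RGB image (float in [0,1], shape (H, W, 3)) into a
    grayscale input and a chrominance target for self-supervised training."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b            # luma: network input
    cb = 0.5 - 0.168736 * r - 0.331264 * g + 0.5 * b  # blue-difference chroma
    cr = 0.5 + 0.5 * r - 0.418688 * g - 0.081312 * b  # red-difference chroma
    return y, np.stack([cb, cr], axis=-1)             # (input, target)

def reconstruct_rgb(y, chroma):
    """Recombine the fixed luma with (predicted) chroma channels."""
    cb, cr = chroma[..., 0] - 0.5, chroma[..., 1] - 0.5
    r = y + 1.402 * cr
    g = y - 0.344136 * cb - 0.714136 * cr
    b = y + 1.772 * cb
    return np.clip(np.stack([r, g, b], axis=-1), 0.0, 1.0)
```

A colorization network would map `y` (plus semantic cues) to a chroma prediction; `reconstruct_rgb` then yields the final color image, so the luminance of the input is preserved exactly.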
L. Raad, A. Davy, A. Desolneux, J.-M. Morel. A survey of exemplar-based texture synthesis. AMSA 2018.
P. Vitoria, L. Raad, C. Ballester. ChromaGAN: Adversarial Picture Colorization with Semantic Class Distribution. WACV 2020.