Computer Vision News - May 2019

These criteria were used as the basis for a general-purpose Double-DIP architecture:

• The first criterion is enforced by minimizing a Reconstruction Loss, which measures the error between the recomposed image and the input image.
• The second criterion is achieved by employing multiple DIPs, one per layer.
• The third criterion is maintained through an Exclusion Loss between the outputs of the different DIP networks, which minimizes their correlation.
• Each DIP network recovers its own layer of the input image $I$. The input to each $DIP_i$ is randomly sampled noise, designated $z_i$. The outputs of the different DIP networks, designated $y_i = DIP_i(z_i)$, are mixed via a weight mask $m$ to create a recomposed image $\hat{I} = m \cdot y_1 + (1 - m) \cdot y_2$, which should be as close as possible to the input image $I$.

For some tasks the mask $m$ is simple and known; in other cases it must be learned (using another DIP network). The learned mask can be uniform or vary across different areas of the image, and can be continuous or binary. These constraints on $m$ are task-dependent and are enforced by tailoring the Regularization Loss term and its weight to the task. The overall optimization loss is therefore $Loss = Loss_{Reconst} + \alpha \cdot Loss_{Excl} + \beta \cdot Loss_{Reg}$, where $Loss_{Reconst} = \| I - \hat{I} \|$, $Loss_{Excl}$ (the Exclusion Loss) minimizes the correlation between the gradients of $y_1$ and $y_2$, and $Loss_{Reg}$ is a task-specific mask regularization.

The training and optimization of a Double-DIP network are similar to those of the basic DIP. Extra noise added to the input, gradually increased over the iterations, improves the robustness of the recomposition, as does data augmentation of the input image $I$ with 8 transformations (the 4 rotations by 90° combined with horizontal and vertical reflections); a sketch of both appears below. Optimization uses the Adam optimizer and takes a few minutes per image on a Tesla V100 GPU.

Results: Of the many results described in the article, we will discuss two: 1) image segmentation into Fg/Bg, and 2) watermark removal.
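To make the recomposition and the three loss terms concrete, here is a minimal PyTorch-style sketch of the Double-DIP optimization loop. It is not the authors' code: the `tiny_dip` networks are small stand-ins for the full DIP encoder-decoders, the loss weights, mask regularizer, and noise schedule are illustrative assumptions, and the exclusion term is a simplified gradient-correlation penalty.

```python
# Minimal sketch of a Double-DIP optimization loop (illustrative, not the published implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F

def tiny_dip(out_channels):
    # Stand-in for a DIP encoder-decoder; the real networks are much deeper.
    return nn.Sequential(
        nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
        nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
        nn.Conv2d(64, out_channels, 3, padding=1), nn.Sigmoid(),
    )

def exclusion_loss(y1, y2):
    # Simplified penalty on correlated gradients between the two layers.
    def grads(y):
        gx = y[:, :, :, 1:] - y[:, :, :, :-1]
        gy = y[:, :, 1:, :] - y[:, :, :-1, :]
        return gx, gy
    g1x, g1y = grads(y1)
    g2x, g2y = grads(y2)
    return (g1x * g2x).abs().mean() + (g1y * g2y).abs().mean()

def double_dip(I, iters=4000, alpha=0.1, beta=0.1, sigma=0.03, lr=1e-3):
    # I: input image tensor of shape (1, 3, H, W), values in [0, 1].
    dip1, dip2, dip_m = tiny_dip(3), tiny_dip(3), tiny_dip(1)
    z1 = torch.randn(1, 32, *I.shape[-2:])   # fixed random noise inputs z_i
    z2 = torch.randn(1, 32, *I.shape[-2:])
    zm = torch.randn(1, 32, *I.shape[-2:])
    params = list(dip1.parameters()) + list(dip2.parameters()) + list(dip_m.parameters())
    opt = torch.optim.Adam(params, lr=lr)
    for t in range(iters):
        opt.zero_grad()
        # Perturb the noise inputs a little more as iterations progress.
        s = sigma * (t / iters)
        y1 = dip1(z1 + s * torch.randn_like(z1))
        y2 = dip2(z2 + s * torch.randn_like(z2))
        m  = dip_m(zm + s * torch.randn_like(zm))
        recomposed = m * y1 + (1 - m) * y2          # I_hat = m*y1 + (1-m)*y2
        loss_reconst = F.mse_loss(recomposed, I)    # Reconstruction Loss
        loss_excl = exclusion_loss(y1, y2)          # Exclusion Loss
        loss_reg = (m * (1 - m)).mean()             # illustrative stand-in; the real term is task-specific
        loss = loss_reconst + alpha * loss_excl + beta * loss_reg
        loss.backward()
        opt.step()
    return y1.detach(), y2.detach(), m.detach()
```

The returned `y1`, `y2`, and `m` would correspond to the two recovered layers and the mixing mask produced by the optimization.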

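The 8-transformation augmentation mentioned above can be enumerated as the four 90° rotations of the image, each with and without a mirror reflection. The helper below is a hypothetical illustration of that enumeration, not code from the paper.

```python
# Hypothetical helper producing the 8 augmented copies of the input image
# (identity plus three 90-degree rotations, each with and without a flip).
import torch

def eight_transforms(I):
    # I: tensor of shape (1, C, H, W); returns a list of 8 transformed copies.
    outs = []
    for k in range(4):                           # 0, 90, 180, 270 degree rotations
        r = torch.rot90(I, k, dims=(-2, -1))
        outs.append(r)
        outs.append(torch.flip(r, dims=(-1,)))   # mirrored version of each rotation
    return outs
```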