Computer Vision News - November 2020
GAN-CIRCLE

… minimum dosage of radiation, which is advantageous in the CT field. Traditionally, there are two pathways to increase the resolution of CT imaging: a) refining the hardware and b) exploring computational algorithms. For various reasons, such as radiation exposure and system cost, the obvious choice is to use algorithms for deblurring. Nevertheless, this is an ill-posed inverse problem and thus complex to solve. As with MRI, model-based approaches, neural network approaches and combinations of the two have previously been introduced, either by exploiting system priors or, in the second case, by learning a non-linear mapping from pairs of LR/HR images to recover the missing high-frequency details. Super-resolution images can be obtained with CNNs that leverage hierarchical features and representations.

The image above describes the main architecture of the proposed GAN-CIRCLE. Two generators are used (G, F) together with a combined loss comprising a generator-adversarial loss, a cycle-consistency loss, an identity loss and a joint sparsifying transform loss. The main idea of the network is to stack non-linear SR (SRCT) blocks together with a residual module that learns the high-frequency details. The adversarial learning is performed in a cyclic manner, which yields SRCT images that are better both perceptually and quantitatively. The supplementary material has a very detailed analysis of the network design, which is worth a look!

As the network is optimized for the real application, the generator G plays an important role. G is trained to map source images in the source domain X to target images y (living in the target domain Y). Because GANs have well-known limitations in the way such an architecture is used in practice, the two mappings are trained jointly in this paper. The idea is to produce images that “confuse” the discriminators (DY, DX), which try to decide whether the output of each generative mapping is real or artificial.
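To make the combined objective a little more concrete, here is a minimal PyTorch-style sketch of how the four loss terms could be assembled for the generators. The function and weight names (gan_circle_generator_loss, lambda_cyc, lambda_idt, lambda_jst) are illustrative assumptions rather than the paper's exact implementation; the joint sparsifying transform term is approximated with a simple total-variation penalty, a least-squares adversarial form is used, and the LR and HR tensors are assumed to live on the same spatial grid so that the identity terms are well defined.

```python
# Minimal sketch of a GAN-CIRCLE-style combined generator loss (assumptions noted above).
import torch
import torch.nn as nn

def total_variation(x):
    # Anisotropic total variation on a (N, C, H, W) tensor, used here as a
    # stand-in for the joint sparsifying transform (JST) loss.
    dh = (x[:, :, 1:, :] - x[:, :, :-1, :]).abs().mean()
    dw = (x[:, :, :, 1:] - x[:, :, :, :-1]).abs().mean()
    return dh + dw

def gan_circle_generator_loss(G, F, D_Y, D_X, x_lr, y_hr,
                              lambda_cyc=10.0, lambda_idt=5.0, lambda_jst=0.01):
    mse = nn.MSELoss()
    l1 = nn.L1Loss()

    # Forward and backward mappings: G: X (LR) -> Y (HR), F: Y (HR) -> X (LR)
    fake_hr = G(x_lr)
    fake_lr = F(y_hr)

    # Least-squares adversarial terms: the generators try to make the
    # discriminators label their outputs as real (target = 1).
    pred_hr = D_Y(fake_hr)
    pred_lr = D_X(fake_lr)
    adv = mse(pred_hr, torch.ones_like(pred_hr)) + mse(pred_lr, torch.ones_like(pred_lr))

    # Cycle consistency: X -> Y -> X and Y -> X -> Y should reconstruct the input.
    cyc = l1(F(fake_hr), x_lr) + l1(G(fake_lr), y_hr)

    # Identity: a target-domain image passed through the mapping should change little.
    idt = l1(G(y_hr), y_hr) + l1(F(x_lr), x_lr)

    # Joint sparsifying transform term, approximated by total variation on the SR output.
    jst = total_variation(fake_hr)

    return adv + lambda_cyc * cyc + lambda_idt * idt + lambda_jst * jst
```

The cycle terms are what tie the two mappings together: each generator is penalised if a round trip through the other one fails to reconstruct the original image, which is the mechanism behind the cyclic adversarial training described above.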