Computer Vision News - July 2016

The figure above shows the visualizations for the three DNNs evaluated by the authors (AlexNet, GoogLeNet, VGG network); the results are computed on the pre-softmax layer. Line (a) shows the sensitivity map of Simonyan et al. (2013). Line (b) shows the prediction difference obtained with the method of Robnik-Sikonja and Kononenko (2008). Line (c) shows the same result overlaid on the input image. Lines (d) and (e) show the results of the newly proposed method. Red areas indicate evidence for the class, blue areas indicate evidence against it, and transparent areas indicate pixels that had no influence on the prediction. For example, large parts of the cat's face are blue for GoogLeNet, while the ear is red with high intensity: the classifier does not find the face indicative of the tabby cat class, whereas the ear appears very distinctive.

One obvious difference between the sensitivity map (line a) and the proposed method (lines d, e) is that the proposed method carries signed information, while the sensitivity map contains only unsigned magnitudes. It can also be observed that the sensitivity analysis highlights the class object in the image, while the proposed method emphasizes whatever the classifier actually uses to detect what is in the image, which is not necessarily the object itself and can also be contextual information.

In addition, the authors compare their method to the seminal work of Simonyan et al. (2013), which proposes image-specific class saliency visualization to rank the input features by their influence on the assigned class c. Simonyan et al. compute the partial derivative of the (pre-softmax) class score Sc with respect to the input features xi (typically pixels), si = ∂Sc/∂xi, to estimate the relevance of each feature. In other words, this expresses how sensitive the classifier's prediction is to small changes of feature values in input space.
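The two relevance measures compared above can be sketched on a toy classifier. This is a minimal illustration, not the authors' implementation: a tiny two-layer ReLU network stands in for the real DNNs, and the weights, the standard-normal feature marginal, and the sample count are all hypothetical choices. It shows the Simonyan et al. sensitivity si = ∂Sc/∂xi computed by the chain rule, and the Robnik-Sikonja and Kononenko prediction difference, which marginalizes each feature and reports a signed weight of evidence.

```python
import numpy as np

# Toy stand-in for a DNN's pre-softmax scores: one hidden ReLU layer.
# All weights and the feature marginal below are hypothetical; the real
# experiments use AlexNet/GoogLeNet/VGG on image pixels.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((8, 16))   # hidden-layer weights
W2 = rng.standard_normal((3, 8))    # class-score weights, 3 classes

def scores(x):
    """Pre-softmax class scores S(x)."""
    return W2 @ np.maximum(W1 @ x, 0.0)

def sensitivity_map(x, c):
    """Simonyan et al.: s_i = dS_c/dx_i via the chain rule.
    The signed gradient is returned; taking |s_i| gives the unsigned map."""
    mask = (W1 @ x > 0).astype(float)   # ReLU derivative
    return W1.T @ (mask * W2[c])

def prediction_difference(x, c, n_samples=500):
    """Robnik-Sikonja & Kononenko: weight of evidence per feature.
    Positive values = evidence for class c, negative = evidence against."""
    def prob(v):
        z = scores(v)
        e = np.exp(z - z.max())
        p = e / e.sum()
        return np.clip(p[c], 1e-9, 1 - 1e-9)

    def log_odds(p):
        return np.log2(p / (1 - p))

    we = np.zeros_like(x)
    for i in range(x.size):
        xs = np.tile(x, (n_samples, 1))
        # Replace feature i with draws from an assumed marginal,
        # then average the class probability over the samples.
        xs[:, i] = rng.standard_normal(n_samples)
        p_marg = np.mean([prob(v) for v in xs])
        we[i] = log_odds(prob(x)) - log_odds(p_marg)
    return we

x = rng.standard_normal(16)
s = sensitivity_map(x, 2)       # signed gradient per input feature
we = prediction_difference(x, 2)  # signed evidence per input feature
```

Note how the weight-of-evidence output is naturally signed (red/blue in the figure), whereas the sensitivity map is typically displayed as an absolute magnitude, losing the for/against distinction.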
