Computer Vision News - September 2018

Focus on…

How can we arrive at an aligned interpolation, one with stable locations for visual landmarks? A shared parameterization is one way to tackle this. In a shared parameterization, each epoch (to be explained below) has a number of unique parameters and some parameters in common with the others. By sharing some of the parameters between neurons in this way, we make the visualizations align naturally: the shared parameters constrain nearby regions, which greatly reduces shifts in landmark locations between visualizations.

Let's go deeper into this with a coding example. Using Lucid's custom parameterization capability, we will build a shared parameterization that encourages alignment of visual landmarks and create visualizations that interpolate between two feature visualizations.

First, let's take a look at the two neurons on their own. Lucid's render.render_vis controls a visualization through a few components (parameterization, objective, model) which you can adjust completely independently:

```python
import lucid.optvis.objectives as objectives
import lucid.optvis.param as param
import lucid.optvis.render as render

neuron1 = ("mixed4a_pre_relu", 476)
neuron2 = ("mixed4a_pre_relu", 460)

# Render each neuron separately with its own (unshared) parameterization.
for neuron in [neuron1, neuron2]:
    param_f = lambda: param.image(128)
    objective = objectives.channel(*neuron)
    _ = render.render_vis(model, objective, param_f)
```
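To make the idea of shared parameters concrete, here is a minimal NumPy sketch of the principle (not Lucid's actual implementation): each frame gets its own small "unique" parameter tensor, and all frames additionally add in one shared low-resolution buffer. Because the coarse structure comes from the shared buffer, it is identical across frames, which is what keeps landmarks in place. All names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_frames, size = 6, 32

# Per-frame "unique" parameters: one small tensor per frame.
unique = rng.normal(0.0, 0.01, (n_frames, size, size))

# One low-resolution buffer shared by every frame.
shared_lowres = rng.normal(0.0, 0.01, (1, size // 4, size // 4))

def upsample(x, factor):
    # Nearest-neighbour upsampling of the shared low-res buffer
    # to the full frame resolution.
    return x.repeat(factor, axis=1).repeat(factor, axis=2)

# Each frame is its unique parameters plus the upsampled shared
# parameters (broadcast over the frame axis). The shared term is
# the same in every frame, so coarse structure cannot drift.
frames = unique + upsample(shared_lowres, 4)
```

During optimization, gradients from every frame would flow into `shared_lowres`, so all frames pull the shared coarse structure toward a common layout while their unique parameters handle frame-specific detail.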
