Ivan Sosnovik

I am a PhD student at Delta Lab at the University of Amsterdam, supervised by Arnold Smeulders. My research mainly focuses on developing networks that are equivariant to principal transformations of the data.

I received my MSc from Phystech and Skoltech, where I worked on the connection between neural networks and topology optimization under the supervision of Ivan Oseledets. I did my bachelor's at Phystech, where I studied experimental low-temperature physics.

Email | CV | Google Scholar | GitHub | Twitter


Publications

Scale Equivariance Improves Siamese Tracking

Ivan Sosnovik*, Artem Moskalev*, Arnold Smeulders

WACV, 2021

In this paper, we develop the theory for scale-equivariant Siamese trackers. We also provide a simple recipe for making a wide range of existing trackers scale-equivariant, so that they capture the natural scale variations of the target a priori.
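
The recipe itself is in the paper; as a rough illustration of scale-aware matching only (not the paper's implementation), the sketch below cross-correlates template features with the search region at several scales, so the response carries an explicit scale index. All names, shapes, and the scale set are illustrative assumptions.

```python
import torch.nn.functional as F

def multi_scale_xcorr(template, search, scales=(0.8, 1.0, 1.25)):
    """template: (C, h, w) target features, search: (C, H, W) search features.
    Returns one cross-correlation map per template scale."""
    responses = []
    for s in scales:
        # rescale the template features to mimic a scale change of the target
        t = F.interpolate(template.unsqueeze(0), scale_factor=s,
                          mode='bilinear', align_corners=False)
        # cross-correlation = conv2d with the (rescaled) template as the kernel
        r = F.conv2d(search.unsqueeze(0), t)
        responses.append(r[0, 0])
    return responses

# Taking the argmax over both location and scale gives a joint estimate of
# where the target is and how much its apparent size has changed.
```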


Scale-Equivariant Steerable Networks

Ivan Sosnovik, Michał Szmaja, Arnold Smeulders

ICLR, 2020

We introduce the general theory for building scale-equivariant convolutional networks with steerable filters. We develop scale-convolution and generalize other common blocks to be scale-equivariant.
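
As a minimal sketch of the scale-convolution idea (the paper uses a fixed steerable filter basis; here the kernel is naively rescaled instead, and all sizes are illustrative), one learnable filter is applied at several scales so the output gains an extra scale axis:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NaiveScaleConv2d(nn.Module):
    def __init__(self, in_ch, out_ch, ksize=5, scales=(1.0, 1.41, 2.0)):
        super().__init__()
        self.scales = scales
        self.weight = nn.Parameter(torch.randn(out_ch, in_ch, ksize, ksize) * 0.1)

    def forward(self, x):  # x: (B, C, H, W)
        outs = []
        for s in self.scales:
            k = int(round(self.weight.shape[-1] * s))
            k += (k + 1) % 2  # keep the kernel size odd
            # rescale the shared kernel to the current scale
            w = F.interpolate(self.weight, size=(k, k),
                              mode='bilinear', align_corners=False)
            outs.append(F.conv2d(x, w, padding=k // 2))
        # stack per-scale responses: (B, out_ch, n_scales, H, W)
        return torch.stack(outs, dim=2)
```

With such a layer, rescaling the input roughly corresponds to a shift along the scale axis of the output, which is what later scale-equivariant blocks (pooling, nonlinearities) rely on.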


Semi-Conditional Normalizing Flows for Semi-Supervised Learning

Andrei Atanov, Alexandra Volokhova, Arsenii Ashukha, Ivan Sosnovik, Dmitry Vetrov

ICML INNF, 2019

This paper proposes a semi-conditional normalizing flow model for semi-supervised learning. The model uses both labeled and unlabeled data to learn an explicit model of the joint distribution over objects and labels.
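
A minimal sketch of the semi-supervised objective such a joint model admits, assuming a hypothetical flow.log_prob(x, y) that returns log p(x | y) for a batch; this is not the paper's code, only an illustration of how labeled and unlabeled terms combine:

```python
import torch

def ssl_nll(flow, x_lab, y_lab, x_unlab, num_classes, log_prior):
    """log_prior: tensor of shape (num_classes,) with log p(y)."""
    # labeled term: log p(x, y) = log p(x | y) + log p(y)
    ll_lab = flow.log_prob(x_lab, y_lab) + log_prior[y_lab]

    # unlabeled term: marginalize the label out,
    # log p(x) = logsumexp_y [ log p(x | y) + log p(y) ]
    per_class = []
    for c in range(num_classes):
        y_c = torch.full((x_unlab.shape[0],), c, dtype=torch.long)
        per_class.append(flow.log_prob(x_unlab, y_c) + log_prior[c])
    ll_unlab = torch.logsumexp(torch.stack(per_class, dim=1), dim=1)

    return -(ll_lab.mean() + ll_unlab.mean())
```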


Neural Networks for Topology Optimization

Ivan Sosnovik, Ivan Oseledets

Russian Journal of Numerical Analysis and Mathematical Modelling 34 (4), 2019

In this research, we propose a deep learning-based approach for speeding up topology optimization methods. We formulate the problem as image segmentation and leverage deep learning to perform pixel-wise image labeling.
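
A minimal sketch of the segmentation formulation, assuming (as one plausible setup, not the paper's exact architecture) two input channels for an intermediate density field and its last update, mapped to a per-pixel prediction of the final structure:

```python
import torch.nn as nn

# tiny illustrative encoder-decoder; layer sizes are arbitrary
model = nn.Sequential(
    nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
    nn.Upsample(scale_factor=2, mode='bilinear', align_corners=False),
    nn.Conv2d(32, 1, 3, padding=1),
    nn.Sigmoid(),  # per-pixel probability of material presence
)
# Trained with a pixel-wise binary cross-entropy against converged designs,
# a network like this replaces many optimizer iterations with one forward pass.
```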

Open Source Projects

Teaching

Students

Reviewing