Md Nasir Uddin Laskar, L. G. Sanchez Giraldo, O. Schwartz. Deep learning captures V2 selectivity for natural textures. Computational and Systems Neuroscience (Cosyne) abstract, 2017.

There has been substantial recent progress in building supervised deep convolutional neural networks (CNNs) for visual scene recognition. The highest layers of CNNs have been intriguingly linked to high-level object recognition areas in visual cortex, and there is some indication that middle layers are more predictive of mid-level visual areas. Here we focus on the second layer of CNNs and its potential compatibility with visual area V2. Recent experimental work and analyses provide a compelling case for V2 selectivity for visual textures [1], [2]. Following these experiments, we probed CNNs with synthesized textures and spectrally matched noise stimuli to test for correspondence between CNNs and the biological visual system. We found a good qualitative correspondence between layers 1 and 2 of the deep convolutional network and experimental results in areas V1 and V2, respectively. We evaluated this across a number of metrics: modulation index for texture selectivity; between- versus within-texture-family variance; t-SNE visualization of texture clustering; representational similarity analysis; and texture recognition scores. Our findings will lead to a better understanding of mid-level visual cortical representations and how they may develop hierarchically.

[1] J. Freeman, C. M. Ziemba, D. J. Heeger, E. P. Simoncelli, and J. A. Movshon, "A functional and perceptual signature of the second visual area in primates," Nature Neuroscience, vol. 16, no. 7, pp. 974-981, 2013.

[2] C. M. Ziemba, J. Freeman, J. A. Movshon, and E. P. Simoncelli, "Selectivity and tolerance for visual texture in macaque V2," Proceedings of the National Academy of Sciences (PNAS), vol. 113, no. 22, 2016.
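The texture modulation index mentioned above can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes, per Freeman et al. [1], that the index is (texture − noise) / (texture + noise), computed per unit from mean responses to naturalistic texture and spectrally matched noise stimuli; the variable names and example values are hypothetical.

```python
import numpy as np

def modulation_index(r_texture, r_noise):
    """Per-unit texture modulation index, (texture - noise) / (texture + noise).

    Positive values indicate stronger responses to naturalistic texture
    than to spectrally matched noise; values near zero indicate no
    preference. Inputs are mean responses (e.g. firing rates or CNN
    activations) of shape (n_units,).
    """
    r_texture = np.asarray(r_texture, dtype=float)
    r_noise = np.asarray(r_noise, dtype=float)
    return (r_texture - r_noise) / (r_texture + r_noise)

# Hypothetical mean responses for three units
tex = np.array([10.0, 5.0, 8.0])
noise = np.array([5.0, 5.0, 2.0])
mi = modulation_index(tex, noise)
print(mi)  # [0.3333..., 0.0, 0.6]
```

Averaging this index across units in a layer gives a single number per layer, which is one way to compare CNN layers against the V1/V2 values reported experimentally.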