Strong and Precise Modulation of Human Percepts via Robustified ANNs

Title: Strong and Precise Modulation of Human Percepts via Robustified ANNs
Publication Type: Conference Paper
Year of Publication: 2023
Authors: Gaziv, G, Lee, MJ, DiCarlo, JJ
Conference Name: Neural Information Processing Systems
Conference Location: New Orleans, Louisiana
Abstract:

The visual object category reports of artificial neural networks (ANNs) are notoriously sensitive to tiny, adversarial image perturbations. Because human category reports (aka human percepts) are thought to be insensitive to those same small-norm perturbations – and locally stable in general – this argues that ANNs are incomplete scientific models of human visual perception. Consistent with this, we show that when small-norm image perturbations are generated by standard ANN models, human object category percepts are indeed highly stable. However, in this very same “human-presumed-stable” regime, we find that robustified ANNs reliably discover low-norm image perturbations that strongly disrupt human percepts. These previously undetectable human perceptual disruptions are massive in amplitude, approaching the same level of sensitivity seen in robustified ANNs. Further, we show that robustified ANNs support precise perceptual state interventions: they guide the construction of low-norm image perturbations that strongly alter human category percepts toward specific prescribed percepts. In sum, these contemporary models of biological visual processing are now accurate enough to guide strong and precise interventions on human perception.
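The intervention the abstract describes can be viewed as a targeted, low-norm adversarial optimization against a robustified classifier: search for a small L2-bounded perturbation that drives the model's category report toward a prescribed class. The sketch below is a minimal illustration of that kind of procedure, assuming a PyTorch image classifier; the model weights (a standard, non-robustified ResNet-50 stand-in), target class, perturbation budget, and step sizes are placeholder assumptions, not the authors' released code or hyperparameters.

```python
import torch
import torch.nn.functional as F
from torchvision import models

def targeted_low_norm_perturbation(model, image, target_class,
                                   eps=10.0, step_size=0.5, n_steps=64):
    """Nudge `image` toward `target_class` under `model`, keeping the
    perturbation inside an L2 ball of radius `eps` (pixels in [0, 1])."""
    model.eval()
    x0 = image.clone().detach()
    delta = torch.zeros_like(x0, requires_grad=True)
    target = torch.tensor([target_class])

    for _ in range(n_steps):
        logits = model(x0 + delta)
        # Targeted objective: decrease cross-entropy w.r.t. the prescribed class.
        loss = F.cross_entropy(logits, target)
        (grad,) = torch.autograd.grad(loss, delta)

        with torch.no_grad():
            # Normalized gradient descent step on the perturbation.
            delta -= step_size * grad / (grad.norm() + 1e-12)
            # Project back onto the L2 ball of radius eps.
            norm = delta.norm()
            if norm > eps:
                delta *= eps / norm
            # Keep the perturbed image inside the valid pixel range.
            delta.copy_(torch.clamp(x0 + delta, 0.0, 1.0) - x0)

    return (x0 + delta).detach()

# Usage with a standard torchvision model as a stand-in; the paper's effects
# depend on adversarially trained ("robustified") weights, which are not loaded here.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
image = torch.rand(1, 3, 224, 224)   # placeholder image tensor, shape (N, C, H, W)
perturbed = targeted_low_norm_perturbation(model, image, target_class=207)
```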

URL: https://openreview.net/pdf?id=5GmTI4LNqX
Refereed Designation: Refereed
