I am currently a research scientist on the Machine Learning Team at NIH/NIMH, where I collaborate with experimentalists across a wide range of fields to build computational models that explain neural data. My primary research interest is developing models to understand the computations carried out in the brain. I am particularly passionate about integrating end-to-end task-driven learning on neural data with both behavioral and neuronal interventions, aiming to uncover specific mechanisms and objectives in neural processing.
Prior to NIH, I received a joint PhD in Neural Computation and Machine Learning from Carnegie Mellon University, where I had the privilege of being advised by Mike Tarr and Leila Wehbe. My PhD work focused on modeling visual and semantic processing in the human brain: I collected data using functional Magnetic Resonance Imaging (fMRI) and used models from computer vision and natural language processing to predict brain responses to natural images and movies.
Before CMU, I received a B.A. in Statistics and Cognitive Science from UC Berkeley.
Outside of research, I enjoy bouldering, mountaineering, cooking, and reading.
Joint Natural Language and Image Pre-training Builds Better Models of Human Higher Visual Cortex [paper] [poster] [talk]
Aria Y. Wang, Kendrick Kay, Thomas Naselaris, Michael J. Tarr, Leila Wehbe
In press at Nature Machine Intelligence.
Selectivity for Food in Human Ventral Visual Cortex [paper]
Nidhi Jain, Aria Y. Wang, Margaret M. Henderson, Ruogu Lin, Jacob S. Prince, Michael J. Tarr, Leila Wehbe
Communications Biology, 2023.
Joint Interpretation of Representations in Neural Network and the Brain [paper]
Aria Y. Wang*, Ruogu Lin*, Michael J. Tarr, Leila Wehbe
ICLR Workshop 2021 - "How Can Findings About The Brain Improve AI Systems?"
Neural Taskonomy: Inferring the Similarity of Task-Derived Representations from Brain Activity [paper] [poster] [website]
Aria Y. Wang, Michael J. Tarr, Leila Wehbe
Neural Information Processing Systems (NeurIPS) 2019
Learning Intermediate Features of Object Affordances with a Convolutional Neural Network [paper]
Aria Y. Wang, Michael J. Tarr
Conference on Cognitive Computational Neuroscience (CCN) 2018
*Equal contribution
Image Embeddings Informed by Natural Language Improve Predictions and Understanding of Human High-Level Visual Cortex
Aria Y. Wang, Michael J. Tarr, Leila Wehbe
Poster presented at Conference on Cognitive Computational Neuroscience (CCN) 2022
Image Embeddings Informed by Natural Language Improve Predictions and Understanding of Human High-Level Visual Cortex [youtube]
Aria Y. Wang, Michael J. Tarr, Leila Wehbe
Talk presented at the 4th Neuromatch Conference 2021
Joint Interpretation of Representations in Neural Network and the Brain
Aria Y. Wang*, Ruogu Lin*, Michael J. Tarr, Leila Wehbe
Talk presented at the ICLR Workshop 2021 - “How Can Findings About The Brain Improve AI Systems?”
Neural Taskonomy: Inferring the Similarity of Task-Derived Representations from Brain Activity
Aria Y. Wang, Leila Wehbe, Michael J. Tarr
Poster presented at Conference on Neural Information Processing Systems (NeurIPS) 2019
Expanding Visual Feature Spaces towards a General Encoding Model of Scene Perception
Aria Y. Wang, Michael J. Tarr, Leila Wehbe
Poster presented at Society for Neuroscience (SfN) 2019
Neural Taskonomy: Inferring the Similarity of Task-Derived Representations from Brain Activity
Aria Y. Wang, Leila Wehbe, Michael J. Tarr
Poster presented at the Algonauts Workshop at MIT
Learning Intermediate Features of Affordances with a Convolutional Neural Network
Aria Y. Wang, Michael J. Tarr
Poster presented at Conference on Cognitive Computational Neuroscience (CCN) 2018
Learning Intermediate Features of Affordances with a Convolutional Neural Network
Aria Y. Wang, Michael J. Tarr
Poster presented at the Annual Meeting of the Vision Science Society (VSS) 2018
*Equal contribution