Thinking in Pictures
This project was inspired by the book Thinking in Pictures by Temple Grandin, a professor of animal science who is also on the autism spectrum and who feels that she is a visual thinker. The goal of this project is to better understand how visual thinkers process information and experience the world around them. This project involves building and studying visual-imagery-based AI systems and developing new assessments to measure visual thinking in people.
Selected publications
Teaching Social Skills Visually
In this project, we are working with collaborators in Vanderbilt’s Open-Ended Learning Environments group and at the Vanderbilt Kennedy Center’s Treatment and Research Institute for Autism Spectrum Disorders (TRIAD) to develop new, visually oriented, technology-based approaches for teaching theory of mind and social skills to adolescents on the autism spectrum.
Selected publications
Attention and Wearable Cameras
Visual attention impacts virtually every aspect of intelligent behavior in humans, from perception and learning to communication and social interaction. Recent advances in wearable technology now enable us to measure human visual attention in real-world settings. This project leverages wearable camera and eye-tracking technologies to support research into the relationships between visual attention, learning, and intelligent problem-solving.
Selected publications
Data Visualization
The goal of this project is to understand visual cognition in the context of human data visualization activities, including studying and modeling the roles of visual perception (what you see), semantic knowledge (what you know), and goals (what you are trying to do). These models will help to identify factors that contribute to human performance on data visualization tasks and will also lay foundations for developing new intelligent data visualization technologies.
Selected publications
AI in Cognitive Assessments
The goal of this project is to develop new AI tools that improve the usefulness of standardized cognitive assessments used in research and clinical practice. We focus mostly on nonverbal cognitive assessments, such as Raven’s Progressive Matrices, Leiter, Embedded Figures, and Block Design, and we examine how AI models can be used to make more detailed inferences about human response patterns.
Selected publications
Developmentally Inspired AI
Some biologically inspired approaches to AI aim to emulate the neural structure of the brain. This project takes a parallel approach, looking at developmental aspects of human intelligence. In particular, we study how the physical environment, the maturation of motor and attentional skills, and interactions with social actors all play a role in defining what, and how, human infants learn about the world.
Selected publications
Latest News
- Chris Ketchum has a poem accepted for publication Jun 13, 2019
- Roxanne Rashedi has an abstract accepted for oral presentation at AME 2019 Jun 1, 2019
- Maithilee Kunda gives invited talk at NAS Colloquium May 2, 2019
- Two papers accepted to CogSci 2019 Apr 11, 2019
- VU spotlight on Ben Scheer’s work with orangutans Feb 22, 2019
- Roxanne Rashedi presents poster at IES PI meeting Jan 9, 2019
- Temple Grandin visits our Imagery-Based AI class Nov 29, 2018
- Vanderbilt’s new Frist Center for Autism and Innovation Nov 8, 2018
- Ellis Brown, Fernanda Eliott, and Xiaotian Wang present work at ACS 2018 Aug 20, 2018
- Paper on visual imagery in AI published in journal Cortex Jul 28, 2018
- New $1.4M IES grant for teaching social skills to students on the autism spectrum Jul 1, 2018
- Tengyu Ma and Sean Cha present Toybox at CVPR workshop Jun 18, 2018
- Toybox dataset online! Jun 15, 2018
- Two papers accepted to CogSci 2018 Apr 13, 2018
- Maithilee Kunda presents lecture at AAAS headquarters Dec 4, 2017