Home

The AIVAS Lab does research at the intersection of artificial intelligence and cognitive science, in the area of computational cognitive systems. Most of our research involves studying how visual mental imagery contributes to learning and intelligent behavior in humans and in AI systems, with applications in cognitive assessment and special education, especially in relation to autism and other neurodiverse conditions. Many of our research directions were heavily inspired by the experiences and writings of Dr. Temple Grandin and others on the autism spectrum.

February 2020 – AIVAS Lab members with Anderson Cooper!

Our work featured on 60 Minutes

Video ▶ “Recruiting for talent on the autism spectrum” on CBS 60 Minutes with Anderson Cooper (Vanderbilt portion begins at 7:40).

Video ▶ A 90-second snippet from our portion of this 60 Minutes piece.

Research behind the demo featured on 60 Minutes, including a more in-depth look at the data from our volunteers Dan Burger and Anderson Cooper.

Our Goals
Most of the research that we do follows two main pathways. First, we build and study AI systems as a way to understand how people think, looking especially at differences across neurotypical and neurodiverse populations. Second, we use findings from cognitive science to advance the state of the art in AI, especially to develop new AI techniques for solving complex problems using visual imagery.

What are Visual Analogical Systems?
The term analogical means that something has an organized relationship with something else, like an analogy. In AI and cognitive science, analogical representations refer to ways of portraying information that retain a correspondence with the real world. For example, an image of a cat is analogical because the 2D spatial information contained in the image corresponds to what the cat looks like in real life. On the other hand, the word “cat” is not an analogical representation, because it has no such correspondence. Most of the AI systems that we build use visual analogical representations as the core knowledge representations that support learning, problem solving, and other intelligent behaviors.
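As a toy illustration (a hypothetical sketch, not code from any of our systems; the grid and variable names are purely illustrative), the contrast can be shown in a few lines: a small 2D array preserves the spatial layout of the thing it depicts, so spatial operations on the representation mirror operations on the depicted object, whereas a bare symbolic label has no such internal structure.

```python
import numpy as np

# Hypothetical example: two ways to represent the letter "T".

# Analogical representation: a 2D pixel grid whose spatial layout
# mirrors the shape of the letter itself.
letter_t_image = np.array([
    [1, 1, 1],
    [0, 1, 0],
    [0, 1, 0],
])

# Non-analogical (symbolic) representation: the label carries no
# internal structure corresponding to what the letter looks like.
letter_t_symbol = "T"

# A spatial transformation such as rotation is meaningful for the
# image representation...
rotated = np.rot90(letter_t_image)
print(rotated)

# ...but there is no corresponding spatial operation on the bare symbol.
print(letter_t_symbol)
```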

Summer 2019

[Lab photos]

Summer/Fall 2019 – Orangutans!

[Photo: orangutan using the cognitive enrichment app]

See orangutans using our cognitive enrichment app at Zoo Atlanta!

Video ▶ Dumadi uses the musical instruments “game”—apparently his favorite part of the app.

Video ▶ Madu is surprised (and touched, according to the zookeepers who knew her) to see a video of her late orangutan friend Alan in the “Zoo-Videos-Youtube” part of the app.

Video ▶ App Inspiration leads Vanderbilt student to code for orangutans

Find Your Impact: Student creates app for orangutans. Vanderbilt Research News. Feb. 22, 2019.

Scheer, B., Renteria, F. C., and Kunda, M. (2019). Technology-based cognitive enrichment for animals in zoos: A case study and lessons learned. In Proceedings of the 41st Annual Meeting of the Cognitive Science Society, pp. 2741–2747. [pdf]

Spring 2019

[Lab photo]

Fall 2018: Temple Grandin visits our Imagery-based AI class!

Video ▶ Short video highlighting this classroom visit

Video ▶ Dr. Grandin’s Chancellor’s Lecture

Grandin rejects low expectations, insists workforce critically needs people with autism in VU lecture, Vanderbilt News, Nov. 30, 2018.

Summer 2018b (Successful escape!)

[Photo: escape room outing]

Summer 2018a

[Photo: arcade outing]

Spring 2018

[Lab photo]

Summer 2017

[Photo: lab potluck]

Spring 2017 (“We ^almost escaped”…)

[Photo: escape room outing]

Summer 2016 (An exercise in gradient descent)

[Photo: lab mini-golf outing]