About Our Project

In natural environments, events typically produce a combination of energies that activate receptors in multiple sensory modalities (e.g., a bird flapping its wings activates both visual and auditory receptors). Such multi-sensory activation confers perceptual advantages: detection, localization, and identification are enhanced relative to stimulation of a single modality, and behavioral responses are faster than with uni-sensory stimulation. These behavioral effects have been well characterized, but their neuronal basis has not. Specifically, it is not clear how the activity patterns of neurons relate to perceptual variability on a trial-by-trial basis, which brain sites are critical for multi-sensory perception, or what computations underlie these behavioral and neurophysiological responses. Our experiments are designed to answer these questions.

Experimental Details:
Monkeys will orient their eyes toward the most salient event in an event-rich environment. These events can be auditory, visual, or combined auditory-visual, and the monkeys will be rewarded for correct choices. Simultaneous recording of neurophysiological and behavioral activity will allow us to correlate the two and relate neural activity to perception on a trial-by-trial basis. A central requirement is the ability to generate visual stimuli whose perceptual reliability we can systematically vary.

Design Goal:
We would like to design a bank of LEDs (16 x 16 or 32 x 32) that covers the frontal visual field of the monkey. The array will be controlled by an Arduino that can be added to an existing auditory system. The LEDs should be addressable in two modes:

  1. Individual LEDs
  2. Gaussian or Gabor patches, specified by the center of the patch and a standard deviation/width parameter, with the peak luminance of the patch also controlled by the Arduino system (see the sketch after this list).
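
As a rough illustration of mode 2, below is a minimal host-side sketch of the brightness computation for a circular Gaussian patch, assuming a 16 x 16 panel and 8-bit per-LED brightness. The grid size, the function names (gaussianPatch, singleLed), and the parameter choices are illustrative assumptions, and no particular LED driver library is assumed; on the Arduino, the resulting brightness map would be streamed to whatever driver hardware is chosen. A Gabor patch would additionally multiply the Gaussian envelope by a sinusoidal carrier around a mean luminance.

  // Host-side sketch of the Gaussian-patch brightness computation.
  // Grid size, 8-bit brightness, and all names here are illustrative
  // assumptions; no specific LED driver library is assumed.
  #include <cmath>
  #include <cstdint>
  #include <cstdio>

  constexpr int GRID = 16;  // assumed 16 x 16 LED panel

  // Fill a brightness map (0-255 per LED) with a circular Gaussian patch
  // centered at (cx, cy), with standard deviation sigma (in LED units)
  // and peak luminance 'peak' at the center.
  void gaussianPatch(uint8_t out[GRID][GRID],
                     float cx, float cy, float sigma, uint8_t peak) {
    const float twoSigmaSq = 2.0f * sigma * sigma;
    for (int y = 0; y < GRID; ++y) {
      for (int x = 0; x < GRID; ++x) {
        const float dx = x - cx;
        const float dy = y - cy;
        const float g = std::exp(-(dx * dx + dy * dy) / twoSigmaSq);
        out[y][x] = static_cast<uint8_t>(peak * g + 0.5f);
      }
    }
  }

  // Mode 1 (individual LEDs) is a direct write into the same map.
  void singleLed(uint8_t out[GRID][GRID], int x, int y, uint8_t level) {
    out[y][x] = level;
  }

  int main() {
    uint8_t frame[GRID][GRID] = {};
    gaussianPatch(frame, 7.5f, 7.5f, 2.0f, 255);  // centered patch, sigma = 2 LEDs
    for (int y = 0; y < GRID; ++y) {              // print map as a quick sanity check
      for (int x = 0; x < GRID; ++x) std::printf("%4d", frame[y][x]);
      std::printf("\n");
    }
    return 0;
  }

Because the patch is defined entirely by its center, width, and peak luminance, the same three parameters give a natural handle for manipulating stimulus reliability from trial to trial.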