Bates College has received a National Science Foundation grant of $3.97 million to create a groundbreaking Visual Experience Database to support research in fields that rely on the analysis and recognition of images, such as neuroscience, cognitive science, and artificial intelligence.

Equipped with a new lab in Hathorn Hall, Assistant Professor of Neuroscience Michelle Greene studies visual perception. (Theophil Syslo/Bates College)

Assistant Professor of Neuroscience Michelle Greene is the principal investigator for the project to create a Visual Experience Database. (Theophil Syslo/Bates College)

The largest-ever federal grant awarded to Bates, the four-year award will fuel the creation of a vast gallery of videos that depict what, and how, people see as they go about daily activities. Bates developed the grant proposal collaboratively with researchers at North Dakota State University and the University of Nevada, Reno.

Michelle R. Greene, an assistant professor of neuroscience at Bates who studies how the brain makes sense of what we see, is the principal investigator for the project.

“I’m delighted and overwhelmed,” said Greene, “and intensely excited. I have a terrific team of co-principal investigators, so fostering and furthering those connections will make these next four years really fun.”

The co-principal investigators are Benjamin Balas, a neuroscientist and associate professor of psychology at North Dakota State, and Paul MacNeilage and Mark Lescroart, neuroscientists and assistant professors of psychology at Nevada, Reno.

“We are honored to be the lead partner in this multi-state collaboration,” said Bates President Clayton Spencer. “This grant is important for Maine, Nevada, and North Dakota, and it also has the potential for significant impact on the future of vision research, neuroscience, and artificial intelligence.”

“This grant brings deserved recognition to Professor Greene, an exemplary member of the Bates faculty who has contributed significantly to the body of published work on visual perception and who engages students extensively in her research,” said Malcolm Hill, vice president for academic affairs and dean of the faculty.

“It’s meaningful that the National Science Foundation has chosen a national liberal arts college like Bates to take the lead on this project,” Hill added. “Indeed, Bates is well-positioned to collaborate on the work to create a Visual Experience Database to support researchers around the world as they grapple with, and bring greater understanding to, the social and ethical consequences of computer vision and artificial intelligence.”

Student researcher Hanna De Bruyn ’18 (left) works with Assistant Professor of Neuroscience Michelle Greene to prepare an EEG test in March 2018. (Phyllis Graber Jensen/Bates College)

“Maine’s colleges and universities are consistently at the forefront of groundbreaking research that improves people’s lives and enhances our understanding of the world around us,” said U.S. Sens. Susan Collins and Angus King of Maine in a joint statement.

They added, “Through this funding, Bates College will partner with two other universities to build a database to study human behavior and development through first-person experiences. We applaud the NSF’s investment in Bates’ project, which will help advance the field of vision science.”

The Visual Experience Database, or VED, will comprise more than 240 hours of video created specifically for the project and catalogued in a publicly accessible database. Wearing cameras that simulate human vision, as well as devices that track head and eye movements, observers will undertake routine activities such as walking, shopping, or touring a museum.

By enlisting diverse observers local to each of the three participating institutions, the project will record how changes in environment, age, and task affect the act of looking.

Much of the data used in such fields as visual neuroscience, psychology, computer vision (a branch of artificial intelligence), and computational sociology consists of vast collections of still and moving images. These are curated largely from public online resources such as YouTube and Google — but because they were not intended for research purposes, they are compromised by the many biases their creators bring to them.


Artificial intelligence systems have biases because they’re not being fed enough solid data, says Michelle Greene.

The reasons someone may choose a particular photo subject, frame an image a certain way, or upload one image and not another are all biases that diminish the material’s value as data. Such “biases exist at every level,” said Greene, “and all of the databases that we’ve been using for years are subject to them.” The VED assets, in contrast, will be created specifically to represent ordinary scenes and will be subject to experimental controls.

Undergraduates at all three schools, including 28 at Bates over the four-year grant period, will take part in the research. Among other roles, students will serve as videographers, creating assets for the VED, Greene said.

“I can imagine that next summer there’s going to be a small army of folks going out into various parts of the world, seeing what the world looks like when we’re hiking, when we’re at the beach, when we’re grocery shopping, and all kinds of more mundane things.”


The project to create a Visual Experience Database will benefit Bates students “at all levels,” says Michelle Greene, an assistant professor of neuroscience at Bates who is the principal investigator for the project.

Students will benefit from the many research questions that the VED will engender. “A database like this essentially means there will be thesis projects for decades to come,” said Greene. “There are many basic questions that we haven’t been able to answer because we haven’t had the data.”

She said that Bates’ being a liberal arts college will enrich the VED project in distinctive ways. “One is our focus on equity and inclusion. We’re trying to get a diverse set of visual experiences to catalog in the VED, and I think that holding that in the forefront is something a liberal arts college can do that might be a somewhat harder sell” at other types of institutions.

Bates’ intimate scale, coupled with the liberal arts approach to education, “allows us to engage in some multidisciplinary and interdisciplinary thinking,” she added. “And one of the things I particularly love about Bates is that I engage in conversations with faculty members across the college — if I were at a larger institution, we probably wouldn’t cross paths and learn from one another.

“So I’m particularly excited about the ways in which this type of data can be used across disciplines.”

Innovation in artificial intelligence, in particular, stands to benefit from the VED. Model systems in computer vision “are very data-hungry,” said Greene. “They tend to require tens of millions of images, and have been downloading these tens of millions of images from the internet. We will now give them tens of millions of images that are more representative of daily-life experience.”

The VED will be a public resource. “This is taxpayer-supported,” Greene said, which means the database should be publicly accessible, both for the sake of transparency and because it is meant to serve as a public good.

She added, “We see throughout digital life that when a resource is available, people appropriate it in really interesting ways. I’m hoping that there may be artists, digital historians, computational sociologists that might want to use this database. And, as such, it should be available to everybody.”


Michelle Greene explains why it is important and valuable for society that the Visual Experience Database will be publicly available.

The VED grant was made through the NSF’s Research Infrastructure Improvement Program, part of the Experimental Program to Stimulate Competitive Research. These initiatives are designed to build research capabilities in underserved regions of the country and thereby make those regions more competitive in seeking other federal R&D funding.

The project team will release a suite of software tools for using the database. The team will also establish a program of “Big Data Skills Summer Workshops” to give students basic programming and computational literacy skills that will not only support their contributions to this project, but help prepare them for a variety of STEM occupations.

“If we can take the next generation of students and get them the best skills, the kind of experience I wish I’d had as a student early on, that is a key part of the workforce development component of the grant,” Greene said.