Colorful illustration of students and teachers working with AI
By Tomas Weber Illustrations by Alex Eben Meyer

Faculty across disciplines are diving into the exciting, data-crunching, AI world of GPMoo.

 

Katie Keith flicks a switch, and a thunderous roar, like an airplane taking off, fills a small, dark, windowless room in Jesup Hall. The noise is the sound of Williams entering a new era of human and machine interaction. For the first time, a rack of state-of-the-art, artificial intelligence (AI)-optimized computer servers, called a graphics processing unit (GPU) cluster, is on campus, online and ready for action.

The powerful new servers cost more than $100,000 each and are more commonly found at large research institutions than at liberal arts colleges. But four have been whirring away in the college’s server room since Thanksgiving.

The acquisition was funded by a nearly $745,000 grant from the Massachusetts Life Sciences Center—with an additional $83,200 from Williams—that Keith, a computer science professor, and Victor Cazares, a psychology professor, helped secure.

The GPU cluster, which the computer science faculty have affectionately named “GPMoo,” is already supercharging research and education on campus while enabling fresh collaborations.

“Over the last few years, AI has changed everything in our field,” says Jeannie Albrecht, department chair and professor of computer science. The servers are “a co-op for computing,” she says. “We are trying to support data analysis across disciplines.”

And the college has been adapting rapidly. The new servers follow the hiring over the last four years of three new department faculty with AI specialties: Keith, Mark Hopkins and Rohit Bhattacharya, who, along with computer science professor Stephen Freund and psychology professor Noah Sandstrom, helped land the grant.

The arrival of the GPU cluster has generated excitement on campus. Keith and Freund are fielding faculty requests to use the cluster for research projects from many disciplines, including economics, geosciences and psychology. Meanwhile, the new technology has fueled discussions about everything from the potential environmental impact to the ethics of AI.

Massive AI data centers have raised concerns over their water consumption, electronic waste and energy use, among other environmental impacts. However, GPMoo “uses about 30% less electricity overall than the previous servers, helping the college to reduce its greenhouse gas emissions,” says Tanja Srebotnjak, executive director of the college’s Zilkha Center for the Environment. Additionally, she says, “the college matches the cluster’s electricity use with renewable energy certificates, called RECs, as part of its overall ‘zeroing out’ of grid electricity-related emissions.”

In terms of the ethical implications, Keith points out that students’ hands-on experience with the GPU servers has increased their awareness of AI’s potential social impact—both for good and for ill. Together they explore questions such as “What do we want to do with AI?” “How do we regulate it?” and “How do we educate people about it?” To respond to the questions in an informed way, Keith says, “You need to understand how AI works, so you’re not relying on folk wisdom.”

To that end, the college’s Rice Center for Teaching has been leading community conversations and workshops focused on AI, including how tools like ChatGPT present both challenges and opportunities for teaching and learning at Williams.

“Our goal is to empower faculty and students to adapt to our rapidly changing school and work settings with the necessary skills,” says biology professor Matt Carter, the Rice Center’s faculty director. “By educating ourselves on the ethical and equity-focused concerns of teaching with AI, we can encourage students and faculty alike to do their work with integrity.”

The idea of acquiring this new technology began with a dog walk in the woods. Last spring, on one of their regular strolls near campus, Cazares was telling Keith about his need for more computing power to support his research investigating how brains encode memories. The work requires minimally invasive surgery to insert tiny cameras into the heads of mice in order to capture video of their neurons firing when exposed to food-associated sounds.

“Those videos end up turning into ungodly dense spreadsheets full of numbers,” Cazares says. And he knew that processing and analyzing that amount of data would be well beyond the capabilities of the college’s servers at the time, which ran on conventional central processing units, or CPUs.

Illustration of mice on building blocks

Keith, too, is familiar with the limitations of traditional computing. Working in social data science, she designs automated-language-processing tools and specializes in turning unruly datasets into lucid insights about human behavior. In 2023, she was part of a team of data scientists that input four decades of U.S. Supreme Court transcripts into a machine-learning model and discovered that justices were far more likely to interrupt female attorneys. Having experience working with big data, she knew that a GPU cluster was exactly what Cazares required.

CPUs are designed to handle a wide variety of tasks one after another, while GPUs are engineered to perform thousands of simple calculations in parallel. That parallelism makes them especially suited to training an AI model—a process that involves feeding the model curated datasets to refine the accuracy of its output—which they can do 10 times more quickly than a CPU.
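Why that parallelism matters can be sketched in a few lines of Python (a hypothetical illustration using NumPy, not code from GPMoo): a neural-network layer is, at bottom, one big matrix multiply, and every entry of the result is an independent dot product—exactly the kind of work a GPU’s thousands of cores can do side by side.

```python
import numpy as np

# Hypothetical illustration: a single neural-network layer is one big
# matrix multiply. Every entry of `outputs` is an independent dot
# product, so a GPU can compute thousands of them simultaneously,
# while a CPU works through them a handful at a time.
rng = np.random.default_rng(0)
inputs = rng.standard_normal((512, 256))   # 512 samples, 256 features each
weights = rng.standard_normal((256, 128))  # layer weights: 256 -> 128

outputs = inputs @ weights                 # 512 x 128 independent dot products

# One entry is just one input row dotted with one weight column, which
# is why the whole grid of results parallelizes so well.
assert np.allclose(outputs[3, 7], inputs[3] @ weights[:, 7])
print(outputs.shape)
```

Training a model repeats operations like this billions of times, which is where the GPU’s speedup compounds.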

Keith began teaching with GPMoo immediately. Students in her course “Natural Language Processing” use the cluster to explore the history of natural language processing—a subset of AI that gives computers the ability to interpret human language—from its very beginnings to the release of ChatGPT.

They also can work on an open-ended project of their choice, with most students opting to use GPMoo. “While we could teach students the theory behind the models and give them a sense of how the computation works, we were previously limited in the size of the datasets that they could work with,” Albrecht says. “Not anymore.”

Cazares also is using the new servers, along with sophisticated brain-imaging equipment purchased with the grant funds, in his research on neural activity in rodents, material that he teaches in his upper-level neuroscience course. “We can use machine learning to find patterns in the data that are elusive to a human brain because they are too multidimensional,” he says. “It allows us to make predictions and develop models that are just at the limit of our human capacities.”

With the help of student researcher Suzanne Penna ’25, Cazares can now observe mouse brains as they form memories in real time. The next step will be feeding all the data into an AI model powered by GPMoo to uncover patterns in neural activity. “We’re watching these neurons flash,” Cazares says. “It’s a technique I’ve been chasing my entire career. It is so special to see it happen.”

Last year, Bhattacharya and Keith, together with recent graduate Jacob Chen ’23.5, now a Ph.D. student in computer science at Johns Hopkins University, joined forces to work on a machine-learning project based around natural language processing.

For one of the servers’ first tasks, the team built an AI model to identify causal patterns in clinical research using the example of a new clot-busting drug. Testing its efficacy requires comparing the health outcomes of those who received the medication with those who did not. But the drug’s efficacy is affected by irregular heart rhythms, and the dataset contained no information about which patients suffered from this condition.

Illustration of a student and teacher collaborating on a project

These sorts of “confounding variables” are a common problem in clinical research. Keith, Bhattacharya and Chen were able to address them by feeding thousands of anonymized electronic health records—a “huge, untapped source of data,” Keith says—into a new kind of AI model that they developed, powered by the servers.
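A toy simulation shows why confounding matters and what adjusting for it looks like (a hypothetical Python sketch with made-up numbers, not the team’s actual model): if patients with a hidden heart condition are both likelier to receive the drug and likelier to fare worse, a naive comparison nearly erases a real benefit, while comparing like with like recovers it.

```python
import numpy as np

# Hypothetical data: an irregular heart rhythm (the confounder) makes a
# patient both likelier to be treated and likelier to have a bad outcome.
rng = np.random.default_rng(1)
n = 100_000
arrhythmia = rng.random(n) < 0.3
treated = rng.random(n) < np.where(arrhythmia, 0.7, 0.3)
# True drug effect: +0.10 recovery probability; arrhythmia: -0.30.
recovered = rng.random(n) < (0.5 + 0.10 * treated - 0.30 * arrhythmia)

# Naive comparison: treated patients include more arrhythmia cases, so
# the drug's benefit is masked and the estimate lands near zero.
naive = recovered[treated].mean() - recovered[~treated].mean()

# Adjusted comparison: estimate the effect within each stratum of the
# confounder, then average by stratum size. Inferring that hidden
# stratum from clinical notes is the role the AI model plays.
effects = [recovered[treated & mask].mean() - recovered[~treated & mask].mean()
           for mask in (arrhythmia, ~arrhythmia)]
adjusted = np.average(effects, weights=[arrhythmia.mean(), (~arrhythmia).mean()])
```

In this simulation the naive estimate sits near zero while the adjusted one recovers the built-in +0.10 effect; the real challenge, which the team’s model tackles, is that the confounder is not recorded and must be inferred.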

The team found that their model, which uses clinical notes to make a prediction about whether a person might suffer from an otherwise unrecorded medical condition, had the potential to evaluate the efficacy of new drugs better than other widely used methods. In 2024, they presented their paper at NeurIPS, one of the most important AI conferences in the world.

Students, too, are now harnessing the cluster’s power to conduct research. For his honors thesis, computer science major Matt Laws ’25 is building an AI model to predict if mutations in specific proteins increase disease risk. It is a project that is “far too complex to run locally on a laptop, and existing lab machines don’t possess the GPU power to do it, either,” says Laws, who plans to enroll in a Ph.D. program in the fall to study system security and machine learning.

For students like Laws who aim to work in tech, leaving college with hands-on experience using a GPU cluster is invaluable. “It’s a big part of what jobs are looking for these days,” says Albrecht.

Adds Laws: “Having a lot of experience working on GPMoo has prepared me incredibly well for the next steps of my career.”