LIGO prototype
Developed by Professor Emeritus Rainer Weiss ’55, PhD ’62, and his students, this 1970s prototype led to the Laser Interferometer Gravitational-wave Observatory (LIGO), a large-scale physics experiment that was ultimately able to detect the gravitational waves predicted by Einstein’s General Theory of Relativity. The work earned Weiss the 2017 Nobel Prize in physics.
“The experiments that LIGO was able to facilitate feel like magic to me, as a non-physicist,” Nuñez says. “Can you imagine what it was like to be there when they found out it worked? What an amazing moment for humanity!”
Kismet
One of the first social robots designed to simulate social interactions, Kismet was created in the 1990s by Cynthia Breazeal, SM ’93, ScD ’00, who is now MIT’s dean for digital learning and head of the Personal Robots Research Group at the MIT Media Lab. Originally controlled by 15 different computers, Kismet used 21 motors to create facial expressions and body postures.
“I have a lot of affinity for that particular artifact,” says Nuñez, who studied with Breazeal at the Media Lab. “It’s such a charismatic object; it’s one of the museum’s Instagram moments.”
IRGO
Developed by Julie Shah ’04, SM ’06, PhD ’11, IRGO is an interactive robot that museum visitors can help to train through artificial-intelligence demonstrations. “Our visitors are participating in real robotics research,” Nuñez says. “That is such a rare and special opportunity.”
Today Shah is the H.N. Slater Professor in Aeronautics and Astronautics at MIT and head of the Interactive Robotics Group within the Computer Science and Artificial Intelligence Laboratory. She shares her thoughts on AI in a nearby audio gallery. Other alumni featured in that gallery include Professor Rosalind Picard, SM ’86, ScD ’91, director of the Media Lab’s Affective Computing Research Group, and Media Lab PhD students Matt Groh, SM ’19, and Pat Pataranutaporn, SM ’20.
“We want to be able to expose the fact that there are communities of people behind everything you’re seeing,” Nuñez says.
Coded gaze
Visitors to the AI gallery can see the mask used by Joy Buolamwini, SM ’17, PhD ’22, to present a white face, rather than her own Black one, to facial recognition software, which she found was less accurate for people with dark skin. In her doctoral thesis, Buolamwini coined the term “coded gaze” to describe algorithmic bias.