Learn to think critically about machine learning | MIT News
Students in MIT 6.036 (Introduction to Machine Learning) study the principles behind powerful models that help doctors diagnose disease or help recruiters screen job candidates.
Now, thanks to the Social and Ethical Responsibilities of Computing (SERC) framework, these students will also pause to consider the implications of these artificial intelligence tools, which sometimes come with their share of unforeseen consequences.
Last winter, a team of SERC Fellows worked with instructor Leslie Kaelbling, Panasonic Professor of Computer Science and Engineering, and the 6.036 teaching assistants to infuse the weekly labs with material covering ethical computing, data and model biases, and fairness in machine learning. The process was started in the fall of 2019 by Jacob Andreas, X Consortium Assistant Professor in the Department of Electrical Engineering and Computer Science. SERC Fellows collaborate in multidisciplinary teams to help postdocs and faculty develop new course materials.
Because 6.036 is such a large course, more than 500 students enrolled in spring 2021 tackled these ethical dimensions alongside their efforts to learn new computing techniques. For some, it may have been their first experience thinking critically, in an academic setting, about the potential negative impacts of machine learning.
The SERC Fellows assessed each lab to develop concrete examples and ethics-related questions that fit each week’s material. Each brought a different set of tools: Serena Booth is a graduate student in the Interactive Robotics group at the Computer Science and Artificial Intelligence Laboratory (CSAIL). Marion Boulicault was a graduate student in the Department of Linguistics and Philosophy and is now a postdoctoral fellow at the MIT Schwarzman College of Computing, where SERC is based. And Rodrigo Ochigame was a graduate student in the History, Anthropology, and Science, Technology, and Society (HASTS) program and is now an assistant professor at Leiden University in the Netherlands. They worked closely with teaching assistant Dheekshita Kumar, MEng ’21, who was instrumental in developing the course material.
They brainstormed and iterated on each lab, working closely with the teaching assistants to ensure the content matched and advanced the core learning objectives of the course. At the same time, they helped the teaching assistants determine how best to present the material and lead conversations on topics with social implications, such as race, gender, and surveillance.
“In a class like 6.036, we are dealing with 500 people who are not there to learn ethics. They think they are there to learn the inner workings of machine learning, like loss functions, activation functions, etc. We have this challenge of trying to get these students to really participate in these discussions in a very active and engaged way. We did this by tying the social issues very closely to the technical content,” says Booth.
For example, in a lab on how to represent input features for a machine learning model, they introduced different definitions of fairness, asked students to consider the pros and cons of each definition, and then challenged them to think about what characteristics should be captured in a model to make it fair.
Four labs have now been published on MIT OpenCourseWare. A new team of SERC Fellows is revising the remaining eight based on feedback from instructors and students, emphasizing learning objectives, filling gaps, and highlighting key concepts.
An intentional approach
The effort in 6.036 shows how SERC aims to work with faculty in ways that work for them, says Julie Shah, SERC associate dean and professor of aeronautics and astronautics. The team adapted the SERC process to the unique nature of this large course and its tight time constraints.
SERC was established more than two years ago through the MIT Schwarzman College of Computing as part of an intentional effort to bring together faculty from different disciplines in a collaborative setting to co-create and launch new course materials focused on social and responsible computing.
Each semester, the SERC team invites about a dozen faculty members to join an action group dedicated to developing new curriculum materials (there are several SERC action groups, each with a different mission). They are deliberate about who they invite, seeking to include faculty members who are likely to form successful partnerships in smaller subgroups, says David Kaiser, associate dean of SERC, the Germeshausen Professor of the History of Science, and professor of physics.
These subgroups of two or three faculty members hone their shared interest over the semester to develop new ethics-related materials. But rather than one discipline serving another, the process is a two-way street: every faculty member brings new material back to their own class, Shah says. Faculty are drawn to the action groups from across MIT’s five schools.
“Part of this is stepping outside your normal disciplinary boundaries and building a language, and then trusting and collaborating with someone new outside of your normal circles. That’s why I think our intentional approach has been so successful. It’s great to pilot materials and bring new content into your course, but building relationships is key. That makes it something valuable to everyone,” she says.
To have an impact
Over the past two years, Shah and Kaiser have been impressed with the energy and enthusiasm surrounding these efforts.
They’ve worked with about 80 faculty members since the program began, and more than 2,100 students have taken courses that included new SERC content in the past year alone. These students are not necessarily all engineers – approximately 500 have been exposed to SERC content through courses offered at the School of Humanities, Arts and Social Sciences, the Sloan School of Management and the School of Architecture and Planning.
At the heart of SERC is the principle that ethics and social responsibility in computing should be integrated into all areas of teaching at MIT, so that they become just as relevant as the technical portions of the curriculum, Shah says. Technology, and AI in particular, now touches nearly every industry, so students in all disciplines need training that helps them understand these tools and think deeply about their power and pitfalls.
“It’s not someone else’s job to figure out why or what happens when things go wrong. It’s all of our responsibility and we can all be equipped to do it. Let’s get used to this. Let’s build that muscle to be able to pause and ask those tough questions, even if we can’t identify a single answer at the end of a set of problems,” Kaiser says.
For the three SERC Fellows, carefully framing ethical questions was particularly difficult when there was no answer key to refer to. But thinking deeply about these thorny issues also helped Booth, Boulicault, and Ochigame learn, grow, and see the world through the lens of other disciplines.
They hope the 6.036 undergraduates and teaching assistants will take these important lessons to heart, and into their future careers.
“I was inspired and energized by this process, and I learned a lot, not only about the technical material, but also about what you can accomplish when you collaborate across disciplines. Just the sheer scale of this effort was exciting. If we have this cohort of 500 students going out into the world with a better understanding of how to think about these kinds of issues, I feel like we could really make a difference,” Boulicault says.