The first dean of the University of Wisconsin-Madison’s new College of Computing and Artificial Intelligence says the institution will be a vital non-industry leader shaping the future of AI.
The college is set to open July 1 as the university's first new academic division in 43 years. Last week, the university named the new college's founding dean: Remzi Arpaci-Dusseau, the current director of the university's School of Computer, Data and Information Sciences and special advisor to the provost for computing. Arpaci-Dusseau will guide the launch of the college and oversee its growth and academic direction.
He said that, unlike the tech industry voices currently directing the future of AI, the college can work to prioritize ethics and the public good. He expects the college to include perspectives that are both supportive and critical of AI.
“We’re not here to be AI cheerleaders,” Arpaci-Dusseau told WPR’s “Wisconsin Today.” “A university is here to be an educated, thoughtful leader in these very challenging spaces.”
Arpaci-Dusseau joined “Wisconsin Today” for a look at AI ethics, data centers, cross-college collaboration, the job market and more.
This article has been edited for brevity and clarity.
Kate Archer Kent: How do you plan for this college to interact with the rest of the university?
Remzi Arpaci-Dusseau: There’s a research side of that, and there’s an educational side of that. The research side is through partnerships. For example, we’re about to launch a new initiative on health and AI, and the future of research and education in that space, between us and the School of Medicine and Public Health. That’s such a great example of the type of possibilities that are there, because so much is changing about what it means to be a person working in health tech. What does it mean to be an MD in the future? How much do you have to know about the world of AI to be a successful doctor in the future? We’re going to address that directly.
It also has an educational bent. We’re hoping that, as we develop new classes, we think about not just classes for our students … but we also hope to be in partnership with all other colleges as a center point for education in what students need to know today that’s going to help them when they leave here to enter the workforce. That’ll probably start with a class or two and eventually be a certificate. I’m hoping we can do a lot of dual majors and dual degrees across all the other colleges, and I think we’ll be really well-positioned to do that.
KAK: What is your approach to teaching ethical issues in AI?
RAD: (This is) exactly why a university has to have a strong unit that is at the center of this world. The world is changing, and a lot of it seems … like the industry is shaping this. Where’s the voice of something other than the tech industry? I think a university can be a leader in that.
I’m a computer systems expert, so my expertise is not in ethics, and there are people in our college who have that expertise. And of course, we’ll be working with others on campus who have expertise, (like) the Department of Philosophy. There are a lot of places where we’ll work together to think about what kind of ethical training students should have here.
It’s one thing to have a class, and it’s quite another thing to make the hard decisions when you’re faced with an ethical conundrum in the real world. So what we can hope to do here is expose students to these ideas and thoughts in class — and as much as we can in real-world situations. Like much of campus, we’ll be doing more experiential learning, where you work in more industry settings, so you see real problems. Hopefully we can give students enough courage of their own conviction, so when they leave here and they’re faced with a tough ethical choice, they know what the right thing to do is.
KAK: We are seeing communities across Wisconsin pushing back against these AI data centers. The university exists to serve this state. How do you reconcile encouraging all of this investment in AI with the concerns you see coming out of communities big and small in Wisconsin?
RAD: We’re not here to be AI cheerleaders. A university is here to be an educated, thoughtful leader in these very challenging spaces. The one thing we’re doing is we’re not burying our heads in the sand. We’re making sure that we have a strong unit that has people in it that are thinking about building the technology as well as the critique of the technology, and working with others — like the La Follette School of Public Affairs — on shaping public policy. The amazing thing about this campus is that we do everything. We’ll work with the law school. What should the future laws look like in this space?
We’re trying to make sure that when we enter those discussions, whether you’re the biggest proponent who likes to build the technology and use it for things, or you’re a person who’s critiquing it, that there’s a baseline of understanding and education. I think that’s what our role can be: to bring light and clarity to these issues.
If I’m a community and a Google or a Meta or OpenAI comes in and wants to put a data center there, I would be concerned. Because I think it’s very easy to have a short-term perspective on the investment happening here, but clearly there could be a long-term cost. And I think being cognizant of that is smart. And I think we should make sure we’re going slowly enough and deliberately enough that when communities make those decisions, it’s in a way that is good in the short term but, more importantly, good in the long term. What we can hopefully do is be some point of light on that (issue) that isn’t just coming from industry.
KAK: The college wants to hire 50 faculty. How are you going to attract that much Silicon Valley-tier talent?
RAD: We can’t compete with Silicon Valley salaries. Our Ph.D. students, for example, make more on day one than any faculty member on this campus does. It’s completely insane, honestly. So that’s not how we’re setting ourselves up. It’s not a competition we’re trying to enter into or win.
We want to hire people here who understand what it is to be a professor, a researcher, a teacher, a mentor. And there’s a lot of people who, myself included … have not followed that (industry) path because it wasn’t what I was interested in. I’m so much more energized by being around students and by being in an intellectual environment.
I’m not unique in that regard. I think there are many people who are attracted to this, and particularly now: there’s a lot of money that people can make out in industry, but there are plenty of people who really want to do something other than that. They want to come here and be part of this mission. And I think what’s great about the new college is that it will attract a lot of those people, because they see the opportunity to really think about (how to) help shape the significant technological change going on.
KAK: Graduates are facing the toughest job market since the COVID pandemic. At the same time, we’re seeing these entry-level coding and data roles being automated. What is the outlook like for AI and computing jobs straight out of college?
RAD: I was out in Silicon Valley a few weeks ago. A lot of our alums end up in Wisconsin, but plenty end up out there too. I went from company to company. Google, Meta, Snowflake, OpenAI — everybody’s hiring. And plenty of them hire junior people. … Every company I went to, they said people need to lead this work.
I was talking to a distinguished engineer at one of these companies and he was talking about how the way we write code has changed, but what hasn’t changed is that it needs human judgment. The center of designing code today is you work with an (AI) agent, but you’re the chief architect. The agent can help you, but if you just go to the agent and say, “Do some stuff,” it’s garbage in, garbage out. It’s a lot smarter than you think, and it’s easier today to build a simple app than it was five years ago. But at 100 percent of the companies I talked to, people are guiding these processes.