CIO Interviews

Ep 63: AI Becomes a Campus-Wide Builder with Arizona State University CIO Lev Gonick

Guest: Lev Gonick
February 25, 2026
25 MIN

On the 63rd episode of Enterprise AI Innovators, hosts Evan Reiser (co-founder and CEO, Abnormal AI) and Saam Motamedi (General Partner, Greylock Partners) talk with Lev Gonick, Chief Information Officer at Arizona State University (ASU). ASU treats AI like a product and a platform: build internal capacity, run structured grant cycles to surface real workflows, and ship a “builder” that can support tens of thousands of users. Gonick also lays out why student agency matters, and why universities that wait for vendors risk becoming “nothing but vendor management.”

Quick hits from Lev:

On building an internal AI acceleration team: “We literally took, initially 20 and now it's 40, people dedicated from morning till night working on platform technology, security, compliance, tooling, building tools to support what we knew needed to grow up to be, again, a low-code/no-code kind of environment.”

On scaling demand through structured internal grants: “We thought there'd be, you know, 40 or 50 great ideas. As we speak today, now we're through four full rounds of engagements. We have over 600 projects in-flight right now.”

On fixing the friction-filled student journey with AI: “Focus in on transforming the ways in which students have to navigate an incredibly difficult, friction-filled journey. Applying for, getting into, having to do financial aid, landing in a dorm room, [solving] some of the back office through, again, tools that they're using in their consumer life.”

Recent Book Recommendation: Capitalism: A Global History by Sven Beckert

Episode Transcript

Evan Reiser: Hi there and welcome to Enterprise AI Innovators, a show where top technology executives share how AI is transforming the enterprise. In each episode, guests uncover the real-world applications of AI for improving products and optimizing operations to redefine the customer experience. I’m Evan Reiser, the founder and CEO of Abnormal AI.

Saam Motamedi: And I’m Saam Motamedi, a general partner at Greylock Partners.

Evan: Today on the show, we have Lev Gonick, Chief Information Officer at Arizona State University. ASU operates at scale with 200,000 students and campuses in Arizona, Los Angeles, and Washington, D.C., with 35 locations worldwide. Lev ties the AI push to student success rather than selectivity. 

Here are three things that stuck with me.

First, Lev did not wait for the market to settle. He built an acceleration team inside enterprise IT. Now 40 people are focused on platform security, compliance, and tooling alongside their 2024 OpenAI partnership. They built Create AI, a low-code/no-code builder to scale delivery. He’s blunt: don’t wait or you’ll end up doing vendor management. 

Second, treat adoption like a pipeline. Grant rounds that expected dozens of ideas turned into over 600 projects. Adoption spread from faculty to staff and students, including persona-based healthcare efforts with avatars and multimodal tools. They publish their best hits so others can copy what works.

And finally, Lev calls this an antifragile moment. Many schools fixate on stopping students from using these tools, but ASU leans in, rethinks assessment, and focuses on student success.

Well, Lev, thank you so much for joining the show today. Do you mind just starting off by sharing a little bit about your career and maybe your current role at ASU?

Lev Gonick: First of all, Evan, great to be with you. And Saam, excited that there’s actually a conversation to be had—to be talking about real work in AI at an enterprise having impact. It’s a refreshing opportunity. So thanks for having me.

ASU is a large university, but unlike any that you or your viewers and listeners have probably ever experienced before. It’s a university at significant scale—almost 200,000 students. We are operating not only here in Tempe, Arizona, but we have campuses in LA. We have campuses in Washington, D.C. We support 35 campuses around the world. We’re also powering education operations for the U.S. Air Force and the Army. We’re engaged in countless activities to support other universities and colleges in the use of our technology.

And all of that’s made possible because of our tech stack and the way that we’ve actually built up ASU itself in terms of its technology. And all of that is guided by our ASU Charter, which is our kind of constitutional document, which really focuses in on our students and their success.

Our orientation is an inclusive orientation, not a sort of exclusive one, where we measure ourselves actually by our student success rather than by how few people we actually let into the university. And that is really focusing in on trying to prepare the workforce for the 21st century for the United States, and for those who want to contribute to their communities as well.

So it’s a great place to be at. It’s an opportunity for technology—especially for a CIO—to be able to really contribute the full force and the full portfolio of everything we do, we can do, to advance this really important charter and have an impact not only on ASU but the communities around us.

And obviously, we serve at a global scale.

Evan: When we think about AI, how does that fit into the technology philosophy? What role do you think it’s going to play in helping you advance that mission?

Lev: AI at ASU from its inception has really focused in on the ways in which we can mobilize the knowledge core of this huge research university with almost 6,000 faculty who are solving all kinds of important issues—whether it’s cancer research discovery, or archaeological work, or working on the large language models.

From the outset, while the world woke up three years ago to understand there was something called generative AI, as you both know well, this has been going on for 70 years if you want to call the origins, but certainly for the better part of ten years of really important work that’s been going on.

To answer your question: first and foremost, we are focused in on impact. That’s not just for the student experience, which I’ve already shared with you. It’s also to support the engine of this research enterprise that we have here—about $350 million of sponsored research in AI-related activities. And again, that’s everything from energy to water to core machine learning technologies, visualization technologies, and the like.

So we support, along with our colleagues in our research operations, AI for the core research about AI, and then, of course, the application of AI for the research enterprise here.

The teaching and learning part of it is a huge bet that we’re putting down—that we’re going to continue to be able to scale here at ASU and have social impact by the ways in which we apply AI through principled innovation.

And our thesis here is not just that it’s important to realize some of the promise of the efficiencies that everyone at the enterprise level is focusing in on—and I can share some of that with you—but our overall goal is a thesis that says: if we take the ASU Charter seriously, we want our students to succeed. That’s the measure.

So what do we think success looks like in the AI era? Whether you’re in tech, or in some other company, or whether you’re in service for your community or a public sector job—what does it actually mean to be successful?

And our objective right now is to make sure that our students understand that they have agency—the autonomy to actually participate themselves in the ways in which AI gets utilized in their school and the ways that they think about the big challenges that they want to have an impact on.

Evan: A lot of people talk about first principles thinking. You can really see it even in your philosophy. You’re working backwards from clear outcomes: how do we make people successful?

And I think one thing that’s really refreshing to hear, Lev, is it’s very hard in older organizations, with a lot of historic culture, to adapt and change so quickly.

Lev: Hopefully we’ll get a chance to chat a little bit about some of the enterprise tools that we’ve built internally that support tens of thousands of users and experiences that aren’t reliant on the old.

And it’s still hard to get stuff out of the central IT organization because the demand far exceeds capacity to deliver. But as you unlock this value, this is a really exciting way for us to support the institution, its mission, but also give students this sort of leg up in a very competitive world.

Evan: That I really love. And to any students who are listening: there aren’t a lot of very experienced engineers actually using applied AI. There’s never been more competitive advantage as a new student. You actually might be ahead of people who have been in software engineering for ten years.

And I also think some of the skills you’re teaching your students—it’s not about the technology. Like you said: first principles thinking, high agency, leadership. Those are things that are so vital as someone that hires hundreds of software engineers.

Saam: One of the reasons I’ve been excited to have you on is that I think ASU was the first university to partner with OpenAI. I was doing some research ahead of the show, and I think that was in early 2024, so like three years ago, before most of your peers even really understood what AI was. And so you’re a bit ahead of the curve on this, and there are lots of different ways it’s manifested.

And you’ve talked about some of them already. I’m curious a little bit about under the hood—if we open up the hood—what are some of the enterprise AI tools that are powering the work actually happening at ASU? And are there a couple of interesting examples you want to talk about for our audience?

Lev: I’ll start with the first principle: it is still really early days. For the enterprise, the first thing that a lot of my business stakeholders came to me with is: who are we putting a bet on?

Even though we began in early ’24 working with OpenAI, I was on their doorstep literally nine months earlier with the idea that we could help shape what could happen in higher education by partnering together—their lab and ASU.

But at the same moment, we absolutely took it at first principle, and we started developing our own tools. We not only started developing the tools, but we also committed to storytelling along that journey.

So I took the initiative, along with my colleagues here, to build an AI acceleration team inside enterprise IT here at ASU. We took initially 20—and now it’s 40—people dedicated from morning till night working on platform technology, security, compliance, tooling—building tools to support what we knew needed to grow up to be a low-code/no-code kind of environment.

We started both with our partnership with OpenAI and with our own internal work, which has now been branded as Create AI—the Create AI platform and the Create AI builder.

These were being developed in parallel, along with tons of experimentation from the Anthropics and the Llamas of the world, and everybody else finding a place to play here at ASU as a petri dish early on.

But the formal programs we started were internal grant programs where, early on, we were providing licensing and engineering and developer support for internal competitions on who had big ideas that could really have impact on the core challenges—whether they were in the research lab or in the teaching and learning environment.

And we thought there’d be 40 or 50 great ideas. As we speak today, now we’re through four full rounds of engagements. We have over 600 projects in-flight right now. And these are the responses from the campus community.

It started off very faculty-centered. It’s staff-centered now. It’s student-centered activities. We’re doing persona-based engagement with our healthcare professionals. Now we’ve got avatars that are fully interactive with both generative and multimodal tools.

And all of those stories are captured on an ASU website, and we’ve got three compilations already. So we’ve got our best hits. We’ll have another one out in a couple of months.

That’s part of why we think it’s important to lead internally, but to share. Because I agree with Evan: this is the moment. This is an opportunity for education to serve as the launchpad for the art of the possible by giving all of these tools and opportunities to our students.

And to be clear, there are all kinds of important guardrails. Principled innovation for us is not a throwaway line. It’s something we take very seriously.

Evan: If you were to design a new university today with AI at its foundation—an AI-native university—not just to run AI-native, but also to help students become AI natives—what would that look like different today?

Lev: There are all kinds of interesting experiments underway here at ASU to build out an AI-native digital experience end-to-end without the legacy. You’re exactly right to put your finger on it.

We’ve done that at ASU in the past. And I think what we do is we borrow from our heritage. It’s relatively short compared to most of the great universities in this country, but as the New American University, which is our brand, we understand the need to borrow from what I think the best companies realize: when you need to pivot, you need to plant a seedling outside of the mothership—to mix metaphors—to begin to cultivate.

Back in 2008 and 2009, we had the real estate implosion and the banking implosion in the United States. Here in Arizona, it was an existential moment when it came to public education, among many other things.

ASU at that very moment decided to stand up a unit that started being called ASU Online, way before anybody else was putting all their marbles in on online. And thanks to great leadership by President Crow and by Phil Regier, that unit is now called EdPlus. It serves over 100,000 students with over 350 degrees, all designed, taught, and evaluated by our faculty here at ASU.

So it’s not a diploma mill, and it’s definitely not a private entity grafted onto a public university.

So what do we learn from that—apropos your question? There are similar kinds of activities underway. I’ll keep most of them to myself because it is a bit of a market advantage, because we are already leaning into those opportunities. Some of them are here in the U.S. Some of them are global—that is, ASU engaged in global conversations on an AI-native, AI-digital learning experience.

Evan: I’d be remiss not to underscore a point you made. There’s never been a time where investing in—or developing—your own agency is going to pay off more. So many people are capable of doing so many things.

And I’m sure if you told someone 30 years ago what you guys have been doing, they’d be like, “Sounds like a pipe dream, Lev.” But we’re here. Same thing for entrepreneurs all the time.

Lev: One of the insights we have, which comes from our experience working with folks in the private sector, is that it’s at the moment of these transitions and disruptions and ruptures that if you can pivot with a growth mindset, you’re not only going to be able to survive—which is what most people in higher education think is the goal—but you’re actually going to thrive through what in our literature is called antifragile.

We’re built to be antifragile at ASU. We hear all kinds of clouds darkening with AI disrupting everything, and faculty are consumed with how to stop students from—these are not unimportant issues. However, that’s not the growth mindset.

The growth mindset is to lean into them and figure out: what does it mean to assess and help students succeed in the AI era? Because there is no putting this puppy back anywhere but out for a run. Let’s give it a run.

Evan: We’re going to segue into what we call Lightning Round. It’s a really unfair segment, but we’re going to ask you really hard questions and try to get you to give us the one-tweet version of the answer. So I’m going to kick it off.

Saam: Lev, you’ve talked about over 600 different ideas or projects. If there were only three AI transformation projects you could recommend to a new university CIO who wants to knock it out of the park in their first year, what are the three?

Lev: Focus on a transformational teaching and learning experience by giving tools to your students and your faculty.

Focus on transforming the ways in which students have to navigate an incredibly difficult, friction-filled journey: applying, getting in, having to do financial aid, landing in a dorm room. Solve some of the back office through tools they’re using in their consumer life. Make those available along the way.

And then I think most importantly: be sure that you’re investing in your own IT capacity. Don’t wait—especially if you’re talking to CIOs in higher education. Don’t wait for the market. That train will have left the station, and you will be left doing nothing but vendor management.

Evan: Yeah, that’s wise advice. To do all this, you’ve got to be up to date on some of the new technology trends. You have to be open-minded about what’s possible. You’re very good at this, and I think the organization overall has been ahead.

What’s your strategy there? What would be your advice to CIOs about how to stay up to date on what’s going on with AI? I don’t mean the best way to train a neural network, but how AI can be applied inside the environment.

Lev: For me, I’m lucky. I get to ask students who are fearless and apparently have embedded extra hours in their day. We hire hundreds of students, and I love—this evening, we’re going to be going out with some of them. For me, my best kind of research is really asking students what they’re working on.

Along the way, I read and listen and watch way too much and share too much of that with colleagues. I think they’re tired of me sharing insights. But it’s critically important not only to do the sharing, but also to see what kind of insights—including pushback—you can get from your colleagues as to what’s real, what’s hype, what’s going to work its way through the hype cycle.

Another piece: we have very active Slack channels here where we are sharing all the time and voting up or down these concepts, which gives us a chance to go deeper—either in research or working with our favorite lab.

Saam: Maybe switch to the more personal side. Is there a recent book you’ve read that you really liked—had a big impact on you? If so, what’s the book?

Lev: I listen to a lot of books. That’s my mode of reading these days. I’m listening right now to a grand, sweeping history of capitalism by a great Scandinavian. I think Sven is probably a Nordic economist, historian.

I love the idea of understanding where we fit in the sweep of human history. Part of that story of capitalism is the ways in which it has evolved—adapted—never standing still, from the mercantile era through industrial capitalism to technology-driven capitalism as we have it.

I think the book has a very unique title. I think it’s called Capitalism.

Evan: This one doesn’t have to be AI, either. What’s an upcoming technology that you’re personally most excited about?

Lev: Right now, I’ve got a bit of an obsession. It’s a rabbit hole I found myself in. I’m interested in different ways of building out networks—beyond optical networking—and trying to understand that in order to support AI.

I’m looking at quantum networking of the future—driving the research agenda, driving the AI experience of the future. We’ve probably got to find a way of reducing latency even further than we currently can. And it’s not just about upping the speeds; it’s about reducing the latencies along the way.

So right now, it’s the technical prerequisites for building, at an enterprise scale, next-generation networks to support quantum and AI. I’m deep into it right now. And I think my network engineers have said, “Just stop already. It’s not going to happen for the next three years.” And I said, “Three years? Not a chance. We’re going to start right now.”

Evan: When I was in school, I studied computer networking—kind of my specialty. And then I never used many of the skills, but some good photonic—

Lev: Photonic networks is the future. Evan, come back and we’ll do that together.

Evan: Okay. Okay. If I get fired from my job, I’m probably calling you up and asking for a role on your team because it sounds like there’s some cool stuff over there. Okay—Saam, why don’t you do the second-to-last question.

Saam: What do you believe about AI’s future impact on the world that most people would consider science fiction today?

Lev: There’s not much science fiction that at least some people don’t think is going to be real. The great news is that science fiction really is guiding the way we’re thinking about the possible.

As a kid growing up, I was reading Isaac Asimov, and people said, “Well, we still don’t have flying cars.” But the truth is we’re going to have data centers in orbit any time now. I was just talking with someone at lunch about how real we think that one is. I wouldn’t bet against it.

That one might sound like science fiction, but it’s one where we see the physics getting worked out and the engineering beginning to get worked out.

And I think, in general, the great opportunities here are probably in the oceans and in the skies. Those will be some of the really interesting places to be seeing where technology comes from.

I think a million low-orbiting satellites is just the beginning. I think big data centers in the oceans are definitely part of our future as well.

Not that all of this is good and easy. We should be debating it all along the way, but those are some of the frontiers.

Evan: Okay, last question. I know we’re out of time. I’m sure you get asked a lot: what’s your advice you would give to students trying to get into the technology field? What would be your advice for up-and-coming technology leaders?

Lev: Much earlier on in my career, I met Alan Kay, the great American computer scientist out of Utah, who gave us—at his Apple lab—some of the original vision of mobile computing. Alan’s most famous line, which we all use, is: “If you want to predict the future, invent it.”

And my own view to folks that I talk to is: if you want to invent the future, start. The conversation we’ve had today around AI is a really good example—you could keep talking about this forever. My advice is: just start, and be ready to pivot.

Evan: I have like 400 more questions for you. I’m really energized, but unfortunately we’re out of time. Thank you so much for joining us today, and let’s chat again soon.

Lev: All right. Thanks, Evan. Thanks.

Evan: That was Lev Gonick, Chief Information Officer at Arizona State University.

Saam: Thanks for listening to Enterprise AI Innovators. I’m Saam Motamedi, a general partner at Greylock Partners.

Evan: And I’m Evan Reiser, the founder and CEO of Abnormal AI. Please be sure to subscribe so you never miss an episode. Learn more about enterprise AI transformation on the Enterprise Software Blog. 

The show is produced by Abnormal Studios. See you next time!