San Diego Area Colleges Band Together on AI

The San Diego Community College District, San Diego State University and the University of California, San Diego are combining the resources of large institutions with the diverse insights of smaller ones in the Equitable AI Alliance.

In San Diego, students at the university level are talking about artificial intelligence.

The University of California, San Diego (UCSD) used its supercomputing center and Meta's large language model, Llama, to create a secure campuswide chatbot for students called TritonGPT. At San Diego State University (SDSU), a robust survey of students on their use and opinions of artificial intelligence gathered more than 7,000 responses in 2023 and more than 10,000 in 2024. Those response rates, 21 percent and 26 percent, respectively, blew typical student response rates of 2 or 3 percent out of the water.

At the San Diego Community College District (SDCCD), where such costly, sweeping efforts are not possible, chief information officers and other administrators are working to bring AI into conversations throughout the campus community in other ways. Leading a session at ASU+GSV's AI Show last weekend, a panel of experts said collaboration has always been a key component of progress in higher education, but it will become even more important as AI evolves quickly and increasingly impacts students' lives.

“It’s clear that there are haves and have-nots in terms of the higher education landscape, and I think it's really important for the people who have the resources to share those resources with the community, writ large,” SDSU CIO James Frazee said.

UNDERSTANDING CAMPUS NEEDS


The three institutions — UCSD, SDSU and SDCCD — came together in 2024 to form the Equitable AI Alliance, in which they work together to negotiate with AI vendors, bring accessible tools to their campuses and foster collaboration.

According to Frazee, understanding the needs of each campus community was an important starting point. In 2023, SDSU welcomed its first two faculty fellows in AI, and the campuswide AI survey was a result of their work.

In another session at the AI Show, focused specifically on SDSU's efforts to bring AI to its campus, faculty members discussed key findings from the survey, which are published in a publicly available dashboard.

From 2023 to 2024, student use of ChatGPT rose from 45 percent to 82 percent, and the proportion of students using AI in their studies rose from 38 percent to 48 percent, the survey found. It also found that more than half of students in 2023 said their curriculum lacked adequate exposure to AI, a number that dropped in 2024.

Elisa Sobo, an AI faculty fellow in the anthropology department at SDSU, emphasized that the survey also includes open-ended questions to put the data into context. For example, the survey found that female students were more likely than male students to say that AI is too complicated for them to grasp.

“You could jump to the conclusion that that means we need to do more outreach to the females because they don't get it,” she said. “It could also be that the male students have been brought up to be more boastful or to be more overconfident about their abilities, so it could be that those are the people who need more help.”

Other institutions, including SDCCD, have used the SDSU survey, housed on Qualtrics, to collect data from their own communities, and the California State University system plans to use the framework to conduct a systemwide survey in the future.

PUTTING DATA INTO ACTION


Institutions have used data from these surveys to make changes and meet the needs of students and faculty.

For example, SDSU students expressed confusion about their instructors' expectations for when AI use is appropriate. In response, SDSU now uses a tiered system to indicate what kind of AI use is permitted on assignments: Level one means disallowed, or no AI use at all; level two is restricted; level three requires tracking and documenting AI use; and level four is unrestricted.
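For readers who think in code, the four-level scale maps naturally onto a small lookup structure that a syllabus generator or learning-management tool could use. The sketch below, in Python, is purely illustrative: the tier definitions come from SDSU's system as described above, while the class, dictionary and function names are hypothetical.

```python
# Minimal sketch of SDSU's four-tier AI-use scale as a data structure.
# The tiers reflect the article's description; all identifiers here
# (AIUseTier, GUIDANCE, guidance_for) are hypothetical, for illustration only.
from enum import IntEnum


class AIUseTier(IntEnum):
    DISALLOWED = 1    # no AI use at all
    RESTRICTED = 2    # limited AI use, as defined by the instructor
    DOCUMENTED = 3    # AI use allowed, but must be tracked and documented
    UNRESTRICTED = 4  # AI use freely permitted


GUIDANCE = {
    AIUseTier.DISALLOWED: "No AI use is permitted on this assignment.",
    AIUseTier.RESTRICTED: "AI use is restricted; see the assignment instructions.",
    AIUseTier.DOCUMENTED: "AI use is allowed if you track and document it.",
    AIUseTier.UNRESTRICTED: "AI use is unrestricted for this assignment.",
}


def guidance_for(tier: AIUseTier) -> str:
    """Return the student-facing statement for a given tier."""
    return GUIDANCE[tier]


if __name__ == "__main__":
    # Example: an assignment marked level three.
    print(guidance_for(AIUseTier.DOCUMENTED))
```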

“The important part of this is communicating not just what or how, but also why,” Sobo said. “Why is it appropriate, or why is it not appropriate?”

The survey also showed that students and faculty want comprehensive training, so SDSU created a micro-credentialing program. The program is free to engage with, though the badges for completing the courses cost $25 and are tailored either to students or to faculty and staff.

Access to these credential programs has been helpful for SDCCD.

“James was talking about this credential that they had, and we were looking at creating a credential as well, starting with faculty. I said, ‘Is this something we can have access to?’” said Michelle Fischthal, vice chancellor of institutional innovation and effectiveness at SDCCD. “Because I recognized we didn’t have the resources to move as quickly as we needed to.”

Once students and faculty know how to use AI, panelists said, partnerships can help ensure campuses have access to tools with proper safety protections. The California State University system worked with tech companies to provide ChatGPT Edu to its campuses, for example.

“This is a profound change that's going to require a complete reimagining of the curriculum and a retooling of our pedagogy, which is a lot of work,” Frazee said. “So we really need to band together.”

This article first appeared in Government Technology, sister publication to Industry Insider — California.
Abby Sourwine is a staff writer for the Center for Digital Education. She has a bachelor's degree in journalism from the University of Oregon and worked in local news before joining the e.Republic team. She is currently located in San Diego, California.