
Chatbots and AI in Higher Ed

Engaging Students and Strengthening Institutions

The University Innovation Alliance (UIA) was honored to partner with The Chronicle of Higher Education for its January 31, 2023 discussion on engaging students through chatbots and artificial intelligence (AI). The panel explored what's possible with the technology and shared success stories. Panelists included Dr. Berman, Mr. Magliozzi, Dr. Renick (Georgia State University), Dr. Medley (Stony Brook University), and Dr. Nasim (Virginia Commonwealth University).

Myths of AI in Higher Ed
Artificial intelligence is a rapidly emerging category of tools, including chatbots for answering student questions and ChatGPT for generating language that mimics expository writing. Decades of utopian and dystopian science fiction predicted smart technology, and Dr. Berman believes that we can appreciate AI's potential by dispelling the myths about it:

"AI is actually an umbrella of technologies. The current generation isn't creating new content on its own. It's learning because human beings are giving it more information and context to do a better job of responding. So the image of an out-of-control AI that's going to start talking to the students in a way you don't understand, that's not really the present of this type of tool. Another myth is that a chatbot is going to automatically save you time and money. It may not. There's a lot of human intervention. And one of the biggest mistakes that campuses make is buying or implementing something and treating it as something that's going to run itself. That's just not the reality."

Mr. Magliozzi addressed the fears around the sudden emergence of ChatGPT:

"It's a large language model. It's trained on vast amounts of data from the Internet, using a ton of computational resources. It's remarkable at producing human-like writing, but it occasionally produces false, misleading, or biased content. I've heard it compared to a calculator, but for writing. ChatGPT is really just a tool, and any tool can be used for good or to cause harm. So it's imperative to explore how AI can accelerate teaching and learning."

A Dramatic AI Case Study
Exploring AI's potential begins with studying its uses in higher ed settings. Dr. Renick described how AI-enhanced texting helped at-risk students begin their freshman year at Georgia State:

"We were losing almost 20% of our confirmed first-year class to so-called summer melt. Students would come to orientation, sign up for classes, put down housing deposits, but the term would begin and they were nowhere to be found. So we looked at the data and found that the majority were getting tripped up by the bureaucracy: FAFSA, transcripts, immunization, deposits, registration, and so forth. Disproportionately, these were students from first-generation and low-income backgrounds. What if we built an SMS-based tool that would answer key questions and help students navigate obstacles to enrollment? So we worked with Mainstay and built a knowledge base of about 2,000 answers to commonly asked questions. These responses were put on an AI-enhanced platform that could pick the right response out of the knowledge base.

"We launched this tool in May 2016 for our incoming students, thinking maybe we'd have 6,000 to 8,000 exchanges with the students before the beginning of fall classes. In four months, we had over 180,000 exchanges. The students knew it was a bot. They liked that it was non-judgmental. With student access to easy answers, we reduced summer melt by more than 30%. It was a simple way of scaling personalized attention. Now every Georgia State student across all our campuses has the chatbot from the time they are first confirmed to enroll to the time they graduate."
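Conceptually, the approach Dr. Renick describes matches each incoming student question against a knowledge base of prepared answers. As a purely illustrative sketch (not Georgia State's or Mainstay's actual system, which uses far more sophisticated language understanding), a minimal keyword-overlap matcher might look like this:

```python
# Illustrative sketch only: match a student's question to the best
# prepared answer in a small FAQ knowledge base by keyword overlap.

def tokenize(text):
    """Lowercase a question and split it into a set of word tokens."""
    return set(text.lower().replace("?", "").split())

def best_answer(question, knowledge_base):
    """Return the answer whose stored question shares the most words
    with the incoming question; fall back to a human otherwise."""
    q_tokens = tokenize(question)
    scored = [
        (len(q_tokens & tokenize(kb_question)), answer)
        for kb_question, answer in knowledge_base.items()
    ]
    score, answer = max(scored)
    if score == 0:
        # No overlap at all: escalate rather than guess.
        return "Let me connect you with a staff member."
    return answer

# Hypothetical knowledge-base entries, modeled on the enrollment
# obstacles named in the quote (FAFSA, transcripts, deposits).
kb = {
    "How do I submit my FAFSA?": "File at studentaid.gov and add our school code.",
    "Where do I send my transcript?": "Mail or upload it to the Admissions Office.",
    "When is my housing deposit due?": "Housing deposits are due May 1.",
}

print(best_answer("What's the deadline for the housing deposit?", kb))
```

A real deployment replaces the keyword overlap with trained language models and, as the panelists stress later, keeps humans in the loop for low-confidence or sensitive messages.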


How AI Informs Leadership Decisions
Dr. Medley shared how implementing chatbots is changing Stony Brook University's relationship with its student population:

"We have a lot of researchers who are doing work on AI technology and how it can enhance not only the student experience, but how we're teaching, researching, and making decisions. We're a residential campus where we're on duty from 9:00 to 5:00, and we support students really well during that timeframe. But what happens at eight o'clock at night? We have been able to utilize AI to support students with just-in-time information, when they ask questions that they don't want somebody else to know they're asking. Where can they get a pregnancy test? Where can they find out about STD testing? What if they failed the test? Where can they get help? AI is simply a tool that helps us make sure we're delivering real-time support. That became even more evident during the pandemic, when we wanted to make sure that we could provide services to students no matter where they were, and no matter where our staff were. The potential of this is allowing us to have those deep-dive, beautiful, supportive, higher ed social work conversations with students, eliminating some of the simpler questions and helping them be more successful in all areas."

Virginia Commonwealth University, according to Dr. Nasim, built a custom SMS messaging tool to meet specific institutional needs:

"We developed a technology called Climatext, which allowed us to prompt our students with a particular question, and we would evaluate that data. We got a 50% response rate in just a couple of hours. We were able to evaluate and consolidate this information. And within 48 hours, we developed a document called a climate advisory. We send it to about 250 senior leaders on campus, informing them about students' feelings and thoughts about particular issues. Last year, we wanted to know our students' perceptions about online and remote learning at VCU. We found that our first-year, first-generation, and Pell-eligible students had less favorable experiences with online learning. We were able to make decisions about having the desired percentage of classes in person on campus, particularly the intro and gen ed courses that a lot of our first-year students took, because they really thrived in that face-to-face interaction within the classroom. So what we're able to do through this responsive feedback loop is collect information from our students, listening at scale, meeting them where they are, but able to close that loop by responding, 'This is how we're making the student experience better.'"

How AI Engages Students
It's equally important to consider how students respond to their institutions' chatbot initiatives. Dr. Renick believes that text messages are a comfortable option:

"Some students want to meet face-to-face. Other students are never going to knock on a door. So we email or call them, or we have video chat. And we found that texting is not just another tool, but given that our current students grew up on these devices, a particularly effective one. So don't think of this as replacing personal connections. Think of this as another way you can build on those connections."

Dr. Medley emphasized how non-traditional students appreciate the extra support:

"Non-traditional students have an incredible variety of things in their way: childcare issues, accessing employer reimbursement programs, veterans' benefits. And by using the AI chatbot, we were able to engage with students. This is a tool that we can use to strengthen our individual connections in real time when students have real need. And I think there is no population probably more attuned to that than the non-traditional population."

Dr. Nasim observed that students can expect a better college experience when they know they've been heard:

"Creating a thriving community on your campus supports student success. And listening and being able to meet students where they are is part of that process. Let students know that their voice matters, that they belong, that they have agency, and that the university actually responds. And the extent to which we can close this responsive feedback loop reinforces their use of the chatbot."

AI and Student Safety
With student safety a high priority for campus leaders, Dr. Berman noted that safety includes data security:

"On the issue of privacy and sensitive information, campuses have to develop a capability to analyze that themselves. And if you don't have people with that expertise, advocate for getting some. One consideration is how many chatbots you're going to have. If you have a lot of different approaches, it's very difficult to maintain an understanding of the privacy issues. There's a good argument for having relatively few channels for communicating, having relatively few vendors, and holding them accountable for protecting your data. And make sure you have expertise on your campus to look into what they're doing, and vet that."

Mr. Magliozzi spoke to concerns about students' physical safety:

"We're a vendor and technology company, but we consider ourselves fiduciaries of your student body. There's a multitude of ways students have expressed struggles, including potential self-harm. And these things will typically occur during midterms and finals, or during the start of a semester, when transitions are happening. We escalate this to advisors. You're approaching it from other areas, from, 'Hey, we're here for academic support, but we're talking to you in high-stress moments.' So prepare for the worst, but recognize that it's likely to be an extremely small number. We've done hundreds of millions of messages, and it's only been about 100 total in the life of the company. But each one of those is of massive importance."

Dr. Renick added:

"When you're listening to students, you have to be responsible in responding to what you hear. And we have live staff monitoring the chatbot, ready to respond in these kinds of circumstances. When students have mentioned mental crises that they haven't talked about with a counselor, or family members, or a roommate, we've been able to get them help."

Benefits and Promises of AI in Higher Ed
Our panel guests agreed that, while AI is a useful tool for communicating with a student population, it's just one piece of a larger strategy. Dr. Berman said:

"A chatbot is a decision about how you want to serve students. AI is a much more sophisticated way to communicate, but you have to start by asking: What is it the student needs? What are the goals of your institution? What are the problems that you're trying to solve?"

Mr. Magliozzi suggested:

"To get the full benefit of AI, weave it into the process. Assign your best people to train it. And when mistakes happen, use them as teaching moments. Texting is the most democratizing form of communication. You have to be incredibly concise, but from the limitation comes a huge opportunity to get signal right through the noise in both directions."

Dr. Medley observed:

"I see AI helping us lead students through those task-oriented, 'do this, do this' interactions. And I see our staff continuing to grow those relationships, being supportive of students and giving advice. That's what we want them to have the time to do. The chatbot takes the task, our staff builds the relationship."

Speaking as VCU's chief diversity officer, Dr. Nasim told us:

"We use the chatbot to understand organizational culture and climate. We're able to get the impressions, perceptions, and thoughts of all our students, and then make decisions within a very short amount of time that involve the general population of students. I think there's an opportunity to refine this in such a way that we can tailor our messages to populations that really need to hear them."


Note: This blog was adapted from a panel discussion hosted by The Chronicle of Higher Education on 1/31/23. Co-hosts were the Chronicle's Assistant Managing Editor Ian Wilhelm and the UIA's CEO Bridget Burns. If you would like to learn more, we provide a video of the event and a transcript of the full conversation.  
