Episode Transcript
[00:00:00] Guest: Every single time it generates a new answer because content may change, which means that it can't lock down that response after the first response and say this is the correct response. And so because it generates a new answer every time, you don't always know what you're going to get. And this is where content and really understanding what content you're feeding the AI is critical to ensure that the answers being delivered are accurate.
[00:00:33] Host: Welcome to the EdTech Connect podcast, your source for exploring the cutting edge world of educational technology. I'm your host, Jeff Dillon, and I'm excited to bring you insights and inspiration from the brightest minds and innovators shaping the future of education. We'll dive into conversations with leading experts, educators and solution providers who are transforming the learning landscape. Be sure to subscribe and leave a review on your favorite podcast platform so you don't miss an episode. Sit back, relax, and let's dive in.
Welcome to the show everybody. Today we have Nick Burrell. Nick brings a wealth of expertise in educational technology and artificial intelligence, making him a standout leader in the field. As the VP of Strategic Partnerships at ZogoTech, a company specializing in data analytics solutions designed to empower higher education, he works closely with colleges to unlock the potential of their data for student success and institutional growth.
Prior to ZogoTech, Nick spent nearly a decade at Ocelot, a leading AI powered student engagement platform, where he held various leadership roles, culminating as Senior Vice President of School Partnerships. At Ocelot, he worked hand in hand with educational institutions to implement cutting edge technologies driving meaningful improvements in student engagement and outcomes, and was one of the original creators of their chatbot. His unique blend of expertise spans the strategic implementation of AI tools, deep knowledge of content development, and a nuanced understanding of the challenges and opportunities in education. His insights into the evolving role of AI offer a clear vision for how institutions can adapt to and thrive in the future of learning.
Welcome, Nick.
[00:02:28] Guest: Thanks for having me, Jeff. Excited to be here.
[00:02:31] Host: So you have an interesting background. You spent nearly a decade at Ocelot, where you worked closely with institutions to adopt new technologies. And I'm curious, how did Ocelot's use of AI evolve over that time, and what were some of the key lessons you learned in terms of its practical application in education?
[00:02:51] Guest: Absolutely. You know, one thing to add to that is that before Ocelot, I actually worked in college financial aid offices for over a decade. And so it was a really interesting switch to go from an institution to the vendor side, supporting colleges, knowing that colleges really need a tremendous amount of help with the workload that's on their plates. And at Ocelot, what we saw was a true evolution. The company has been around for over 20 years. When they first started, they were making print publication handbooks for financial aid. Then they pivoted to making videos. And then there came a point, early on in my time at Ocelot, where the CEO at the time, Damon Vangelis, was looking around and saying, the videos are nice to have, but the need-to-have that's coming is finding ways to connect with students 24/7, to create engagement. And the way we think we should be doing that is through AI. And with all the videos we already had to answer student questions, we already had the answers written, in many cases, for a chatbot. And so the natural next step was to create an AI chatbot. What we saw in the evolution from there was adding AI into additional communication channels, whether that be texting or live chat. For example, in a live chat, the AI assistant can serve up suggested answers, based on the student's question, right to the live chat agent, really enabling that agent to manage multiple conversations at once. And if a question comes in an area that isn't in their expertise, they can still provide the student with a response, as opposed to having to bounce them from one office to another. And then there were a couple of really big events that occurred after that. With the start of AI at Ocelot, we started building around 2017.
We launched the initial version of the product in July of 2018. And then, as you know, the pandemic came in 2020, where I think colleges, as they were closed, were really looking for ways to figure out how they were going to connect with students. Some of them didn't have phone systems for months, they weren't getting messages, their email boxes were full, so how are we getting students these answers? And this is when we saw very quick adoption of the solution. And then shortly after that was the release of ChatGPT through OpenAI. And that really revolutionized how AI is being utilized, in terms of its ability to truly have these large language models that are interpreting all of the text that's coming in and providing robust responses from pre-built content on websites. And that's where Ocelot is going now, from my understanding.
[00:05:46] Host: I remember, November 2022 is when ChatGPT was launched by OpenAI, and before that, chatbots were synonymous with AI in higher ed. That's what you would say. Even on our periodic table on EdTech Connect, Chatbot was a category, you know; there wasn't AI. So it's really interesting to see how it's expanded.
What I'm really curious about with AI is the amount of content being generated now and how quickly we can scan all this content and consume it. From your perspective, how does content quality influence these AI tools? Like chatbots.
[00:06:26] Guest: To your point, Jeff, content is king. If the content is incorrect, whatever's being fed to the AI tool is going to be incorrect when delivered back to the respondent. It's not fact-checking, right? And I think that while OpenAI and ChatGPT are unbelievably powerful tools, we still see hallucinations, or incorrect responses, more than 50% of the time. And that is a concern about the information folks are getting, and ensuring that they fact-check what's being given to them, because they can't just assume it's correct. These tools are not deciphering which answer is right. So, for example, if you're on your website and you notice that you have two different answers to the same question, or maybe a better example is three different answers to the same question, 33% of the time it's going to get it right and 67% of the time it's probably going to get it wrong, because it doesn't know the difference. And it doesn't generate an answer once and then say, this is the correct answer, going forward I'm going to use this answer. Every single time it generates a new answer, because content may change, which means that it can't lock down that response after the first response and say this is the correct response. And so because it generates a new answer every time, you don't always know what you're going to get. And this is where content, and really understanding what content you're feeding the AI, is critical to ensure that the answers being delivered are accurate.
[00:08:04] Host: So I want to tell you a story and get your reaction. About five years ago, when I was at Sacramento State, we had the president's office call us and say, hey, we have a problem: the wrong general education requirements are on our website. And it was bad. Somebody that knew the president was looking for major information, you know, and I had to track it down. I was the director of web there, and I realized we had six versions of the GE requirements on our website. It was embarrassing. The thing was, back in the day we had a basic search, I think we had the free Google search, and that's what found it. Now that's going to be found every time; it's so easy for AI to scan everything and find it. We're going to find more of those sorts of problems surfacing, quicker. Have you faced any instances where the quality of content caused AI tools to provide inaccurate or misleading responses?
[00:08:55] Guest: Yeah, that's a really interesting story that you shared. I have a couple of thoughts around that. One of the first things is this: it's very, very easy when you are building out a website to say, I'm going to archive a page, it's no longer useful, I'm going to archive this web page and build a new page. And this is how we start to get more than one answer on the site. Now, sometimes the page isn't archived and it's still live. But with ChatGPT, or with an OpenAI or generative AI type of solution, it doesn't just have to be the most popular page that everyone is aware of; it is searching the URLs that you provide it. And so if you provide a top-level URL that has a hundred archived pages attached to it, it's going to search those pages too. And so a lot of the work that needs to be done around content is a little bit different than it used to be. Archiving a page is no longer a best practice, in my opinion. It's really deleting a page and getting rid of it, so that that content is no longer floating out there somewhere.
And I think that that's a real issue for schools because archiving a page never had that kind of implication or impact before.
So knowing what pages are even attached to your URLs matters; pages that you wouldn't even think somebody could find, generative AI will find. And so I think that's part of the issue. I think it is common to see incorrect responses, and this is where limiting the information that is provided to the generative AI tool you're using is critical. So as opposed to opening it up to the entire World Wide Web, you would want to focus it on just your institutional web pages. Or maybe you would be looking at FAFSA web pages from the Department of Education as well, or something from the National Student Clearinghouse. But you would be really defining your sources to something very constrained, so that it's not looking at extraneous sources. That's one way to help limit incorrect responses. But then I do see even small nuances cause problems. For example, a school had on their website a response that was written in the third person.
And so the bot responded in the third person, and it felt impersonal. I wouldn't even necessarily say the content was wrong so much as the tone wasn't right. And so it's a combination of tone, content, accuracy, all of that being pulled together.
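The source-constraining idea the guest describes can be sketched in a few lines. This is a minimal illustration, not any vendor's implementation; the domain names in the allowlist are hypothetical placeholders for an institution's own sites and trusted federal sources.

```python
from urllib.parse import urlparse

# Hypothetical allowlist: your institution's domains plus trusted
# external sources (e.g. the Department of Education's studentaid.gov).
ALLOWED_DOMAINS = {"www.example.edu", "studentaid.gov"}

def filter_sources(candidate_urls):
    """Keep only URLs whose host is on the allowlist, so a generative
    AI tool never pulls answers from extraneous sources."""
    allowed = []
    for url in candidate_urls:
        host = urlparse(url).netloc.lower()
        if host in ALLOWED_DOMAINS:
            allowed.append(url)
    return allowed

urls = [
    "https://www.example.edu/financial-aid/deadlines",
    "https://random-blog.com/fafsa-tips",  # dropped: not a trusted source
    "https://studentaid.gov/h/apply-for-aid/fafsa",
]
print(filter_sources(urls))
```

In practice this filtering happens inside the AI platform's retrieval step, but the principle is the same: a deliberately constrained set of sources is one of the cheapest guards against incorrect responses.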
[00:11:40] Host: It's really interesting because we're using AI on both sides now. We're using it to find content, but we're also using it to create it. And so it can be such a blessing, but really difficult to manage. So what practices do you think can be applied to ensure AI tools rely on accurate, updated, reliable content?
[00:12:02] Guest: You know, Jeff, this is a place where I think you might be able to give a perspective as well, having managed a website before. But from my experience, a lot of resources are put into the marketing portion of a website, the look and feel, maybe the navigation design, but less time is spent on the content itself. You know, a lot of times when we hear schools tell us, oh, we're going to be doing an overhaul of our website, what they really mean is that they're changing the platform that's hosting the pages and that the pages are going to look different, but the content hasn't changed. And so for a true best practice, there are a couple of different things that I would think about. The first is doing a truly extensive website audit, where you're looking at every single page that exists, archived or not archived, to determine where these data sources may be pulled from.
[00:13:02] Host: What else are you looking for in that audit?
[00:13:04] Guest: I'm looking for a combination of things. I think the first thing is just what pages are out there, and then the next piece is, okay, of these pages, which are old, which have completely outdated information? Are there duplicate pages that are providing the same information, and are those answers different? And then the question is, well, do we consolidate it to one page? Do we really need these two different pages? Sometimes you do need the two different pages because they're owned by different departments on campus, and they each need their own page. But you want to make sure that that information is consistent from page to page. And what I would say is, I would write the answer once and use the exact same text on all of the pages where you're trying to answer that question, so that you create consistency. The other thing is, then you have a heat map of what content lives where, so that when an update has to be made in the future, you know you're updating the following four pages. It's not just about identifying what's there now; it's also about identifying what needs to be maintained in the future, to ensure that things aren't slipping through the cracks.
Ocelot actually has a tool where, if you plug in a top-level URL, it will show you all of the different URLs associated with it. So if you're an Ocelot client, or a client of another AI provider that allows you to plug in a URL and see all of the pages it will search, this is one great way to do a historical audit of all of the pages that are out there. It will show you every single page it's going to pull an answer from, and then you have the ability to turn off those individual pages. So that, I think, is one way to manage what pages are being pulled in and to do a full audit of the site.
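The two audit steps described here, enumerating every page reachable from a top-level URL and flagging pages that carry the same answer, can be sketched as follows. To keep the example self-contained, the site is a toy in-memory dict; a real audit would fetch live pages (e.g. with a crawler library) and likely use fuzzy rather than exact text matching.

```python
import hashlib
from collections import defaultdict

# Toy stand-in for a real site: URL -> (outgoing links, page text).
# A real audit would fetch each page over HTTP instead.
SITE = {
    "/": (["/aid", "/aid-archive"], "Financial Aid Office home"),
    "/aid": ([], "Priority FAFSA deadline: March 2"),
    "/aid-archive": ([], "Priority FAFSA deadline: March 2"),  # duplicate answer
}

def crawl(start):
    """Breadth-first walk from a top-level URL, listing every reachable
    page -- including archived ones students would never navigate to."""
    seen, queue = set(), [start]
    while queue:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        links, _text = SITE[url]
        queue.extend(links)
    return sorted(seen)

def duplicate_content(urls):
    """Group pages whose text is identical, surfacing places where the
    same answer lives on more than one page."""
    by_hash = defaultdict(list)
    for url in urls:
        _links, text = SITE[url]
        by_hash[hashlib.sha256(text.encode()).hexdigest()].append(url)
    return [pages for pages in by_hash.values() if len(pages) > 1]

pages = crawl("/")
print(pages)                     # every reachable page, archives included
print(duplicate_content(pages))  # groups of pages sharing identical text
```

The duplicate groups double as the "heat map" the guest mentions: they tell you exactly which set of pages must be updated together the next time an answer changes.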
[00:14:56] Host: I think that makes a lot of sense, and it's easier said than done. For smaller schools it's going to be easier; they're more centralized. One of the projects I went through was a full burn-and-build website redo, and when we launched the new site, we said, hey, no one can create a new page without IRT stamping the page out and giving it to you. You can't just create a new page on your own. We thought that was going to wreak havoc; it seemed to be okay. So controlling that matters, because we have hundreds of people creating pages on these sites. Create once, link everywhere is a great thing to strive for. But in reality it comes back to this thing we're all talking about, which is digital governance. What does that mean? Most schools are behind, or they feel like they're behind, in their digital governance. It means curating: who can create content? Does the content follow all the rules, the school rules, the federal laws, all these processes? There are multiple CMSs; which CMS are we using? All these things are really hard to manage when you have dozens to hundreds of people managing content. So one thing was, we do an audit for stale content. That means, you know, if a page hasn't been touched for a year, we're not going to delete it, because there's a lot of information that just has to be up there; an approver just has to say, yes, this is still good, every year. Sacramento State now has this process to do that. So it's a challenge. But the more decentralized we are, the more we have to rely on that digital governance. And you probably know I spent the last few years of my career in content discovery, right? Search. I'm really into search.
But even if you have a great search tool on campus that you can control, where you can say, okay, we need to deprioritize the archive folders, right, like the example you're talking about, AI is finding everything anyway, going around your search. So it all keeps coming back to the quality of your content, and almost every school has to keep up with that. So you're right, it's a tough challenge.
[00:16:53] Guest: Well, and I think, too, to kind of piggyback off that: when Ocelot started building the bot, or rather, not to single them out, pre-OpenAI, when these tools were being made, AI training was very manual, and we saw many different approaches. One of those approaches was to create a knowledge base, right, where you curated responses within the AI platform and the AI platform would use those answers. And that is an excellent tool, and it is one way to control what is being delivered to your students.
With that being said, a limitation is that if a student asks a question outside the realm of what the bot has been trained on, it's not going to provide an answer. And this is where OpenAI really provides that extra boost, because for questions that are outside a pre-built knowledge base, it can generate answers on the fly, so that students are always getting an answer to their question and they're never getting an "I don't know." The issue now is that the website is becoming your knowledge base, as opposed to having that predefined knowledge base that you're only pulling answers from. And so I think it's really a shift in thinking: instead of maintaining an individual knowledge base, which you still may want to do, your website is your true knowledge base, and it needs to be given the same TLC you would give a knowledge base you were using to feed AI answers before.
[00:18:30] Host: Yes, you're right on. With that take on it, one of the challenges I'm envisioning, coming from a decentralized environment where we did a pretty good job of centralizing, is that when you're trying to do a content audit, you really need subject matter experts at every corner. Because even you, Nick, if you're auditing a website, you could probably kind of guess: oh, this looks redundant, this doesn't look like it should be here. But how does a school do it? Is it a project for multiple people in different areas to say, you know, we all need to manage our own content? Are there some tools or ways to do this, or are we smart enough to be automating it? It seems such a manual lift, almost intimidating for schools.
[00:19:15] Guest: Absolutely. And I think this is a place where bringing in a consultant to assist and help project-lead, so that they're taking on a lot of the heavy lifting, really can help. The SMEs are then brought in at the point where they can fact-check certain items, so that they're not doing all of the work of the audit, but they're certainly involved in the creation of the content. Because I think the initial overhaul is a one-time job, and if you do it right the first time, then you're really just doing maintenance after that. So have somebody help manage that, lead it, and bring the findings to the team so that they can decide what needs to be done based on the issues that were found. Instead of me spending hours as a staff member trying to find the issues, let somebody else highlight those issues for me and work up a report, and then let's decide as a team how we're going to handle those issues and create policy from that going forward.
[00:20:15] Host: So I want to talk a little bit about the intersection of AI and content management. And one quick story I have: the first time I experienced ChatGPT, I thought, wow, every CMS should now be scanning its own content and creating the metadata that doesn't exist. Right? We have all the information about this page; let's create the metadata with AI and just have that done. And I've seen schools trying to do this, and it's more challenging than you would think, with all the different jargon and acronyms; the way higher ed labels things isn't the language people search with. So even that no-brainer use case I thought should just happen, automatically creating metadata, is challenging. Which kind of leads me to this question: what trends or innovations do you foresee at this intersection of AI and content management?
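A crude sketch makes the host's point concrete. The toy metadata generator below just ranks frequent non-stopword terms (no AI involved; the page text, stopword list, and keyword count are all illustrative assumptions), and it shows exactly the failure mode described: institutional jargon like "bursar" dominates the generated keywords, while the plain language students actually search with ("pay my bill") never appears.

```python
import re
from collections import Counter

# Minimal illustrative stopword list; real systems use much larger ones.
STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "for", "in", "is", "your"}

def draft_metadata(page_text, max_keywords=5):
    """Naive first pass at auto-generating keyword metadata: the most
    frequent non-stopword terms on the page. Jargon-heavy pages yield
    jargon-heavy metadata -- the mismatch the host describes."""
    words = re.findall(r"[a-z']+", page_text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return [w for w, _ in counts.most_common(max_keywords)]

text = ("Contact the bursar to settle your bursar account balance. "
        "The bursar office handles tuition payment and payment plans.")
print(draft_metadata(text))  # 'bursar' ranks first; 'pay my bill' never appears
```

Closing that gap is what makes the problem genuinely hard: the metadata has to be translated into searcher vocabulary, not just extracted from the page.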
[00:21:06] Guest: So I think you bring up a really valid point. And this is where I still do believe that, regardless of the technology that's being used, there are many providers that are specifically higher-ed focused around AI communication and AI student engagement. And a lot of these companies have spent a tremendous amount of time layering their own models on top of large language models, models that truly understand higher education terminology and vernacular. And I think it's really critical, because these tools really do understand what is being said at a college, so that when a student asks a question, it can interpret that and match it. Because, you know, one of the things I found really funny for a long time, and I'll tell a quick story: as we were showing colleges the bot when we first built it, early on, you know, 2018, 2019, we would have leaders on the call say, well, ask it, what's R2T4? Ask it something like course program of study. Right.
The student is never going to ask the question that way, right? For regulatory purposes, we want to make sure that we're recognizing some of these terminologies. But we needed to do quite a bit of work on the back end to say that "yo, where's my money" and "when am I going to get my refund" are the same thing. And that's where I think the higher-ed vendors that provide this type of solution, that are really focused on this niche industry, have done the extra work to really understand that. Or, from a ZogoTech perspective, for example, we work with colleges to build a data dictionary. That data dictionary is so critical in terms of being able to define: what do you mean when you say enrollment? Because there could be eight different definitions of that. What do you mean when you say GPA? So that in the future, when ZogoTech is getting ready to release a reporting tool that utilizes AI to build reports based on questions, it's kind of like an IR AI assistant, so to speak, the reason we are able to provide something like this is that data dictionary that defines the semantics underneath. So language understanding is something that you should be digging into with the vendors that you work with, to really understand how they are approaching higher education vernacular and how that's being interpreted in terms of providing a response. And that's less about what a school can do and more about what's being provided. And that's where I would be digging and asking my questions.
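The data dictionary idea can be sketched as a simple structure mapping each canonical term to its definition and to the informal phrasings students actually use. Everything here is hypothetical illustration, not ZogoTech's or Ocelot's implementation; real platforms layer trained language models on top, where this sketch uses exact-phrase lookup.

```python
# Hypothetical mini data dictionary: canonical term -> definition plus
# the informal phrasings students actually use for it.
DATA_DICTIONARY = {
    "refund_disbursement": {
        "definition": "Date excess financial aid is paid out to the student",
        "phrasings": ["where's my money", "when am i going to get my refund"],
    },
    "r2t4": {
        "definition": "Return of Title IV funds after a withdrawal",
        "phrasings": ["do i owe money back if i drop out"],
    },
}

def canonical_term(question):
    """Map an informal student question onto the canonical term it refers
    to, so differently worded questions resolve to the same answer."""
    q = question.lower().strip("?! ")
    for term, entry in DATA_DICTIONARY.items():
        if any(p in q for p in entry["phrasings"]):
            return term
    return None

print(canonical_term("Where's my money?"))
print(canonical_term("When am I going to get my refund?"))
# Both resolve to the same canonical term.
```

The payoff is consistency: however the question is worded, the answer is generated from one agreed definition, which is exactly what the dictionary-backed reporting tool depends on.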
[00:23:52] Host: Yeah. In the first part of that, you talked about how we need to match that vernacular to the way a student searches for tuition costs. I like to talk about that one too, because if a school is doing well, they're going to be matching a query like "what will college cost me?" and showing tuition results, if they can figure that out. But I think we need to leapfrog that. When someone searches for "what will college cost me?", they need to find not only tuition costs. What about international student tuition, in-state and out-of-state rates, scholarships, financial aid, all the stuff the student doesn't think to ask? We need to have that ready, and AI, I think, is going to be figuring it out. But the reality is that the stakes are so high. We see all these examples in the world of, oh, everyone's just using it right now, but look at the hallucinations we've seen, the problems. I've seen airline tickets being booked incorrectly; we've heard of these. The stakes are so high that higher ed can't be the one that just jumps on board right away. So it has to be this kind of hybrid, curated approach, is kind of my take.
[00:24:58] Guest: I totally agree with that. I think it absolutely needs to be a hybrid approach. I think some caution should be taken in terms of how some of these things are being done. And then I think it's about looking at the approaches that different people take. So, for example, from my experience with Ocelot: Jeff, you brought up the example of tuition, and I really like this example. You can answer that question in a variety of different ways. If this is a prospective student that hasn't registered for classes yet at a community college, for example, then you really may be giving them a ballpark figure. But if this is a student who's registered for classes and they're asking about tuition, right, Ocelot offers integration with student information systems, LMSs, ERPs, all of these different systems, so that students can log in and get a personalized response. And now this becomes less about an arbitrary understanding of the English language and amalgamating an answer, and more about getting students a true response. And this is where I think, to your point, a hybrid approach, where we're delivering personalized information and the AI is just interpreting the question coming through the door so that it can say, okay, I'm pulling this from your account and I'm giving you this answer, is really where true intervention can occur. So another example of that might be a student that comes in and asks, when is my registration date? And the bot serves up the opportunity to log in and get a personalized answer. And it responds and says, you know, your registration date is this day at this time; however, you have a hold on your account for a past-due library fine for $15, or you have not met with your advisor, and this is their name and their email address, and you need to meet with them before this date in order to register.
When students are asking questions, ultimately they are trying to get to an end goal. And when I worked in college financial aid, when we would interview people, I was always looking for this answer from new counselors: I am trying to figure out what questions the student isn't asking me that I need to answer for them while they're sitting in front of me, while I have them captive for this moment. And that doesn't change. And so with AI, it's figuring out how to leverage some of that natural language understanding to provide 24/7 support. But it's really about the logic in terms of what's being provided to them in a personalized response. If they ask this, we should be checking this, this, this, and this in the system before we give them an answer, and give them a holistic response. Right? Not just their registration date, but anything that would hold them up from registering, because that's really what their question is. They're not asking just when they're going to register; they want to make sure they can.
[00:28:00] Host: And they don't know what to ask.
[00:28:01] Guest: Right.
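The holistic-response logic the guest walks through, answer the question behind the question, can be sketched as a short routine over a student record. The record fields and wording are hypothetical stand-ins for what an SIS integration would actually return after the student logs in.

```python
from dataclasses import dataclass, field

@dataclass
class StudentRecord:
    # Hypothetical fields pulled from an SIS integration after login.
    registration_date: str
    holds: list = field(default_factory=list)
    advisor_cleared: bool = True

def registration_answer(student):
    """Answer the question behind the question: not just the date,
    but everything that would stop the student from registering."""
    lines = [f"Your registration date is {student.registration_date}."]
    for hold in student.holds:
        lines.append(f"However, you have a hold on your account ({hold}).")
    if not student.advisor_cleared:
        lines.append("You must meet with your advisor before you can register.")
    return " ".join(lines)

s = StudentRecord("March 3 at 9:00 AM",
                  holds=["past-due library fine, $15"],
                  advisor_cleared=False)
print(registration_answer(s))
```

The AI only has to interpret the incoming question; the value comes from the deterministic checks bundled into the reply, so the student learns about the library fine and the advisor meeting without knowing to ask.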
[00:28:01] Host: We need to give it to them. And this is going into personalization, which we'll have to do a whole other podcast for. Or you should listen to the other episodes, where I talked to Dallin Palmer from Halda and Ardis Kadiu from Element451, who are doing that very thing. So I'm going to ask you this last question and then we'll wrap it up: what do you believe is the most important question that institutions should be asking themselves right now as they consider the future of AI and its impact on learning?
[00:28:29] Guest: So I think that the question that they should be asking themselves is, how do I make my campus AI ready?
I think for a lot of folks, the pandemic accelerated the need for these types of tools, and the technology was emerging as this massive world event was occurring. It resulted in adoption of this technology much faster than I think would have happened otherwise. And what occurred there was this concept of: we need AI support because we need virtual support. And really now, as we take a step back, the question I would have asked before we did that quick adoption of the technology, which was really kind of necessary for the time, is: what do I need to have done to be ready to bring AI onto my campus? Is my website up to date? How am I going to do that? How am I going to maintain that content? Do I have a data dictionary where I can define those terms, so that if I bring in an AI reporting tool, it knows what I'm asking about?
What are our security policies around AI? Right. You know, I do think that schools should be asking their vendors this question: if you're utilizing generative AI and you know who the student is, where is that data living? I know that in your platform you're not sharing it with anybody, but who's powering your generative AI? Is OpenAI storing this data, which then is out of your control? How do they choose to use it? So these are some of the things I would be getting in place now, and then figuring out what technology marries with the policies that you've decided on, in terms of what's going to make you the most useful AI campus, or the most useful campus in terms of how you want to utilize AI in the future. But I think that step has been skipped because of the need for quick adoption. And I think we all need to take a step back and really ask ourselves: what do I need to do to be AI ready? Because AI tools are going to continue to evolve.
They're evolving quickly. And so I expect to see in the next five years some real advancement here. And the question really is, do I have a foundation to be prepared as these technologies continue to evolve, to continue to bring them on? Because at some point, you're going to get to a place where if the website content isn't right or you don't have a certain piece of foundation, you can't bring on that technology until you fix that first.
[00:31:05] Host: I totally agree. So everybody get your content ready. AI is coming, and I want to thank you for being on the show.
[00:31:11] Guest: Thank you so much, Jeff. Great to be here.
[00:31:17] Host: As we wrap up this episode, remember, EdTech Connect is your trusted companion on your journey to enhance education through technology. Whether you're looking to spark student engagement, refine edtech implementation strategies, or stay ahead of the curve in emerging technologies, EdTech Connect brings you the insights you need. Be sure to subscribe on your favorite podcast platform so you never miss an inspiring and informative episode. And while you're there, please leave us a review. Your feedback fuels us to keep bringing you valuable content. For even more resources and connections, head over to edtechconnect.com, your hub for edtech reviews, trends, and solutions. Until next time, thanks for tuning in.