ChatGPT grabbed headlines this year for making generative AI—tools that can create text, images, and other media using predictive algorithms trained on large data sets, often scraped from the open internet—accessible to millions, and for the ethical and sometimes existential questions its growing use raises. This simplified the work of Core’s Top Technology Trends Committee: “When ChatGPT exploded on the scene, we immediately knew what that top tech trend was,” said Kate Delaney, committee chair and public services librarian at Philadelphia College of Osteopathic Medicine.
Delaney moderated the “Core Top Ten Technology Trends: Libraries Take On ChatGPT” panel Saturday, June 24, at the American Library Association’s 2023 Annual Conference and Exhibition in Chicago. Four panelists from public, academic, and school libraries shared thoughts on what library workers should know about the technology, how it’s being used, and the potential risks and benefits.
“Students are already way ahead of us in using [AI tools],” said Hannah Byrd Little, director of library and archives at The Webb School in Bell Buckle, Tennessee. She recounted asking sixth graders to define a word several years ago, only for them to pull out their iPhones and ask Siri. They’re using tools like ChatGPT for information seeking and to supplement critical thinking in their writing. “They’ve grown up using AI.”
Generative AI’s growing ubiquity means that libraries risk irrelevance by ignoring this technology, said Jonathan McMichael, undergraduate success librarian at Arizona State University. “This is going to be a big part of a lot of students’ lives and a lot of our users’ lives,” and should be integrated into the classroom and other learning opportunities, he said. “You can’t have a house without a foundation,” added Trevor Watkins, teaching and outreach librarian at George Mason University. When teaching students about ChatGPT and other generative AI tools, he starts with the fundamentals of information literacy, which are sometimes not well developed, especially among students early in their studies.
For those concerned about students using AI tools to cheat, panelists suggested thinking about the context in which it’s used. “Is it enhancing critical thought or replacing critical thought?” McMichael asked. “Don’t be afraid that it’s doing some of the work for the students.” It can also be helpful to share how ChatGPT might answer an essay prompt in class, and then dissect where its answer works well and where it falls apart, said Fernando Aragon, learning labs supervisor at Gwinnett County (Ga.) Public Library.
Watkins cautioned that some are treating ChatGPT as a citable source, and style guides are creating citation formats for it. However, “if you ask it to cite a source [for an answer it generated], it’ll usually just make something up. It’ll just create authors that don’t exist,” Watkins said. Authority is something Little has emphasized to her students. “I really drilled into the heads of my kids before they left for the summer: This is a tool, not a source.”
Anyone who uses generative AI tools like ChatGPT should know how they create answers to prompts. The panelists encouraged people to experiment with AI tools themselves. It’s especially helpful to give the tools prompts about subjects you’re well-versed in, Aragon said. This gives you a better idea of where the technology may have sourced its answers and where it comes up short. McMichael also suggested taking a prompt engineering course, because writing good prompts requires understanding how the tools generate answers. Watkins suggested a “ChatGPT listening party,” where a group plays with the tool and then discusses the results.
“Also keep an eye on sources that may not be so great,” Aragon added. Reviewing YouTube and other sites that the general public turns to for information can help keep you aware of the kinds of misinformation circulating so you can counter them.
In addition to helping students and patrons understand and use generative AI tools, libraries also deserve a place in the debate around ethical considerations and the future of the technology. Watkins pointed out that the US Congress has held hearings about AI this year, but lawmakers were “befuddled” as to how to regulate the industry. The US is far behind the international community in terms of AI regulation, he added. “There is a collective action problem here,” McMichael said. There is individual incentive to use AI because it can help you get ahead, but there is risk to everyone when the tools are misused or weaponized. “Libraries are a really good place to be talking about this stuff because we’ve been thinking about collective action for information for a long time.”