Harnessing AI: AAI and KU Researchers Advance Innovation and Critical Understanding Across Fields

LAWRENCE — As artificial intelligence has become more advanced and easily accessible, KU researchers are looking to better understand the technology and the ways it can be used in various disciplines and industries. Maggie Mosher, assistant research professor at the Achievement and Assessment Institute, said that, despite the perception that AI use is new at the university, KU is home to numerous exciting AI projects.
“Although AI use became more prominent in the years following the release of easily accessible chatbots, such as ChatGPT, AI use is not new at KU,” Mosher said. “It is difficult to know all the research occurring in every department at a university this size. I’m grateful this past year for the collaboratives formed, such as the AI Think Tank, to assist faculty and the community in growing more aware of the AI work occurring on campus.”
In 2024 alone, there have been numerous AI studies and projects at KU. Tera Fazzino from the KU Department of Psychology is participating in a project using AI to expand access to fresh food in food deserts. Michael Lash, assistant professor of business, published a paper on incorporating human experts into AI learning models to improve reliability. Sean Smith from the Department of Special Education is exploring the impact of virtual reality on the social-emotional development of struggling learners.
Artificial Intelligence has been around for decades in one way or another, just perhaps not the type that has recently taken off and turned the world on its head. With the emergence of ChatGPT and other tools like it, the way we think about AI and technology has dramatically changed in a short period of time.
“AI’s current pervasiveness and growth is why researchers need to pay attention and recognize that forms of AI were in use decades ago,” Mosher said.
“Machine learning, natural language processing, and voice recognition have been a part of everyday tools in Microsoft’s Excel and PowerPoint, Amazon’s Alexa, and Apple’s Siri. Precursors to our current large language models, like Poe, ChatGPT, and Microsoft Copilot, date back to 1966 with the development of ELIZA, a natural language processing computer program developed at the Massachusetts Institute of Technology that simulated conversation. These tools aren’t going away. If anything, they are only becoming more accessible.”
Faculty at KU work hard to guide students on their path to meaningful and impactful careers. Employers across various industries are increasingly prioritizing candidates with AI skills. A 2024 report by LinkedIn and Microsoft revealed that 71% of leaders would rather hire a less experienced candidate with AI skills than a more experienced one without them. The 2024 Cengage Group Employability Report, covered by Inside Higher Ed, found that 62% of employers from various fields believe new hires must have knowledge of generative AI.
“We must help prepare students to use AI accurately, equitably, and effectively,” Mosher said.

AI at the Achievement & Assessment Institute
Mosher was recently invited to present at the Ai4 Conference, “North America’s largest artificial intelligence industry event,” according to the conference’s website. The event took place in August 2024 and included over 8,000 attendees. She was one of four international speakers recruited for the panel titled “Leveraging AI for Education.”
Mosher said that although KU’s AI work has less to do with creation and more to do with understanding and application, many projects underway at the university aren’t being done elsewhere. Mosher works with many centers across AAI to research how AI affects education and how it could change the entire discipline to help educators and learners. She was recently awarded over $5 million across two grants utilizing AI.
One grant from the Office of Special Education Programs (OSEP) capitalizes on AI’s ability to individualize instruction and provide timely, detailed feedback to teach social skills to students with disabilities within an extended reality platform. The grant also includes leveraging AI to provide a free web-based platform that assists educators in generalizing the skills students learned in the virtual world into the classroom. The other grant seeks to use AI to reduce the demands placed on teachers and free up teachers’ time to build meaningful relationships with their students.
Part of Mosher’s work at AAI includes running randomized controlled trials with elementary and middle school students and their educators to determine how best to teach students to use AI safely, effectively, and equitably.
“We use existing AI frameworks with students to help teach them how to determine the accuracy of the information generated by AI and the ethical implications of using it, such as: How do I tell if this information is valid or not? Should any obvious biases be addressed when reporting the AI generated information? Where does the information I type into chatbots go? Once students have this knowledge, they can make informed decisions under the guidance of their classroom teacher and parents,” Mosher said.
Mosher also works with AAI’s Center for Reimagining Education, a center dedicated to using AI to transform teaching and learning in K-12. The center is led by Director Bart Swartz with co-founders Rick Ginsberg, dean of the KU School of Education and Human Sciences, and Yong Zhao, a distinguished professor in the Department of Educational Psychology. It collaborates with school district leaders, teachers, and students to find new and innovative ways to make school more engaging and effective through AI.
The center grew out of a book by Zhao and Ginsberg, “Duck and Cover,” about the lack of change in schools. The book examines why schools struggle with change, including change around technology use at a time when AI is reshaping so much of what we do.
“We want to help educators think about ways to use these tools and reconceptualize what they do so that we're not writing books about how things never change,” Ginsberg said. “How can we transform education away from what we've been doing for 150 years and make school more engaging for kids? That’s what we are trying to figure out with this center.”
CRE aims to create “schools within schools” so that educators can experiment with what works and what doesn’t on a small scale. CRE also ensures that students, school leaders, and teachers are the ones deciding what would work best for their school and students.
“Rather than prescribe what the school should do, we ask them to create something on their own that challenges and disrupts traditional classroom models and empowers students as leaders,” Swartz said. “We are trying to help people support and teach one another as they learn about AI, and one of the great ways to do that is to give them freedom.”
In March of this year, the schools will come together to present their personalized approaches and the outcomes in a showcase. Some examples of what schools have done include creating student-driven afterschool clubs, revamping the school library media center, and offering opportunities for students to work on AI-centered passion projects.
“We want to marry together the power of AI with all of the research that talks about the power of student choice and personalized education,” Swartz said. “We've known for a long time that personalization helps students succeed. I believe we have greater resources than we've ever had to accomplish that thanks to AI.”
Neal Kingston, director of the Achievement and Assessment Institute and vice provost for Jayhawk Global and Competency-Based Education, has long recognized the growing role AI would play in education research and practice as well as the controversy it would generate.
“One of the first things I saw was a knee-jerk reaction that AI would enable students to cheat on their assignments and that it should be banned. Thinking that it could be banned is incredibly naïve, but like any tool, especially a tool as powerful and transformative as AI, students must be taught how to use it responsibly and well,” Kingston said.
To this end, Kingston and AAI at large lent support to several initiatives involving AI within and outside of AAI over the last several years, including the creation of one of the institute’s newest centers, Flexible Learning through Innovations in Technology & Education (FLITE).
Headed by Lisa Dieker, FLITE’s mission is to provide an integrated structure for emerging technologies aligned with student and teacher learning. A significant part of FLITE’s work involves AI in some capacity, including current projects developing artificial intelligence agents to support students with disabilities in inclusive settings. Because of overlapping research interests and expertise, Mosher works closely with the center.
Dieker came to KU from the University of Central Florida where she was a Pegasus Professor and Lockheed Martin Eminent Scholar in the College of Community Innovation and Education. Coming to KU and becoming the director of FLITE has been a significant but not unwelcome change.
“Coming from other institutions that were either very technologically focused or not technologically focused at all, I find KU in the AI space to be at an exciting crossroads between creation and usage which is an area of strength that I'm trying to tap into,” Dieker said. “I think we're probably leading the nation in using AI creatively.”

The KU AI Think Tank
In her previous position, Dieker had close connections with her peers and could chat daily about new ideas and what was going on in other parts of the university. She said that because the KU campus is so spread out and researchers working on AI are not all housed under the same roof, these quick chats are difficult to have. To remedy that, Dieker started an AI “Think Tank” to bring KU researchers working with AI together to share updates and enhance collaboration, especially on proposals.
“I Googled every CV at KU that had AI in it and invited them. I didn't know if anybody would show up, but around 40 people came to the first two 30-minute sessions,” Dieker said. “It was so surprising and exciting to meet so many people doing similar work. I didn't know that we had somebody who wrote some of the international standards for AI through usage, and I wondered, why didn't I know you before?”
The group meets monthly, and members take turns presenting their work and updates on their research. They also take 15 minutes to walk or talk informally, getting to know each other better and brainstorming possible collaborations or proposals.

Ethics, Education, and Innovation
One of the members of the Think Tank is Kathryn Conrad, professor in the Department of English. Conrad takes a slightly different approach to AI research: her focus lies mainly in the ethics of AI and its implications, and she has written extensively in prominent AI journals and presented on critical AI literacy at conferences. She wrote the “Blueprint for an AI Bill of Rights in Education” and serves as associate editor on Education Policy for Critical AI (Duke University Press) and on the founding advisory board for Harvard’s AI Pedagogy Project. Earlier this year she was part of an expert panel advising the UN’s Special Rapporteur on the Right to Education on the use of AI in education.
“Critical AI literacy is about understanding the landscape of AI. It is not just how to use the technology and not just upskilling or applying it but understanding its known harms as well as the opportunities and potential benefits of it,” Conrad said. “And that includes everything from understanding its copyright and environmental implications to its impact on cognitive development.”
Most recently she and Sean Kamperman received the highly competitive NEH Institutes for Advanced Topics in the Digital Humanities grant of $218,732 to fund their institute, “AI & Digital Literacy: Toward an Inclusive and Empowering Teaching Practice.” The institute aims to be a place where humanities educators can discuss and brainstorm solutions to the challenges presented by AI’s rapid evolution and integration into education.
“We really want to be able to think about AI, not only develop it but also think critically about it in a variety of different areas before it's deployed more broadly,” Conrad said.
Conrad also believes it’s important to talk to students about AI, to ensure they feel comfortable bringing it up with their teachers, and to keep teachers’ and students’ personal views of AI in mind.
“We should be transparent about the work that we're doing. It’s great if instructors at KU employ AI if they're doing it with full critical knowledge, but I think it's important for teachers to have the critical literacy to be able to make those decisions and for students to be able to opt in or out,” Conrad said. “Some students don't want to be doing something with AI if they think it's unethical, so it’s important that educators offer an option to do something else if AI use is required.”
On the more technical side is AI in computer science. Jennifer Lohoefener is the associate director of KU’s Institute for Information Sciences (I2S) and an assistant research professor in the Department of Electrical Engineering and Computer Science (EECS). Her research primarily focuses on formal methods, employing mathematical frameworks and technologies to rigorously prove and verify properties of software systems, with a particular emphasis on AI within educational technology.
“My end goal is to figure out if we can make these systems trustworthy. Can we create some assurances around these systems and guarantee that they’re going to do what they say they're going to do and not what we don’t want them to do?” Lohoefener said.
To answer these questions, Lohoefener is working with James Basham, professor in the Department of Special Education and director of AAI’s Center for Innovation, Design, and Digital Learning (CIDDL). CIDDL supports the use of educational technology in early childhood special education and K-12 learning environments to improve outcomes for students, especially those with disabilities. In 2024, Basham served as an advisor on a report on AI integration in education from the U.S. Department of Education’s Office of Educational Technology (OET). The report, titled “Empowering Education Leaders: A Toolkit for Safe, Ethical, and Equitable AI Integration,” serves to support education leaders in adopting AI in the classroom in a way that protects students.
Together, Basham and Lohoefener are looking at modeling classroom experiences to help develop AI that enables students with disabilities to be more successful learners. Part of the work involves trying to break down what artificial intelligence even means in terms of what we feed it, especially as it is very human-based.
“Much of AI research focuses on areas like self-driving cars and gaming systems, where humans interact with the technology but aren’t the primary subjects being modeled. In the classroom, however, we are increasingly seeing these two domains, AI and human behavior, becoming more closely intertwined,” Lohoefener said. “I think that's where we're headed longer term. There’s going to be less and less space between the human and the machine, and the question is, can KU get ahead of that from a trustworthiness or explainability perspective?”
Lohoefener said that AI has not only affected her research, but the entire computer science and engineering department. She said that she can see a future where there won’t be a need to teach computer scientists how to program because AI can easily create code through a prompt.
“Students need to have a basic understanding of how to write code and how to be software engineers, but it has to be more than that now. AI is completely shifting the curriculum that we will be teaching to them in the future, and what does that look like? And what does the delivery of that even look like?” Lohoefener said. “It’s sort of all of these ramifications and implications of AI that I find way more interesting than the technology itself.”

Looking Ahead
In March of this year, Dieker, Basham, and Eleazar “Trey” Vasquez III, a professor and director at the University of Central Florida who works closely with CIDDL, will be keynote speakers at the Council for Exceptional Children’s Special Education Convention and Expo in Baltimore, Maryland. The presentation, “Envisioning Tomorrow for All Learners: AI Revolutionizing Education,” will highlight the latest research in special education and how AI is transforming teaching and learning for students with disabilities.
Dieker said that events such as these are important for the future, as KU needs to be thinking strategically about how it moves forward with AI by offering support and seeking out partnerships and collaborations in the AI world. She also said playing into the university’s strengths and highlighting the way KU researchers are approaching AI from an educational and analytical perspective will give KU a leg up.
“The people in the Think Tank aren't all experts, but they're all open to becoming better users. The mistake people make is thinking everybody needs to be a creator and an expert, but we all need to understand the dangers of what we're using, and that AI isn't really that novel. We’ve been using it for 20 years, and now it's just got better toys and an upgraded playground,” Dieker said. “I think what is more important than being an expert or having all the new technology is to learn from each other. And that's what I think KU is very open to doing, and maybe that's not true everywhere.”
###
About Achievement & Assessment Institute at KU: The Achievement & Assessment Institute (AAI) is one of 12 designated research institutes at the University of Kansas. AAI and its centers partner with numerous agencies to improve the lives of children and adults through academics, employment, career advancement and building healthy environments, as well as to enhance the capacity of organizations that help children, adults and communities succeed.