An Open Letter to Higher Education Administrators about AI in Higher Education
There's a serious threat to your academic programs and plans: the AI Bogeyman!
I teach at Georgia State University, one of the most forward-thinking, pro-student, pro-technology, pro-success higher education establishments in the world. I am constantly astonished by the level of research, innovation, and teaching excellence that happens at GSU, and I am proud to be a member of the faculty. Our college president, Brian Blake, has an ambitious plan known as the “Four Pillars”: Research and Innovation, Student Success, College to Careers, and Identity and Placemaking. His plans, like the plans of most college presidents, depend heavily upon adoption by the faculty and staff at the college and their translation of those goals to our student body.
For twenty years, my university and universities across the nation have been using technology in our teaching, especially through learning management systems, which have not only revolutionized online teaching and accessibility but have also provided us with data about which students are succeeding, which students are falling through the cracks, and what can be done to help them succeed. We have leveraged that information to identify the best techniques for teaching and learning, and we have endeavored to pass that valuable knowledge on to our faculty through strong professional development programs in the science of teaching, learning, and technology.
Everything was going great until now. Unfortunately, there is a serious problem with the plans of college presidents, provosts, institutional researchers, strategic planners, assessment professionals, institutional effectiveness officers, access and accommodation counselors, and accreditation officers across the nation, who depend upon the technological revolution in teaching to advance ambitious goals in research and innovation, student success, and college-to-career programs. It involves Generative Artificial Intelligence (GAI), but it’s not what you think. It doesn’t involve “student cheating.” It is the opposite.
Unfortunately, the amazing strides we have made in teaching and technology are being reversed, as we speak, by professors who are now teaching without any technology at all. These faculty, the “AI-Luddites” (my term), are so afraid of the AI Bogeyman that they have begun regressing to pre-2000 educational techniques. Their fear that a student might use GAI to cheat in their classroom is pushing them to extremes. Many of my students report that their professors are making them write all of their papers and take all of their exams with pencil and paper in the classroom. One of my colleagues is teaching his students to write MLA papers. In handwriting. In class. Others are going back to bluebooks and requiring online students to come onto campus to take their exams. The trend is serious and ominous.
Here are some reasons the AI-Luddite trend undermines educational progress and hits campus goals hard. Let’s look at how that attitude directly affects my own college president’s goals as an example:
PILLAR 1: RESEARCH AND INNOVATION
Understanding new technologies, like AI, is important. Research and innovation occur in environments where new information and new technology are encouraged and celebrated. How can a student, excited about GAI and its capabilities, find support for research and innovation in that field in an environment where faculty members are hell-bent on stamping out any mention of GAI, let alone encouraging the use and advancement of that technology? It is not just humanities and social science professors who are freaking out about AI—their STEM colleagues are also imposing technology blackouts by eliminating the use of certain calculators and computers in math and science courses. This overreaction to GAI is like a wet blanket thrown over the fire of inspiration and experimentation that our students feel when facing the possibility of a GAI-informed future.
PILLAR 2: STUDENT SUCCESS
First-Generation and Low-Income (FGLI) students depend upon technology. We have been preaching about the opportunities available through Grammarly, Microsoft Word, and other grammar and spelling assistants for at least five years. Students who grew up in underprivileged circumstances often depend on these programs to compete on an even level with students who have not faced economic and academic barriers to college success. Now, those students are being forced to handwrite their responses for a professor who is likely to count spelling and grammar errors against them. Those students get lower grades, and they are less likely to qualify for scholarships and other financial support they depend upon to finish school. In addition, being treated as untrustworthy with GAI, a free technology that may help them achieve parity with their peers (for research, personal tutoring, etc.), may feel not only unhelpful but also demoralizing.
Data-driven student success programs need to measure important benchmarks. If students are forced to do everything by hand, their work will suddenly be absent from Learning Management Systems and other data collection instruments. This means that the vital data that drives student success initiatives on many campuses will also be absent. Maybe we will still have some attendance or grade records, but important information, like how many times students have accessed the LMS and the quality of the work submitted, may no longer be available. In addition, institutional effectiveness may be strongly affected as especially vulnerable groups of students fail to thrive in the anti-technology world of AI-Luddism.
PILLAR 3: COLLEGE TO CAREERS
College-to-career programs need to focus on what employers need in the workplace. What do employers want right now? They want fully functioning employees who not only understand communication and technology but also understand and know how to use GAI. Because GAI is a productivity multiplier, employers will get far more out of an employee who understands and uses GAI than out of an employee who does not. I see very few transferable skills in a student handwriting an exam or an essay in front of a suspicious and terrified instructor. I’m trying to envision a moment in their working life when they will be required to handwrite an important document, and I confess I cannot. It isn’t useful, and it isn’t normal. We should never adopt a lesson that does not carry, in some part, a baked-in understanding of how it will prepare our students for their future—either in academia or in employment.
PILLAR 4: IDENTITY AND PLACEMAKING
Disabled students depend on technology. Whether those students are physically disabled and cannot write long passages by hand, have a sensory disability such as low vision or hearing loss, or are learning disabled or neurodivergent and cannot read or write without some technology interface, the AI-Luddites are destroying their opportunity to participate in an academic environment where their accommodations are somewhat invisible to other students. When they are forced to be the only student in the room who is allowed to use technology, their accommodations become a badge of difference and, perhaps, embarrassment. In an AI-Luddite’s classroom, students’ accommodations are no longer invisible, and those students are clearly experiencing the class in a different mode from their peers. Many students with hidden disabilities, like those with learning disabilities or arthritis, are less likely to use accommodations in an effort to “blend in” with their peers and instead try to “make it on their own” (Disabilities, Opportunities, Internetworking, and Technology). Such a student may struggle significantly because of their disability, resulting in lower grades and fewer opportunities.
Although these AI-Luddites are not intentionally vandalizing the opportunities of our students and our institutions, they are proceeding in an illogical and harmful manner in their classrooms because they fear cheating. But cheating has always occurred, and it has never before caused our faculty to completely disregard important tools and technologies or to abandon so many students’ opportunities. This appears to be a wholly new phenomenon.
So, it is likely not cheating that they are so fearful of; it is GAI itself. Many of the AI-Luddites have no understanding of what GAI is, how it works, what it does, or how they should prepare themselves for a new world filled with GAI. What’s more, many don’t intend to learn about it. Instead, they hide in the past, adopting antiquated methods that constrain the students in their courses far beyond what is needed to prevent cheating with GAI. They are hobbling these students’ abilities and futures, and they are hobbling their institutions’ goals to serve students with the best education possible. There are ways to teach both with and against GAI in higher education, but one must know GAI in order to do so. Ignorance will only harm our students and our institutions. We need to make an effort to find these AI-Luddites and help them face their fear. It is important to our students, to our institutions, and to the future of higher education in general.
If you are interested in helping your own faculty learn how to teach with and against AI, please consider inviting me to speak at your institution. You can contact me at mkassorla@gsu.edu.