Stearns Center offers guidance on generative AI tools


In September, UNESCO reported that the “education sector is largely unprepared for the ethical and pedagogical integration” of generative artificial intelligence (AI) tools, which produce a variety of new output such as images, video, music, speech, text, code, or 3D renderings, in schools. The UNESCO article also referenced a recent global survey of over 450 schools and universities that found fewer than ten percent reported having “institutional policies and/or formal guidance concerning the use of generative AI applications.” To complicate matters, universities' responses to generative AI vary and continue to evolve, ranging from total bans to encouragement and incorporation.

As for George Mason University, “There is no ‘one policy to rule them all’ that faculty can use to respond to such a complex, evolving set of tools and opportunities,” said Shelley Reid, executive director of the Stearns Center for Teaching and Learning. “The stakes are high, and it can feel like the tools and expectations are changing daily.”

As one indication of the need for transparent policies, Faculty Senate recently received a resolution from student government requesting clarity regarding generative AI policies in the classroom. The resolution was passed to the Academic Policies Committee for further review. To assist faculty, the Stearns Center for Teaching and Learning now offers workshops that explore how generative AI tools can help students produce authentic, high-quality work.

“Generative AI tools produce pretty accurate responses to questions that ask for a summary or description of general concepts,” said educational developer Laina Lockett. She noted that when incorporating AI into their courses, faculty can adjust assignments to call on students’ analytical, integrative, or applied thinking and encourage them to learn and practice with the available tools. Until a formal university policy is established, Lockett said it is important for faculty to explain why students can, or should not, use these tools.

For faculty who believe student assignments have been generated with AI tools in ways that go beyond what their classroom policies allow, using AI detectors to determine whether cheating occurred is not recommended.

“The detectors are unreliable and can reproduce social biases about what constitutes good writing,” said Tom Polk, director of Writing Across the Curriculum. “If faculty feel comfortable, they can contact the student to ask for more information about how the work was composed, or simply evaluate the work based on the stated criteria, since generative AI tools often fail to meet nuanced expectations.”

In preparation for the spring semester, faculty are encouraged to engage with and learn more about the technology’s affordances and constraints, which can then influence how they design a policy that works for their courses. “When doing this, it is important for faculty to also create assignments that invite complex thinking to match their course goals and to share their expectations as transparently as possible,” said Crystal Anderson, Stearns Center associate director.

The Stearns Center website provides sample policy language and other resources to help faculty plan, including an article by Stearns Center educational developer Rachel Yoho. Academic units that would like to hold more area-specific conversations should contact the appropriate Stearns Center liaison.