In January, the University formed an advisory council of faculty from each school to provide input on artificial intelligence resources and training for professors.
The Instructional Core within the Libraries & Academic Innovation office formed the council between December and January to gather faculty feedback on the training they need to implement and leverage AI in their courses. The council includes faculty representatives from each school and aims to align faculty training on AI with the guidelines the Office of the Provost released in April, according to Geneva Henry, the dean of LAI.
Henry said the council will provide LAI with feedback on how to improve the resources the University provides to faculty on the use of generative AI tools in the classroom, as well as facilitate discussion on AI’s use in teaching and increase faculty awareness of programming dedicated to using AI in courses.
“In response to the concerns and questions expressed by GW faculty, the Instructional Core has been extremely active in bringing together interested faculty for discussion and developing workshops and educational media to help faculty discover ways to respond to and leverage genAI in their teaching and student learning,” Henry said in an email.
The Office of the Provost released a set of guidelines in April for the use of AI at the University, which left the decision of whether to allow AI in courses up to instructors. The guidelines’ default rules state that using AI to study is permissible but that submitting AI-generated material for an assignment or using AI during an assessment is cheating.
Douglas Crawford, a member of the council and an assistant professor of interior architecture, said the council will provide necessary guidance to faculty in addition to the “boilerplates” the University provides professors to use in their syllabi.
“It’s the Wild West if the University is not there to support the faculty and say, ‘Here’s something to start with,’” Crawford said.
Crawford said moving forward, the goal of the council is to act as a knowledge base for faculty looking to implement or restrict AI use in their courses. He said concise language on AI rules is a useful resource but that the choice to use AI should be individual to each professor.
“There’s different use cases for these things,” Crawford said. “I think that needs to be sort of noticed or highlighted a bit more again, what these things are being used for in my program is a totally different product than what generative AI at large is being used for.”
Crawford said besides providing baseline guidance to professors, he hopes the council will promote a more open dialogue on the use of AI in education throughout the University and encourage the discussion of its potential benefits and drawbacks. He said he encourages his students to use AI as a starting point for projects because it can provide more tailored inspiration than platforms like Google Images or Pinterest.
“If it was regulated or withheld from us, I would see that as a huge detriment,” Crawford said. “It would be like removing Photoshop and Adobe Illustrator and other [tools] that my students use to create graphic content.”
John Helveston, a member of the council and an assistant professor of engineering management and systems engineering, said the introduction of AI into classrooms has pushed educators to rethink what they want to accomplish and how they organize course content to make students think critically. He said content the council can produce, like video demonstrations and workshops, could make information about AI and its incorporation into academia more accessible to faculty.
“This council is a self-selected group of people who are interested in this topic, who want to learn best practices,” Helveston said. “So the tools that we come up with, we want to make those accessible to people.”
Helveston said the council brings together faculty across a range of disciplines, which allows members to “organically” learn uses and practices for AI from each other. Helveston added that the interdisciplinary interaction can help professors who aren’t as knowledgeable about AI to gain a deeper understanding of the tool and its uses, especially in educational contexts.
“We need to learn from each other how we’re using it. I want to hear from other faculty on ways that they’re experimenting with it to see what’s working, what’s not,” Helveston said. “It’s also new and things are changing so fast that I think if we just stay in our own silos, we’re not going to see the latest thing that’s happening.”
Lorena Barba, a member of the council and a professor of mechanical and aerospace engineering, said the council had a “grassroots” origin that bodes well for faculty’s willingness to collaborate in discussions surrounding AI in a way that hasn’t been commonly seen at universities.
“GW has a unique opportunity to be at the forefront, and many members of the AI Advisory Council are courageously embracing the challenge,” Barba said in an email.