The GW Hatchet

AN INDEPENDENT STUDENT NEWSPAPER SERVING THE GW COMMUNITY SINCE 1904

Professors to add AI exercises to courses, syllabi

Professors say AI is here to stay, so some are embracing it in their courses.
Sage Russell
OpenAI’s ChatGPT program, which professors are now working to harness as an academic tool.

Professors are implementing artificial intelligence technology in their courses this fall to supplement students’ studies and acquaint them with AI’s growing prevalence in the workforce.

Five professors said they are integrating AI materials into their courses and syllabi, offering students coaching resources for assignments and assigning AI-related exercises to stimulate discussion on the subject. They said students can use AI tools to support their academics rather than to cheat, and that learning to do so will prepare them for AI’s expanding role in the professional world.

The professors said they were generally satisfied with the AI guidelines the Office of the Provost released last spring, which left the decision of whether students may use AI materials up to individual instructors and provided a set of default rules for professors who did not create their own. The default rules state that submitting AI-generated material for an assessment, or using the tools during one, is cheating but permit using AI to study for assessments.

The professors added that AI can be a useful tool to help students learn class materials and that knowing how to use it will be an important, in-demand skill in the workforce once students graduate.

Alexa Alice Joubin, an English professor and the co-director of the Digital Humanities Institute, said she started using AI tools in her classroom in 2021 to help students with their writing assignments and design research questions. She said generative AI, which produces new content in response to prompts, is often subject to “hype” in the media and academia that makes it seem capable of writing entire papers, when in reality the technology is still at a “very basic” stage.

“It’s a tool,” Joubin said. “But it can accomplish only very limited tasks, not discursive tasks.”

Joubin said she asked an AI system to design a frequently asked questions section for her syllabus and uses Packback, an AI writing tutor, in her classes to give students writing coaching, like grammar and style corrections. She added that she uses AI with students to help them refine their research questions to be more open-ended, rather than yes-or-no questions or questions that might not suit their research topics.

Joubin said students should disclose when they use AI as a tool for writing, akin to how researchers disclose what scientific equipment they use to complete experiments.

Joubin said she has her class complete multiple discussion boards instead of a midterm paper and converted her final exam into a two-minute film project to ensure her students produce “meaningful” and “personal” assignments. She said she made the change partly to deter cheating with ChatGPT because she feels students won’t outsource work they are passionate about to AI.

“People write better when they are debating, discussing, they genuinely have something to say from their heart as human beings,” Joubin said.

Despite some faculty members’ willingness to integrate AI into their classrooms, many professors have raised concerns during Faculty Senate meetings over how students’ use of AI materials can violate academic integrity rules and how AI will affect their courses if used to cheat.

Ryan Watkins, a professor of educational technology leadership, said he started allowing students to use ChatGPT last spring for parts of assignments so they could interact with the platform. He said one of his assignments last semester included a section where students had to hold a conversation with ChatGPT and then discuss what they thought its “implications” were.

Watkins said he hopes professors are “intellectually engaging” with AI materials even if they don’t allow students to use them in the classroom. He added that he hopes professors consider how AI skills might be valued in the workforce once students graduate.

“I don’t think they’re doing a service to our students not to keep up on this technology,” Watkins said. “It is much bigger than the cheating issue. It’s substantially changing the field that I work in already, and it’s only a year out.”

Watkins also said introducing AI materials in the classroom presents instructors with the opportunity to refocus what they want students to learn in their courses.

“What it is that we want students to learn and be able to do when they leave the classroom should be our focus,” Watkins said. “And if those are things that a tool like ChatGPT can already do, then is it really something that we want to be spending our time helping students learn to do?”

Kathryn Kleppinger, an associate professor of French and Francophone studies and international affairs, created an AI statement for her classes this semester and shared it with her colleagues. The statement says students can use AI as a discussion partner in her classes but warns that AI often makes factual errors, like misattributing text. She added that students miss out on developing literary analysis skills when they rely on AI tools like ChatGPT for their writing assignments.

“AI is here to stay,” Kleppinger said. “The most important thing we can do is think carefully about what we’re using it for and why.”

Kleppinger said she also gave her students the option, for one of the class’s informal reflection papers, to ask ChatGPT to summarize one of their assigned books and then critique the summary it produced.

Kleppinger added that the University’s template for academic AI principles is “enough,” given how many different subjects and classes it must apply to, but that it took officials a “little while” to release the principles.

“It kind of felt like we were scrambling a little bit in January, but on the other hand, I also get it, they can’t develop an entire policy overnight,” Kleppinger said.

John Helveston, an assistant professor of engineering management and systems engineering, said he incorporates a few problems using ChatGPT into his students’ homework assignments to help them understand the coding problems they’re solving and thinks of ChatGPT as a “private tutor” for his students.

Helveston said he also hosts workshops on how to accelerate coding with AI and thinks more schools across GW should host similar workshops.

“When you leave here, all your colleagues in your first job you get are going to be using these tools,” Helveston said. “If you’re not better than everyone else at using them then you’re not gonna get the job, so I think it’s really silly to pretend that we shouldn’t use it at all in the name of academic dishonesty.”
