This time last year, The Hatchet’s editorial board argued that generative artificial intelligence wasn’t the only cause of rising cheating. But the role of GAI on campus has changed since then, and GW’s policies and the way students approach it must change, too.
GAI has become more integral to our lives, and more concerns about its use have come to light, like its harm to the environment, data privacy and societal biases. Since ChatGPT's introduction in the fall of 2022, educators have increasingly used the tool to build lesson plans while students turn to it to aid assignments and studying — or to cheat.
GAI can of course be a helpful tool, but students are relying too much on it, from using it as a search engine to plugging essay prompts into it when writing papers. It's no longer about asking GAI to simplify a complex topic. This dependence devalues our own educational labor and disrespects the professors who take the time to give feedback on assignments that students copy-pasted in a matter of seconds. As GAI's hold on higher education continues to swell, GW shouldn't be shy about instituting stricter rules for its use. As for us students, our editorial board cautions against prioritizing convenience and ultimately cheating ourselves out of an education.
The University considers representing GAI-generated work as one's own to be cheating but not explicitly plagiarism. We agree that certain uses of GAI are appropriate — and even commonplace in tech — but we draw the line at using the program for written assignments. When students substitute GAI-written work for their own, that's plagiarism, and GW should treat it as such.
Even without a concrete, major change in GW's penalties and policies for GAI use, we shouldn't underestimate the impact of the University issuing a statement, or disseminating through professors, that officials will treat any use of GAI on written assignments as harshly as plagiarism, a violation that carries tangible academic punishments. A stricter and more direct message on the issue could reduce the overall number of academic integrity violations, easing the shortfall of faculty members who help adjudicate these cases. Fewer students will let a robot write their essays if they know it could imperil their academic future.
Beyond potential punishments, using a robot for written work puts our education at risk. We come to college to be challenged and to become critical thinkers. We're pushed to analyze, discuss and even dispute ideas and literature. So when students hand off those tasks to GAI, we are the ones losing out. The knowledge that costs us tens of thousands of dollars every school year goes unrealized. And if we let GAI do our classwork, we're granting our future employers permission to use GAI instead of hiring us.
We have to retain and treasure a sense of pride in our work as students. Cranking out that 20-page paper is exhausting, but when we get a good grade, it's worth it. When we deeply research something, it's evident that we've learned — we brainstorm more original ideas and remember the texts we've read for years to come. We don't want employers to hire us for the feeble skill of typing a few sentences into ChatGPT and copy-pasting the response into a Word document. It's time to accept that using GAI to write your essays, assignments and discussion posts is little better than brazenly copying an assignment from a classmate or website. That isn't your work, you're not learning and, truthfully, your grade isn't earned.
We also must consider our professors, who spent hours reading and researching to achieve their degrees, all without the help of tech tools like GAI. Before submitting a jumbled mess of vague arguments to Blackboard thanks to ChatGPT, think of the time that professors spent cramming at the library during their undergraduate and postgraduate careers.
The editorial board understands that GAI is here to stay — and so does GAI. When we asked ChatGPT if the University should ban GAI, the machine said the policy would be “extreme and counterproductive.”
“Universities have always adapted to new technologies — from calculators to the internet — and AI should be no different. What do you think?” ChatGPT said in response to our query. But children learn to manually add, subtract, multiply and divide before their teachers hand them graphing calculators. The more power you give GAI in your education, the more power you take away from yourself.
The editorial board consists of Hatchet staff members and operates separately from the newsroom. This week’s staff editorial was written by Opinions Editor Andrea Mendoza-Melchor, based on discussions with Contributing Culture Editor Caitlin Kitson, Research Assistant Carly Cavanaugh, Copy Editor Lindsay Larson, Culture Editor Nick Perkins and Sports Columnist Sydney Heise.