Angles: Should ChatGPT be considered plagiarism?

Technology has become an integral part of education over the past few decades.

Now, it is hard to imagine a class that doesn’t require the use of a computer or the internet.

The introduction of ChatGPT by OpenAI, however, is sparking debates among educators.

ChatGPT is an artificial intelligence program developed by OpenAI that can produce pieces of writing based on user inputs. The program uses large amounts of language data to produce believable-sounding writing in a wide range of styles, from poetry to journalism.

There are limits, of course. It can’t access the internet, so it can’t really do its own research, and OpenAI has acknowledged that the program sometimes produces responses that contain factual errors or are otherwise nonsensical.

The use of ChatGPT in an educational setting is, for good reason, controversial, and the question of whether the use of ChatGPT is plagiarism is still very much up for debate.

AI provides educational opportunities

By: Slater Dixon

ChatGPT is, understandably, freaking out a lot of educators. Though many have folded the technology into existing courses and syllabi, others worry that our systems for producing and communicating knowledge will be fundamentally broken. Those fears underpin the most extreme stances towards the technology, including the stance that any use of ChatGPT is categorically plagiarism.

This position is obviously suspect because of the good-faith applications of the technology. A student who submits an entire essay copied from ChatGPT is intuitively committing plagiarism. But there are several other use cases that aren’t so obvious.

For example, ChatGPT can generate an essay outline given a thesis statement. It can rewrite a paragraph to be more concise or with a different tone. It excels as a reverse dictionary. And for full-length essays, it provides a mundane “meets expectations” product that helps me realize the dullness of my own draft.

Are these applications inherently plagiarism? How do educators decide?

Given the myriad good-faith uses of ChatGPT, categorically opposing the technology as “plagiarism” is untenable. But this stance also misunderstands the value of education.

ChatGPT excels at processing language, but regularly messes up basic facts, invents academic literature and fails to provide sources. Its writing is polished and articulate, but the information is often completely wrong. That doesn’t mean it should be banned from the classroom — it means educators need to be clear about the importance of vetting information, especially when using tools like ChatGPT. And as massive quantities of AI-generated noise become mundane, critical thinking will be more important than ever.

Is the purpose of education to inculcate students with information and help them sound smarter in essays? Maybe, but it’s also to build intuition for systems and processes, encourage a genuine curiosity about the world and help create value. There are ways of testing those skills that can’t be faked with a chatbot.

This is not to say that ChatGPT-like technologies won’t break things. Professors will undoubtedly receive full essays copy-and-pasted from ChatGPT — they probably already have and can probably tell. It’s easy enough to dismiss these situations as plagiarism and move on. However, this blithe approach conveniently avoids a deeper reckoning with the effectiveness of our current systems. It doesn’t just ignore the value of traditional education methods, but also their shortcomings.

Discussion boards and posts appear especially prone to dishonesty in the AI era. The bot is great at 400-word summaries of texts and basic observations. It easily beats the average quality of my posts, which are usually hastily written after skimming the readings. But is this a startling example of the existential threat posed by AI, or an indictment of my personal work ethic?

Using ChatGPT reveals uncomfortable truths about my own learning habits, and could do the same on a larger scale with bigger systems.

I often treat my brain like a meat-powered ChatGPT, spitting out the first coherent thought that comes to mind. What results is definitely a series of words in proper American English, but they don’t reflect the type of serious critical thinking that leads to interesting ideas.

Large language models, like the one that powers ChatGPT, are the type of technology that triggers strong reactions, and drawing lines in the sand is part of that process. However, where we draw those lines is important. While ChatGPT presents real academic integrity challenges, it also provides a unique opportunity to explore new ways of thinking and learning. By dismissing any use of the technology as plagiarism, we’re ignoring that opportunity.

AI in education constitutes plagiarism

By: Andrew Kronaizl

In recent decades, advances in technology have constantly influenced the field of education.

While any new technology raises issues, this influence is usually for the better. Think of how different the classroom dynamic was before laptops, Google, Wikipedia and the like. I’m grateful for how streamlined these improvements have made the learning process.

Since its launch in late 2022, ChatGPT has presented a similar mixed bag of possibilities, although the pros and cons have been murkier. While AI provides a buffet of benefits for learning, it raises serious concerns about academic integrity. Given that ChatGPT generates content through hidden processes with little human involvement, academic use of the program should fall under the umbrella of plagiarism.

One of ChatGPT’s biggest selling points is that the content it creates is unique, which makes it seem like it should be safe from plagiarism scrutiny. In an article for The Guardian, technology editor Alex Hern writes that “the output of ChatGPT hasn’t triggered any conventional plagiarism detectors up to this point, since the text it produces hasn’t been written before.”

Looking at plagiarism from a very literal standpoint, users could argue that since ChatGPT creates brand-new content — whether that be an essay, code or any other academic work — it’s not plagiarizing anything. In reality, this line of thinking poses a dire threat to academic integrity, and the root issue is the lack of input from the user.

If a student were to take someone else’s assignment and pass it off as their own, that would be a clear violation of plagiarism policy. Someone else put in the effort, and the student tried to keep that hidden. With AI-generated content, the situation is the same. While a program completed the assignment rather than a person, the lack of effort and resulting deception is still a breach of honesty.

Another threat to academic integrity lies in how ChatGPT generates its info. Even if a student wanted to be involved in working on an assignment, ChatGPT is a machine-learning program whose inner workings are kept secret. As a result, it actively prevents users from participating in the academic process. Due to its nature as AI, any resulting work can be riddled with uncertainty, and that uncertainty makes it hard to be sure of the integrity of the work.

This is in sharp contrast to an academic tool like Wikipedia. While copying and pasting a solid paragraph from a Wikipedia article is clearly plagiarism, the website provides more opportunities for research. Users can find a section they’re interested in, scroll to the sources at the bottom, then springboard off of those. In this case, students are using a resource while still being involved in the process.

As for ChatGPT, the presence of AI pushes the student out of multiple stages of the process. Even if a student wanted to use the program solely as a tool, the information it provides is ambiguous and should be treated with caution, if not avoided outright. According to OpenAI’s website, the program “cannot access the internet, search engines, databases or any other sources of information outside of the current chat. It cannot verify facts [or] provide references.”

Ultimately, time might change education’s viewpoint on AI tools like ChatGPT, much like it did to other technologies before. Perhaps one of the best things that will come out of educators grappling with this new technology — and something the current conversation around AI is already sparking — is the reassessment of what plagiarism entails in our rapidly evolving landscape.