Following the frenzied interest in ChatGPT, OpenAI's highly regarded chatbot, concerns have arisen over academic misconduct enabled by such AI tools. To curb cheating, some Chinese social sciences journals have released instructions on the use of artificial intelligence (AI) writing tools, requiring authors who use such tools to acknowledge and explain the usage, or their papers will be rejected or withdrawn.
Rather than simply banning the tools, using them in a rational and scientific way can free scholars, teachers and students from repetitive and thoughtless work so that they can devote themselves to complex learning and creativity, some education insiders believe.
On Friday, the Jinan Journal, included in the Chinese Social Sciences Citation Index (CSSCI), announced that large language model tools (e.g. ChatGPT) will not be accepted as individual authors or co-authors of papers. If such tools have been used in composing a paper, the author should disclose the usage separately in the paper, explain in detail how the tools were used, and demonstrate the author's own creativity in the work. Any hidden use of the tools will result in the article being directly rejected or withdrawn, read the announcement.
Also, the Journal of Tianjin Normal University (Elementary Education Edition) on Saturday published an announcement advocating the rational use of new tools and technologies, and suggesting that authors explain their use of AI writing tools (such as ChatGPT) in references, acknowledgments and other texts.
The journal noted that it will strengthen the review of academic papers, resolutely resist academic misconduct, and pursue scientific, accurate, complete and innovative basic education research.
To battle plagiarism, some schools in the US have responded to ChatGPT by cracking down, according to the New York Times.
New York City public schools, for example, recently blocked ChatGPT access on school computers and networks, citing “concerns about negative impacts on student learning, and concerns regarding the safety and accuracy of content.” Schools in other cities, including Seattle, have also restricted access.
US online course provider Study.com recently asked 1,000 students over the age of 18 about the use of ChatGPT in the classroom. The responses were surprising. Around 89 percent said they had used it on homework. Some 48 percent confessed they had already used it to complete an at-home test or quiz. Over 50 percent said they used ChatGPT to write an essay, while 22 percent admitted to having asked ChatGPT for a paper outline.
While the chatbot is raising fears of academic cheating on school campuses, some education insiders believe introducing AI-enabled tools into academia could even transform traditional education.
For researchers, finding valuable scientific problems and solving them in creative ways come first; writing academic papers comes afterward. Some AI-enabled writing tools can generate and polish text, or even quickly produce long passages based on a few keywords, Zhu Wei, a professor from the China University of Political Science and Law, told the Global Times on Monday.
For example, ChatGPT can help writers gather reference materials for their papers, which not only increases efficiency but also allows researchers to devote more energy to creative work.
Whether using ChatGPT is appropriate depends on whether there is cognitive investment behind it. For students who merely use it to cope with homework or exams, Zhu said, this is putting the cart before the horse and should be restricted.
To prevent academic misconduct through the use of such tools, Zhu suggested introducing software to help educators identify text generated by ChatGPT.
Zhu believes schools should embrace ChatGPT as a teaching aid that could unlock creativity in students.
The core role of such tools is not to replace human thinking, but to enhance human thinking, Zhu said.