In the final months of the 2022-2023 school year, the AI writing tool ChatGPT was widely used by students to gain an advantage in their schoolwork. As the new school year started, AI continued to show its convenience and appeal to students. However, as helpful as it seems, it may be detrimental to students’ learning. Not only is the information provided by these AIs incorrect at times, it is also easily flagged by anti-cheating software. Teachers who catch a student using a program like ChatGPT will likely fail them, giving them a zero on the assignment or even for the quarter.
So why is ChatGPT so commonly used? The AI can give students a well-thought-out answer to almost any question in a matter of seconds, and it seems able to help with virtually any type of assignment. The major problem, however, is that ChatGPT will sometimes provide false information because it cannot actively browse the internet. The Register reported that 52% of ChatGPT’s answers to programming questions were incorrect and that 77% were too wordy. Even though many answers are wrong, many people, including students, still choose AI over doing the work in other ways because it is much quicker. This leads to incorrect assignments and lost points.
While students might get in trouble with individual teachers for using an AI like ChatGPT, there is also a school-wide policy banning its use. The 2023-2024 Pittsburgh Allderdice High School academic honesty policy, which all students are required to sign, states that “Using a natural language processing tool, such as Chat GPT, would also violate the academic honesty policy.” This official rule penalizes students who use these tools to cheat during school.
Many PPS students wondered why the district banned AIs in the first place. To many, it seemed like just another tool that could help students do things such as research or study for an upcoming test or project. However, several sources claim these bots can be detrimental to kids’ development and can close off future opportunities by preventing students from learning specific skills they will need in life. An article from Moonpreneur said that AI could “lead to a greater dependence on technology, potentially resulting in a decrease in critical thinking and problem-solving skills.” In other words, students could become dependent on technology and lose the ability to solve problems on their own, stunting their intellectual growth.
A sophomore at Allderdice who was caught using ChatGPT during the 2022-2023 school year said, “I thought that it would be helpful so I could finish early but I just ended up with a zero and in trouble.” This shows that many students use it to save time, since it can come up with an answer to almost anything in a matter of seconds.
On the other hand, Ms. Mazzacco, a teacher at Allderdice, said of students’ use of these AIs, “I tell kids not to do it because I can easily catch it. I know it is hard to prove, but the writing normally does not fit the skill of the student.” The AIs often produce writing well above the skill level of the students submitting it, making it obvious during grading when someone may have cheated.
Ms. Galloway Barr, another English teacher at Allderdice, said, “What I’ve had to do is require most work to be handwritten and turned in hard copy, without the ability to work on it outside of the classroom…I want to be sure that the work that I am getting is the actual work product of the child.” This is a straightforward solution to the problem. Galloway Barr also does a lot of project-based learning in her class so that students can show a deeper understanding of the topic, an approach she says “is more fun, and often more effective than tests or lengthy written responses.”
While many students see ChatGPT as an easy way to get quick answers for schoolwork or projects, many teachers say that its use is easily caught and will hurt students in the long run. Even as these programs become routine, many of the answers they provide will still be incorrect; some students may continue to use them to their advantage, while others will choose to stay away.