Learning To Use, Not Abuse, A.I.

Phillip Cozzi, Editor

“It’s not plagiarism. It’s just not their own work. It’s cheating of a different kind… ChatGPT just kind of like stashes stuff together and creates original quote-unquote content,” said Dr. Richard Wallace, an adjunct professor in Brookdale’s philosophy department, in response to the ease with which students can produce low-effort content for academic submission on the A.I. program ChatGPT.

“My understanding is that if you as an individual, you the same individual, at 10:30 in the morning hit ChatGPT with five or six keywords and get a response, OK? And you say, OK, fine. And at 10:35 that same morning if you do the exact same thing with ChatGPT, you’re going to get different results. It’s not going to be identical.”

Wallace’s distance-learning philosophy class is uniquely vulnerable to infiltration by ChatGPT and other A.I.-generated submissions. The tests are open-book, written exams that users of ChatGPT could attempt to cheat on by having the A.I. generate their submission. “These are students I don’t even see in the flesh, so it’s hard to ascertain anything about them personally, but it seems to me that it’s obvious when you have a discussion forum online each and every week, which they’re required to participate in, when all of a sudden the tone of their submissions change,” Wallace said.

Wallace is far from the only Brookdale professor concerned about A.I. as a burgeoning technology; watching students use the program to cheat has become a shared experience across college campuses nationwide. Stephen Fowler, the campus’ instructional designer, and Anthony Lorenzo Giachetti, the campus’ learning space specialist, have both been heading up the regulation and integration of A.I. into the learning systems at Brookdale, in order to prevent the low-effort cheating that has become synonymous with the program.

“So, I mean, you know, the art department has been dealing in their own way with A.I. and  students submitting A.I. generated work. They’ve been more open to this though because it could be the nature of the way that things are created and produced in art that’s a little different, but it is different though for the other departments here at the college,” Fowler said.

“When we’re looking at ChatGPT, you can’t use the same strategies as you would with plagiarism. It’s harder to identify, it’s easier for students to generate a lot of content from ChatGPT and submit that, and it would be very difficult for faculty to know if it’s GPT or not. We do have certain tools in place, and they have been flagging certain submissions as GPT.”

Fowler has been working to prepare the learning infrastructure at Brookdale for ChatGPT’s future iterations, as its creators continue to allude to more advanced features and options being available to the consumer in the near future.

“You can ask it to do so many things, and it does it very effectively. And a lot of times, the faculty would say, this is like an A-plus submission in some of my more advanced 200s classes, and this is giving me back something that I would expect from a B-plus or an A-minus type of submission. That’s what level this is at. So immediately we started to recognize the impact this is going to have this semester, and we’re already thinking about the summer, and the fall, and the spring, and how we’re going to help prepare the faculty to not only maintain integrity in their course, we don’t want a lot of students submitting GPT-generated content as their own, but also what we’re really starting to look at now is how to leverage GPT, how to teach students how to use it.”

Fowler pointed to many uses that ChatGPT has beyond pure content generation. He said the technology is too new and too expansive for an outright ban to work. Instead, Fowler is trying to incorporate the program into teaching strategies in order to make it more compatible with the goals of higher education.

“We want to prepare students to go out in the world, be able to figure things out with clarity, and be able to do their job effectively, to live a fulfilling life. So, we want to prepare them in every way we can. If this tool is going to be the tool that is going to be used in every workplace, well then, we also want to prepare them for that,” he said. “So yeah, on one hand we want to maintain integrity and make sure that students aren’t using this tool to cheat their own education. We want to teach the students how to use this tool so when they leave, they are not A.I. deficient, right?”

Fowler placed a lot of emphasis on using ChatGPT to foster the growth of critical thinking. Rather than letting the program do the critical thinking for them, students should use it in a way that hones their intellect, teaching them the right kinds of questions to ask.

Coming from a background in the college’s art department, Lorenzo Giachetti wants to foster the creative core he feels A.I. might be able to encourage in students. “One thing I’ve seen firsthand back when I was a tutor, assistant instructor in the art department here, was that newer generations who’ve grown up with YouTube tutorials for art and all these different tools…that my generation probably didn’t have, they’re coming here as better artists than I was and what I was doing at that age,” Giachetti said.

“They’re coming into, even as college, with a full portfolio ready to go because they’ve been learning from YouTube tutorials since they were 10, 12, 11-years-old. I feel like A.I. is going to expedite that process more. We might be seeing now a generation of students coming in that already have fully rendered short films, animations, just right out of high school, or even before they finished high school.”

Giachetti emphasized the usefulness of A.I. tools in creative and artistic pursuits, particularly for automating processes that would otherwise be labor- and time-intensive, like generating water physics in a video game or 3D render.

“I think the ideal way to use it is that at the end of the day, A.I. is a tool. It’s a guide. It can be used for good; it can be used for bad; it can be used to help you; it can be used to hinder you. You know for education the ideal way to use it would be to help move you forward,” Giachetti said.

Both Giachetti and Fowler are optimistic about the future of A.I. in schools, but the technology is so new that catching up and putting it to use in these ways is a learning curve for students and faculty alike. One sentiment about A.I. at Brookdale is universal: if you’re using it to bypass the process of learning, you won’t have grown or succeeded in the way you wanted to.

“I’m an old man at this point in time, O.K., so I have this value, if you’re asking someone or something else to do your work for you, you’re not academically applying yourself. You don’t belong. Pure and simple. That’s why for plagiarism, we’ll throw you out,” Wallace said.

Ethical and academic concerns are cropping up across campuses everywhere regarding the program. “There’s this old saying, with great power comes great responsibility. Automation is a perfect example of that. We can get a lot of things done with less effort. A lot of good things can be done with less effort. At the same time, the equal and opposite of that is we can get a lot of bad things done with very little effort. So, I would say as far as someone, if I were in Generation Z, I would be laser-focused on ethical standards for AI. I would be laser focused on what government is doing, what kind of legislation they’re looking at,” Fowler said.

The societal consequences, while weighing heavily on college faculty, might matter less to the companies behind these A.I. products, according to several professors.

“And the outcomes are secondary to the marketability of the product. We are living in a capitalistic society. That’s what the dollar dictates. Our health care system should make that very apparent. If you know anything about health care anywhere else in the world, you know that we spend the most amount of money for not the least amount of care, but for significantly less care than what we spend money for. So yeah, the dollar rules. And that’s what’s dangerous about the situation from what I understand,” Wallace said.

Dangerous or not, Brookdale staff are still committed to giving students a quality learning experience with or without A.I. Faculty still expect students to operate with integrity and not abuse these products.

“We want our students to be able to understand that A.I. can do all these things, but humans… we’re still the innovators, we’re the inventors. A.I. is not going to do that just yet. So, we have to have the original thought, we have to be able to use a tool like A.I., make it our own, and then be able to prove it,” Fowler said. “I don’t know if there’s anything that a company could do. I think that… so here’s a few ideas… like a statement of some sort. A heavy guide from companies tailored for our institutions. And higher ed institutions are starting to adopt language in the beginning of their syllabi to warn against A.I. cheating, similar to a plagiarism statement. We expect you as a student to conduct yourself with integrity and you are here because you want to learn, right? And if you’re cheating, you’re subverting that goal.”