Artificial intelligence, or AI, was once the stuff of science fiction, but it has transitioned from far-flung fantasy to something real and accessible. Like any new technology, AI introduces exciting possibilities — but also understandable anxieties.
At universities, AI is dramatically improving productivity and introducing new ways for students to learn, but many students fear allegations of academic dishonesty. As AI becomes ingrained in the college experience, educators must be clear about what's allowed and what's crossing the line.
This article explores ways students can use AI in their coursework to enhance learning and productivity while maintaining good ethics. We asked Dr. Sudeep Sarkar, distinguished professor and launch dean of the Bellini College of Artificial Intelligence, Cybersecurity and Computing at USF, to help us navigate these ethical considerations and understand the role of AI in education.
This article centers on generative artificial intelligence, or genAI. As defined by our university,
“GenAI refers to technologies that can automatically generate new, original content and assets including text, images, audio, video, and computer code. These tools work by analyzing patterns in training data, building an understanding of structure and style, and using that knowledge to create novel, customized outputs that mimic the training data while introducing variation and new ideas.”
Well-known tools like ChatGPT or Grammarly are good examples.
Students at universities around the world are finding practical ways to use genAI in their coursework, and these uses are widely regarded as perfectly acceptable:
However, the last bullet point may trouble professors. GenAI can be an extremely helpful tool for writing, but how much help is too much? If a student asks AI to generate a new paper using just a few prompts and submits it as their own, what have they learned?
Writing plays a key role in higher education. It’s a process that requires students to comprehend a topic well enough to explain it, take a position, and defend their thesis with evidence. It’s both an exercise in critical thinking and the path to mastery of a subject. If ChatGPT does the work instead, something essential is lost.
Then again, people said calculators would eliminate math skills. And others worried that spell checkers would keep people from learning to spell or proofread (a somewhat more credible worry, though it hasn’t come true across the board). Technological advancements and anxiety go hand in hand — but people tend to adapt to the new status quo.
Dr. Sarkar has spent a lot of time considering this balance. “The AI space is evolving right now,” he explained, “and we have to come up with ways to enhance learning without replacing the thinking that goes on in the classroom.”
To that end, don’t make genAI write your paper for you. Instead, let it help you through the process:
The writing process is a continuum. You can choose to use genAI from start to finish or bring it in to assist at any point along the way. It’s up to you! The process is what matters, and students aren’t expected to go it alone. Before genAI, you could ask for help from teachers, peers, or the campus writing studio. Using genAI is just adding another coach to the team.
“At the end of the day, the student should own the synthesis, the citations, and the final argument that’s made,” Dr. Sarkar contended. “They should be able to defend their work. They are responsible for the overall product.”
That includes any errors.
GenAI seems like magic, but it has limitations. If it makes a mistake and you don’t catch it before submitting, you’ll be penalized. With this standard in mind, Dr. Sarkar urged students to beware of hallucinations, which produce unreliable information.
According to IBM, hallucinations occur when genAI “perceives patterns or objects that are nonexistent or imperceptible to human observers, creating outputs that are nonsensical or altogether inaccurate.”
“The language used sounds confident, so you tend to think it’s right,” Dr. Sarkar said. He suggested a few ways students can avoid pitfalls:
Remain skeptical, because genAI may just be delivering results that please you. Algorithmic bias exists, and small errors can ripple out to produce some really wacky results. If genAI fails these tests, “you should wonder why it’s not confident in its answers,” Dr. Sarkar concluded.
Dr. Sarkar was very clear about his ethical standard for genAI use: "Never, never submit AI output as your own work."
Let’s make that our North Star as we navigate some specific ethical concerns related to genAI.
If students are using genAI as a tool for schoolwork, some part of the deliverable is going to be AI-generated, whether it’s a kernel of an idea, an outline, or full lines of text.
Educators have been sniffing out plagiarism for ages. Have you ever submitted a paper through Turnitin? The software has been used for decades to analyze writing and give an “originality” score. Policies established by the school or individual teachers determine what scores constitute academic fraud. Quotes from other work must complement your own thoughts, not replace them, and you must properly cite each source.
Software like Turnitin is now being used to detect AI plagiarism, flagging recognizable argument structures, language patterns, and word choice, among other factors. Results are mixed: reports of false positives are common, and an unfounded accusation could be ruinous.
Here’s how you can reduce the likelihood that your writing gets flagged for plagiarism:
For other kinds of assignments, do as much as you can the hard way and consult genAI mostly for pointers and feedback. Again, track the way you use the tool throughout the process and be transparent.
Here’s an example: Dr. Sarkar requires students in his courses to disclose which parts of their final computer code are AI-generated and which are original work. "I grade them based on the part that they contributed,” he explained. “They might have used AI to ideate or whatever. That’s fine, but what is the final part that you did?"
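To make that kind of disclosure concrete, here is a minimal, hypothetical sketch of what an annotated submission might look like. The file name, functions, and comment labels below are invented for illustration; they are not Dr. Sarkar’s actual format, so follow whatever convention your instructor specifies.

```python
# submission.py: hypothetical example of disclosing AI assistance in a code submission.
# The AI-ASSISTED / ORIGINAL labels are illustrative, not an official or required format.
import csv


def load_grades(path):
    """Read a CSV with 'name' and 'score' columns into (name, score) tuples.

    AI-ASSISTED: first draft generated with a genAI tool, then reviewed,
    tested, and edited by me.
    """
    with open(path, newline="") as f:
        return [(row["name"], float(row["score"])) for row in csv.DictReader(f)]


def curve_scores(records, target_mean=80.0):
    """Shift every score so the class average lands on target_mean.

    ORIGINAL: designed and written entirely by me.
    """
    mean = sum(score for _, score in records) / len(records)
    offset = target_mean - mean
    return [(name, score + offset) for name, score in records]
```

Keeping labels like these as you go doubles as a record of your process, and it makes it straightforward for an instructor to grade the parts you contributed.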
If you are the victim of an AI plagiarism false positive, remain calm. It’s an indicator, not a verdict. It’s likely that your professor will reach out to discuss the issue, giving you an opportunity to defend your process. Show them you’ve been making a good faith effort to learn. Your professor will judge whether you should be penalized, and your honesty and effort will go a long way towards a favorable outcome.
Dr. Sarkar advised students to trust their conscience.
“I firmly believe everybody knows when they’re being dishonest,” he said. “If you think you’re crossing over into dishonesty, don’t do it.”
When you have doubts, ask your professor for clarification. And always, always check the course syllabus for an AI policy. Almost all instructors include them now. Some may not allow AI use at all! You must respect their policy.
If your school has an overall AI policy, you must abide by that, too. USF is a leader on this front, providing detailed guidance for genAI use by both students and instructors.
These policies will continue to evolve over time, and your feedback matters. Tell your professors how you’re using genAI and where you’ve drawn ethical lines for yourself.
Holding yourself accountable will help you get the most out of your college experience. You’re paying money to learn, after all. "If somebody is engaging in dishonest behavior, they’re cheating themselves,” Dr. Sarkar pointed out. “They might get an A, but if they don’t learn, they’ll fail in the job interview."
I was curious about genAI’s function as a writing assistant, so I gave it a try for this assignment. Microsoft Copilot seemed thrilled to be included: “That’s a great idea, Joni — using AI to write an article about using AI ethically is both meta and practical!” My thoughts exactly.
As a newbie, I focused on the basics. Here’s my disclosure:
I gave genAI the most time-consuming and potentially frustrating tasks, while I focused on developing my ideas. Its outputs weren’t perfect, but I know I’m not a perfect prompter. I’ll get better with practice. For now, I’m grateful that Copilot saved me time and lightened the load. My creative process felt more directed than usual, and words came more easily.
Importantly, none of these uses ever felt like I was verging on unethical activity. (With any luck, my boss agrees.) Copilot simply helped me write a better article in a little less time.
Emerging technologies like genAI have their place in higher education, even if we’re still struggling with the specifics. “We will figure out a way to integrate AI to enhance learning,” Dr. Sarkar contended. “We’re already starting to figure things out.”
He sees a future where teachers use AI to design and teach lessons. Students will practice for exams with the help of AI assistants and AI tutors. AI will be woven into traditional subjects, and each discipline will determine its own AI methodology. Assessment standards will adjust, too. “You’re going to see more oral exams, student presentations, and project artifacts,” Dr. Sarkar predicted.
Training students in the use of genAI is essential. Dr. Sarkar and his colleagues believe it is already a core competency across curricula. That’s why the Bellini College has begun offering microcredentials, artificial intelligence minors for non-technical degrees, and blended degrees. By enrolling in one of these paths, students in any discipline can gain proficiency in using and designing AI in the workplace.
"AI allows you to learn anything, anywhere, any time,” he observed, “and build anything, anywhere, any time. What AI is doing is accelerating not only our access to information, but the speed at which we become more intelligent and capable.”
In response, we should remain open to the possibilities while continuing to sharpen our critical thinking skills. We should acknowledge AI’s limitations as well as our own, holding both to a high standard of performance, reliability, and, of course, integrity.
"The best use of AI is when people use it to augment what we’re doing, not replace us," Dr. Sarkar said.
When you’re considering genAI use for an assignment, keep this checklist handy. It’s inspired by this detailed AI guide from the Mississippi Association of Educators and Dr. Sarkar’s suggestions.
The Bellini College of Artificial Intelligence, Cybersecurity and Computing opened Fall 2025 and is accepting students now. If you want to dive deeper into the topic of AI, including its foundations, functions, and ethics, the Bachelor of Science in Artificial Intelligence (BSAI) or a minor in AI could be the perfect way for you to prepare for an AI-driven future.
Start your journey today!