
By Omwattie Kissoon

Artificial intelligence is reshaping education, sparking debates about its benefits, risks, and ethical implications. At Queensborough Community College (QCC), faculty, administrators, and students are grappling with AI’s role in learning, academic integrity, and instruction. As AI-powered tools become more accessible, QCC has begun implementing policies to guide their use, reflecting broader trends in higher education.
The Institutional Response: QCC’s AI Policies
In response to the increasing presence of AI in academic settings, QCC’s Special Committee on Pedagogical Response to Artificial Intelligence has taken significant steps toward regulating and supporting AI use. A key resolution mandates that each academic department establish its own AI policy, ensuring clarity on acceptable and unacceptable uses within specific disciplines.
“AI is not a one-size-fits-all tool,” said Professor Karan Puri from the mathematics department and a member of QCC’s committee for AI. “In some courses, it can enhance learning, while in others, it raises concerns about academic honesty. That’s why having department-specific policies is crucial.”
Additionally, the Office of Academic Affairs (OAA) has been tasked with providing ongoing guidance and support for faculty navigating AI-related challenges. This initiative acknowledges that many educators are still adjusting to AI’s rapid evolution and need institutional backing.
“One of the biggest issues is that faculty haven’t received much formal guidance on AI,” noted Dr. Christina Saindon, Assistant Professor in Communication, Theatre, & Media Production (CTMP) and also a member of QCC’s committee for AI. “It’s essential that the administration steps up to provide resources so instructors can integrate AI effectively—without compromising academic integrity.”
Beyond policy, QCC’s Center for Excellence in Teaching and Learning (CETL) has committed to offering professional development opportunities focused on AI literacy. Many faculty members recognize that understanding AI is necessary to leverage its benefits while mitigating risks.
“We can’t just ban AI and pretend it doesn’t exist,” said Professor Puri. “Instead, we should educate both students and faculty on how to use it responsibly.”
The Promise of AI in Education
Advocates of AI in education highlight its ability to personalize learning, provide instant feedback, and automate repetitive tasks. A Forbes article, “AI In The Classroom: Pros, Cons And The Role Of EdTech Companies” by Olufemi Shonubi, outlines how AI-powered tools like Knewton Alta and Carnegie Learning offer customized learning experiences by adapting to students’ strengths and weaknesses.
At QCC, some faculty members see AI as a valuable supplement to traditional instruction.
“AI can help struggling students by offering instant explanations and practice problems,” said Professor Puri. “For example, in math courses, AI-powered tutoring systems can reinforce concepts outside of class, giving students more opportunities to practice.”
Moreover, AI can alleviate some of the burdens on educators. Automated grading tools like Gradescope, whose slogan is “Deliver and Grade Your Assessments Anywhere,” can save instructors hours of work, allowing them to focus on lesson planning and individualized support.
“There are only so many hours in a day,” said Dr. Saindon. “If AI can handle some of the routine grading, it frees up time for meaningful student interactions.”
Some students also recognize AI’s potential as a learning aid. A student at QCC shared, “AI helps me simplify information. I use it for creating flashcards, practice tests, and study activities.” Another student said, “I use it for formatting essays and solving math problems, but not to generate full content.”
However, others remain skeptical about AI’s effectiveness in learning. One student noted, “AI doesn’t really explain things well. It gives answers, but not always the understanding behind them.”
Challenges and Concerns: Academic Integrity & Over-Reliance on AI
Despite its benefits, AI raises pressing concerns—particularly regarding academic honesty. Some faculty worry that AI tools like ChatGPT could be used for plagiarism or to complete assignments without genuine learning taking place.
“We’re already seeing cases where students rely too much on AI,” said Professor Puri. “If they’re just copying AI-generated responses without understanding the material, that’s a real problem.”
This concern aligns with QCC’s new AI policies, which emphasize department-specific guidelines for acceptable AI use. Instructors will be responsible for defining what constitutes misuse and setting clear consequences for violations.
Student perspectives on AI and academic integrity vary. One QCC student shared, “I was unsure at first about whether AI use was considered cheating. However, I believe it’s fine as long as you understand the content and use your own words.” Another student had a stricter view, stating, “I haven’t used AI for schoolwork because it’s considered plagiarism. Professors have made it clear that AI-generated work could result in serious consequences.”
Beyond integrity concerns, there is also the risk of diminished human interaction in education. AI tools, while efficient, cannot replicate the emotional support and mentorship that educators provide.
“Students don’t just learn from textbooks and lectures—they learn from human connections,” said Dr. Saindon. “If we rely too much on AI, we risk losing the personal engagement that makes education meaningful.”
Some students echo these concerns. One student said, “For me, AI creates more challenges because I learn better when hearing from a person.” Another student added, “AI can simplify concepts, but it doesn’t improve my actual writing skills.”
A Virginia Tech report, “The Good, the Bad and the Scary,” supports these concerns, warning that excessive AI use could weaken critical thinking skills and foster over-reliance on technology. The report highlights that while AI can assist with problem-solving, it does not replace the need for students to engage in deep learning. Relying too heavily on AI-generated answers may hinder students from developing their own analytical abilities, leaving them less prepared for complex problem-solving in real-world scenarios.
The report also raises concerns about bias in AI tools, noting that the data AI is trained on can sometimes reinforce existing inequalities. This means students who rely on AI for research may unknowingly receive biased or incomplete information, making it crucial for them to cross-check sources and verify accuracy.
“AI-generated content isn’t always reliable,” said Professor Puri. “It’s important for students to fact-check and develop their own understanding rather than just accepting AI’s answers as truth.”
The Road Ahead: Balancing Innovation and Ethics
Since QCC implemented these AI policies in Fall 2024, the conversation around AI in education has remained ongoing. Faculty members agree that while AI has the potential to enhance learning, it must be integrated thoughtfully and ethically.
“AI isn’t going away,” said Professor Puri. “The question isn’t whether we should use it, but how we should use it responsibly.”
With department-specific guidelines, faculty training, and administrative support, QCC aims to strike a balance—leveraging AI’s benefits while preserving academic integrity and meaningful student engagement.