
Despite its convenience, educators are finding that AI is hindering the development of essential human skills in students
by FARAH SOLHI
HIGHER education institutions are observing a growing trend among students: Assignments that require critical thinking are increasingly being answered using artificial intelligence (AI)-powered chatbots. Since 2022, developers have introduced text-generating tools such as OpenAI’s ChatGPT, Google’s Gemini and Microsoft Copilot, aimed at making life easier.
These tools can perform tasks like home automation and personal assistance with just a few typed prompts — essentially enabling conversations with a virtual assistant.
However, despite their convenience, educators are finding that AI is hindering the development of essential human skills in students.
For Tunku Abdul Rahman University lecturer Haizren Mohd Esa, the appearance of specific patterns and the use of bullet points or repetitive words such as “delve,” “additionally” or “key descriptive” in her students’ essays are often telltale signs of AI use.
“These are common markers of ChatGPT-generated content. Unlike Turnitin, which is used to detect plagiarism, ChatGPT does not have a clear author, making it difficult to determine whether the content reflects the students’ own critical thinking or was generated by AI.
“ChatGPT provides students with complete answers to their questions and can generate content on virtually any topic. In contrast, tools like Grammarly only help identify grammatical errors and do not produce full responses to assignments,” she told The Malaysian Reserve (TMR).

I often convert written assignments into presentations or question-and-answer sessions, says Haizren (Pic courtesy of Haizren Mohd Esa)
With Turnitin, she can detect plagiarism from online sources, and if the similarity index exceeds 20%, students may face mark deductions.
However, the tool cannot detect the use of ChatGPT or other AI tools, and students are aware of this, which presents a challenge for educators.
With some universities lacking the tools or apps to detect AI-generated content, Haizren said lecturers can only remind students and trust them not to use such tools in their assignments.
For now, she said, this is the best educators can do, as it would be difficult to completely prevent the use of ChatGPT and similar tools.
Alternatively, lecturers could find their own ways around these tools by designing creative and innovative assignments to test students’ knowledge without the use of AI.
“For example, I often convert written assignments into presentations or question-and-answer sessions. This allows me to assess their knowledge directly, rather than relying on written work that may have been copied from AI tools. Practical assessments are also an effective way to evaluate students, as these cannot be completed using AI.
“That is just the way forward now. We must adapt our teaching methods and assessments to address AI usage. Since it is widely used, we need to find a way to manage and tolerate students’ use of these tools in a controlled manner by educating and encouraging them to not rely entirely on AI-generated content.”
If universities allow students to rely on text-generated chatbots for written assignments or to find solutions to practical issues, they risk undermining students’ critical thinking skills and jeopardising their personal development in the future, Haizren added.

Fatimah says increasing use of AI tools among university students emphasises the need for ethics (Pic courtesy of Fatimah Tajuddin)
For UOW Malaysia Communication and Creative Arts lecturer Dr Fatimah Tajuddin, the growing use of AI tools among university students highlights the need for one thing: Ethics.
This emphasis came after she tried to cross-check the references cited in her students’ written assignments, only to discover that the papers they cited did not exist.
“It was very unethical for me. Initially, I did not like students using AI tools to do their assignments and at some point, I was even appalled by some academics who are using ChatGPT to conduct their research.
“However, it is the norm these days, so it is up to me to understand the new reality that AI is being used everywhere and by many types of people,” she told TMR.
Fatimah now advocates for the ethical use of AI, in which users remain aware of how much they rely on AI, treating it as an assistant that provides structure or starting points before elaborating on the subject matter in their own words.
“Using 100% of AI would be like replacing your brain, which I do not like at all. We, as humans, still need to use our brain to structure our points in our own words.”
If it were up to her, she would put a stop to AI usage among her students altogether.
“But maybe that is impossible, so I have grown okay with students using AI. But then again, they must remember that AI is just to assist us, not to replace us,” Fatimah warned.

Text-generating AI tools won’t produce meaningful responses without prompts that include solid ideas to begin with (pic: Bloomberg)
Why Do Students Rely on AI Tools?
A university student, who wants to be known as Siti from Perak, said she regularly turns to AI tools in her studies, describing them as practical aids that simplify her academic workload and help her keep up with technology.
The 24-year-old engineering student said she often turns to ChatGPT to complete her essays, as it helps her to expand her ideas and explore new perspectives on various topics.
“I did not turn to such tools out of fear that I may not write well and eventually fail, but rather to help me better understand my essay topics.
“As an engineering student, it is important to have someone or something to bounce ideas off of. With ChatGPT and other text-generating tools, it feels like having a collaborative, shareable space to refine my thoughts and solve problems efficiently,” she told TMR.
Siti added that while AI tools help her develop her own thoughts, they do not replace the critical thinking process. She explained that these tools provide a basic foundation of ideas, but the responsibility for analysis and decision-making still lies with the user.
She noted that text-generating AI tools won’t produce meaningful responses without prompts that include solid ideas to begin with.
“You need to know what you want to learn or explore in order to solve problems or expand your ideas,” she added.
Meanwhile, Rahman, a student from a university in Kedah, said some AI tools are helpful in boosting his productivity.
He believes that those who know how to use AI tools effectively will benefit the most, as the impact of AI depends entirely on how it is used — contrary to the belief that AI will eventually replace human jobs.
“Using AI tools doesn’t guarantee a solution to the problem at hand. They help me explore ideas and occasionally offer useful insights, but not to the extent that I rely entirely on them,” he said.
“With the rise of the Internet, universities expect students to spend time researching existing studies to solve more advanced problems. This can be overwhelming, so AI tools like ChatGPT help guide our thinking and approach,” he added.
The second-year mechatronic engineering student added that AI can also provide ideas and suggestions in response to inquiries, and help users form logical hypotheses beyond academic contexts.

According to Murali, the key lies in how the usage of AI tools is embedded into the educational process (pic: TMR)
How Can Students Work Around AI?
Asia Pacific University (APU) deputy vice-chancellor Prof Dr Murali Raman told TMR that the use of AI tools among students, as raised by many lecturers, is not alarming, as it is impossible to avoid a technology that is becoming an integral part of daily life.
However, there are guidelines in place to help lecturers set boundaries for students regarding the use of AI in their assignments.
“Among the key points in these guidelines is the emphasis on academic integrity, clearly outlining what is acceptable and what is not in terms of AI use in academic work.
“Students must be educated about the ethical implications and potential consequences of improper AI usage. Some higher education institutions also employ AI-detection software to identify assignments that may have been completed using these tools,” he said.
Murali said APU enforces academic policies and regulations, which may include disciplinary actions for violations and non-compliance.
While AI tools can be valuable resources for generating ideas and enhancing writing, Murali said a holistic approach is still required when it comes to assignment design.
He emphasised the importance of educating students about the ethical use of AI and clearly communicating guidelines on what constitutes acceptable use in academic work.
“We also believe that assignments should be designed to encourage critical thinking and originality, making it less feasible to rely solely on AI-generated content. These are often referred to as ‘authentic assessments’,” he added.
By incorporating these aspects, APU can ensure that AI is used responsibly and constructively in academic settings.
Higher learning institutions cannot shy away from the use of AI in assignments, as doing so would be akin to asking someone to stop using the Internet. One way to address this, he reiterated, is to come up with a holistic approach that combines clear policies, regulations, and awareness programmes.
“Additionally, the use of relevant detection tools and, more importantly, the incorporation of authentic assessments are essential. This comprehensive strategy ensures that AI is used responsibly and constructively in academic settings,” he added.
It is a valid concern to view some cases of students using AI tools to complete their assignments as “taking the easy way out,” Murali said.
However, he said, the key lies in how the use of AI tools is embedded into the educational process.
Murali noted that if used ethically, AI can be a powerful tool to enhance learning rather than a crutch. Assignments and research should be more focused on solving real-world problems or be reflective in nature and, in the case of smaller cohorts, be driven by active class participation and engagement.
“Education as a whole should focus on character development and producing professionals who are competent and industry-ready.
“These days, industries too are leveraging AI on many fronts. It is therefore imperative that we train our students on how to maximise the value potential of AI, while ensuring its use remains responsible and ethical,” he said.
- This article first appeared in The Malaysian Reserve weekly print edition