AI disruption in higher education
Critical thinking can’t be outsourced to artificial intelligence, but is it possible to co-exist?
AI is transforming the way we teach, learn, review and conduct research. It promises to revolutionise research in higher education through writing assistants, literature reviews, data analysis and research recommendations. However, it also poses serious challenges to critical thinking, research integrity and research rigour.
The dramatic rise of AI writing assistants such as ChatGPT, Google Bard, Jasper, Copy.ai and Writer offers the ease and convenience of quick writing, but it seriously compromises the role of human output and critical thinking. Because these platforms tend to fabricate information, reason poorly and produce biased outputs, critical thinking is pivotal to broadening our thinking, identifying incorrect answers and improving the quality of the output. As educators, the onus is on us to communicate to our classrooms that these machines have not been designed for creative thinking, ingenuity or problem-solving. They cannot exercise judgement, identify falsified datasets or measure societal impact, no matter how good the prompt is or how nuanced the outputs appear.
Professor Shahriar Akter says AI tools should not be used to cut corners in the rigorous process of research.
In higher education, it has become a constant challenge to differentiate between original work and AI-generated content. Platforms like Keenious and Scite have become increasingly popular with students because they can recommend relevant papers, summarise articles, assess the reliability of studies and evaluate the context of citations. However, these tailored services and their skewed knowledge increasingly challenge the integrity of research. This is a watershed moment in the history of higher education, one that demands urgent attention to strict principles of research integrity in reporting and assessing AI-generated outputs.
The integration of AI into academic discovery also calls for establishing rigour in the research process. Students at all levels need to be trained in the ethical and responsible use of AI, the quality of data and methods, and the reproducibility of findings. The traditional research values that have taught us to carry out research methods scrupulously and meticulously must be upheld when evaluating the integrity and competence of an AI-driven research process. Although AI tools help to achieve research efficiency, they should not be used to cut corners in the rigorous process of research.
Students at all levels need to be trained in the ethical and responsible use of AI. Photo: Marvin Meyer, Unsplash
Critical thinking, research integrity and research rigour cannot be outsourced to AI. As AI evolves and becomes more sophisticated, we must learn how to co-create and co-exist with it. It is essential to approach every innovative AI platform with both scepticism and curiosity. Embracing our humanistic core, we must prepare the next generation to identify logical inconsistencies, skewed knowledge, bias, discrimination and recommendations that go against our ethical standards as humans.
Professor Shahriar Akter is a Professor of Digital Marketing Analytics & Innovation and Associate Dean of Research in the Faculty of Business and Law.
UOW academics exercise academic freedom by providing expert commentary, opinion and analysis on a range of ongoing social issues and current affairs. This expert commentary reflects the views of those individual academics and does not necessarily reflect the views or policy positions of the University of Wollongong.