I remember watching a YouTube interview with a highly intelligent and observant entrepreneur, who cheerfully predicted that the time would come when AI programmes would replace teachers, rendering their jobs obsolete. The commentator in question was an enthusiastic advocate of personal and economic freedom and a vocal critic of the excessive incursions of State agencies into our personal lives. Yet, for some reason, he seemed relatively unconcerned at the prospect of machines teaching our children.
Of course, there are tasks that most people would happily relegate to AI programmes, to the benefit of humanity, such as certain forms of tedious clerical work, a large share of manual labour, and the synthesis of unwieldy amounts of data. However, there are other tasks that cannot be delegated to a machine without endangering invaluable dimensions of our lives as human beings.
One of those tasks is teaching and learning, through which people learn to think, interpret the world, construct rational arguments, assess evidence, make reasoned and holistic choices, and reflect on the meaning of their lives. For better or for worse, teachers, from kindergarten right up to university level, form the minds of the next generation. The formation of the mind relies on apprenticeship, imitation of a worthy model, and intellectual practice and training.
Much as an athlete fine-tunes his motor skills and muscle memory by playing sport, and finds inspiration in an exemplary athlete, the student fine-tunes his mental skills by thinking, reflecting, studying, analysing, and generating ideas and arguments, in dialogue with an inspiring teacher. Human learning has both an interpersonal and a “hands-on” dimension, and both are indispensable.
Yet Artificial Intelligence is reaching the point where it can automate and mechanise certain aspects of teaching and learning, marginalising crucial parts of the learning process, most notably the way a teacher models intellectual activity for the student, and the intellectual tasks a teacher assigns to students in order to fine-tune their mental skills and imagination. Many tasks which, just a few years ago, had to be undertaken “manually,” by which I mean through the laborious activity, imagination, and effort of a human being, can now be performed automatically by AI.
When I wrote papers for my university degrees, I had to wade through texts, synthesise their content, and build an argument from scratch, using my own mind. Now, AI technology is tantalisingly close to being able to create a research paper from scratch, with a few prompts and sources provided by the user.
The end product, say, a paper or reflection churned out by AI, may look very similar, or even largely identical, to the product of a non-AI-led writing process. But this “product” is generated largely by feeding AI the right prompts, not by working the creative and analytic muscles of the mind, or doing the mental “heavy lifting” required to drill into a problem or take one’s intelligence and imagination to the next level.
This makes traditional teaching tools, such as the graded take-home paper, largely obsolete, because realistically, in a competitive environment, many students will not deprive themselves of the advantages of AI in the creation of graded work.
Even if a teacher encourages or requires students to write a paper without the assistance of AI, there is no reliable way to police such a requirement outside the classroom, and it seems unfair for conscientious students to be outperformed by students of a more “pragmatic” bent who “milk” AI for all it’s worth.
This means that the whole teaching and learning process, including the evaluation of student work, will have to be re-conceived for a cohort of students increasingly comfortable using AI technologies. If teachers truly believe in the importance of a learning process that stretches and trains the intellectual abilities of the student and is not usurped at every turn by AI “shortcuts,” then they - we - will have to find new approaches to student assignments and evaluation.
These might include a greater emphasis on oral assessment, a shift to longer supervised technology-free exams, or a move to ungraded writing assignments, in which students might be more willing to forgo the competitive advantage of AI if persuaded of the value of rising to an intellectual challenge.
There is a lot of concern expressed, understandably, about the prospect of mass unemployment as many tasks currently assigned to human beings are relegated to AI programmes. But we should not forget that one of the greatest risks of AI technology may be a degradation of the learning process itself, and thus a new intellectual dark age. It is up to teachers and teaching institutions to do all they can to avert such a catastrophic outcome.
This is another warning of the dangers of AI taking over too many human functions in society. Yes, it is important to model proper behaviors and intellectual activities if we want our students to become more than AI prompters. Original thinking and synthesis are especially difficult for people who do not practice them, and neglecting that practice precludes the discovery of new things that only human creativity can bring about. AI is nothing more, at this point, than a vast statistical language model trained to associate one word with the next. It will be an economic disaster if entrepreneurs are not taught how to think originally and synthesize facts in ways that make their processes economically valuable to consumers. Do you think AI can form this type of individual?