
Can ChatGPT Play a Productive Role in Science Education?

Hong Kong: It has been nearly a year since ChatGPT exploded into the public consciousness, sparking excitement and anxiety in equal measure about the rise of artificial intelligence (AI).

This is precisely why, for most people, AI has come to mean large language models like ChatGPT. But even before such models were accessible to the public, AI was ubiquitous in modern life. Personalised social media feeds, automated spellcheckers and digital voice assistants like Siri are all powered by AI.

Those alarmed by the rise of ChatGPT feared that AI would eat into jobs or be misused to spread disinformation. Proponents of AI branded such people Luddites.

Since then, opinions have not become any less polarised. This became evident at a recent panel discussion at the Hong Kong Laureate Forum, hosted by the Shaw Foundation. During a talk on the role of AI in science education, some panellists were deeply sceptical that AI had any role to play at all, while others said it would democratise science.

Despite the polarised reactions, the two camps have one thing in common: nobody really knows where AI is headed, even in the near future. Our best bet at prediction, it appears, is through imperfect analogies.

Professor Andrew G. Cohen of the Hong Kong University of Science and Technology was of the opinion that ChatGPT would be about as disruptive as the electronic calculator. When calculators first became available, he said, many were concerned that they would discourage students from performing arithmetic and other mathematical operations, and that in the long run they would erode cognitive and problem-solving skills.

While this prediction has partially come true, Cohen said the benefits of calculators far outweigh the drawbacks. With menial calculations out of the way, students had the opportunity to engage with more complex mathematical concepts. What calculators did for arithmetic, large language models can do for language, the professor argued. The interactive nature of AI will allow students to engage with science in a way they have not been able to so far, he said.

Cohen accepted that ChatGPT is sometimes prone to producing ‘garbage’, but said it is for educators to find effective ways to use the tool. As an example, he said educators can ask a language model to behave like an 18-year-old science student. This could give them some insight into the kinds of questions a student might ask or which parts of a lecture may be difficult to grasp. For students, ChatGPT removes judgment, Cohen said: those who hesitate to pose questions to their teachers may instead prefer to ask ChatGPT.

Astrophysicist Victoria M. Kaspi agreed with Cohen’s view, saying AI’s computational skills have made processing big data sets easier. Used for the appropriate purposes, large language models can drive scientific understanding forward, she said.

Even the sceptics have their own imperfect analogies. Professor Reinhard Genzel, the 2020 Physics Nobel winner, compared learning science to learning music. Once a musician spends thousands of hours learning to play an instrument, they ‘feel’ the music rather than ‘play’ the music, he said. Similarly, someone who has spent hours doing menial calculations and problems will find it easier to ‘feel’ science. 

Professor Roger D. Blandford of Stanford University felt that the experience of ‘online education’ during the COVID-19 pandemic showed that a human instructor was indispensable to the process of learning.

He added that while ChatGPT allows students to bypass solving some problems, they also lose the ability to learn the system. Solving an integral is not about learning the formula but about learning to adapt, expand and organise, he said. In these matters, a “good book that is thought through and has a logical structure” works much better than AI or ChatGPT, Blandford argued.

Of course, another imperfect analogy that has often been deployed is to compare the rise of AI with the threat of nuclear holocaust at the peak of the Cold War. But as one character in the recent biopic of J. Robert Oppenheimer, the ‘father of the atomic bomb’, says, the genie cannot be put back in the bottle. ChatGPT and other generative models are here to stay. It is for scientists to decide how they will be used.
