Had I had too much on my plate at work, you could be reading an AI-generated article. When I asked ChatGPT to write a 1,200-word article on the pros and cons of using AI in the classroom, its response was formulaic, dispassionate and not particularly engaging – but it did the job.

In writing this, I tried to fox ChatGPT. But it happily wrote me a short story including a croissant, a zebra, and a kaleidoscope, and confidently discussed the ethics of the fictional croissant serving other croissants as snacks. It agreed that fiction can be an effective vehicle for moral discussion, and that magic was a good allegory for AI. It answered the question ‘how are you finding school?’ from perspectives as varied as Lucy Pevensie, Neville Longbottom, and Arthur Dent. It gave an excellent critique of utilitarianism. It seemed no question was too awkward, despite my most convoluted efforts.

Eventually, I simply asked: ‘When was the clock invented?’

ChatGPT claimed: ‘the first mechanical clock is usually credited to a European monk named Domescus, who is said to have built a clock in the 8th century AD.’ However, a quick Google search reveals there is no such person. Asked the same question repeatedly, ChatGPT variously claimed that the mechanical clock was invented in the early 13th century, then the 14th century, and that the earliest recorded mechanical clock is still at Salisbury Cathedral. While the cathedral does contain the oldest working mechanical clock, it wasn’t the first.

While fascinating, ChatGPT isn’t perfect – and could clearly lead any one of us down a rabbit hole of inaccuracy if we didn’t have our wits about us. So what does this technology mean for the future of education?

ChatGPT in the teaching and learning context

There are many potential benefits to ChatGPT in education; for example, it could save overworked and underpaid teachers hours by speeding up everyday administrative tasks, and even lesson planning.

It could also have a role as a form of assistive technology for students grappling with obstacles such as dyslexia and ADHD, which could in turn help increase the accessibility of education; particularly in higher education, students with such additional obstacles are often less likely to succeed due to inadequate support.

However, on the other side of the same coin, some experts have expressed concern about such students becoming dependent on technology like ChatGPT, which encourages them to effectively outsource solutions rather than think for themselves or develop other ways of coping with their disabilities. Additionally, we know that, despite their vast knowledge base, AI chatbots are fallible. Examples like mine above are commonplace; Google’s answer to ChatGPT, Bard, gave a wrong answer in its debut demonstration, claiming that the James Webb telescope took the very first picture of an exoplanet.

On top of this fallibility, as much as ChatGPT’s ability to gather huge amounts of information at speed makes it an excellent research tool, some experts are concerned that it will become the most effective tool for spreading disinformation yet devised. ChatGPT has even been found to fabricate highly plausible academic references; the challenge for teachers, then, may not only be spotting work written by AI, but equipping students with the tools to spot inaccurate content while conducting their research.

But for all its factual inaccuracy and moral indifference, ChatGPT has been shown to produce passing-grade answers even at university level. What are the implications of this for students’ learning, and what does it mean for our understanding of fair evaluation?

The social implications of AI in education

Research conducted by BCS, The Chartered Institute for IT, with its Computing at School (CAS) network of teachers found that 62% of those surveyed said chatbots like ChatGPT will make it harder to mark students’ work fairly.

The potential implications of this go deeper than who gets the gold star in class. The advent of ChatGPT-like services and the inevitable addition of ‘premium’ versions could serve to widen the existing socioeconomic disparity in schools. Affluent parents paying for access to higher quality AI services only adds to the laundry list of better resources and opportunities afforded to economically advantaged children, making it even easier for them to gain an upper hand over their peers. While this may not seem hugely important from assignment to assignment, in the big picture this could further entrench the existing chasm in access to the best schools and universities, and higher education as a whole.

Given this, it is concerning that the majority (56%) of the 124 computing teachers surveyed did not think their school had a plan to manage incoming use of ChatGPT by pupils. While 33% said early discussions had taken place, just 11% said a plan was being formed. Additionally, over three-quarters (78%) rated the general awareness of the capabilities of ChatGPT among colleagues at their school or college as ‘low’ or ‘very low’. Ironically, one of the biggest problems facing the successful integration of new technology into the education sector is a lack of education.

Fortunately, resources for teachers on ChatGPT are growing; for example, educator Evan Dunne created a comprehensive guide for teachers on how to regulate and interact with the uses of ChatGPT in the school environment. The guide provides everything from an introduction to using the technology and its capabilities to advice on potential issues such as spotting plagiarism in students’ work, as well as offering guidance on how to use it effectively as a planning, workflow management and collaborative learning tool. Accessible resources like this are a key part of the roadmap towards successfully integrating this new technology into the education sector as a positive rather than a negative development.


The advent of publicly accessible technologies like ChatGPT has many potential positive implications in the education sector, from assisting students with learning disabilities to saving teachers time on administrative tasks. However, as well as questions about technology’s role in widening socioeconomic disparities, issues of dependency, accuracy and academic integrity inevitably arise from its use – and BCS’ research suggests that teachers are unprepared for how to handle such technology in the classroom.

Though resources such as Dunne’s guide are becoming more available, there is a long way to go in building an effective approach to the role of ChatGPT in a teaching and learning context – and this leap forward in AI will doubtless have a huge impact on our lives.

Clearly, the role these new technologies might play in education and how teachers choose to interact with them will require careful consideration. The true consequences of the integration of AI into educational contexts will only really become apparent over time, and as ever with new technology we will have to adapt alongside its growth; the balance between its negative and positive potential will only become clear through trial and error. All we can say for certain right now is that though ChatGPT contains much that is apocryphal, or at least wildly inaccurate, it scores over older, more pedestrian technology in two important respects: it is new, and it is exciting.

All testing was done with ChatGPT V3.