Midweek meanderings #2
AI policy thoughts, persuading colleagues to use AI, examples of AI use, a blast from the past
Greetings!
This issue is mainly about the nuts and bolts of getting schools and teachers to adopt AI.
AI Policy thoughts
I keep coming across articles and research about schools’ AI policies — or the lack of them. It seems to me that we’ve been here before, with policies about teachers’ and departments’ use of technology, and e-safety. There is a familiar pattern:
Someone realises that everything is a bit ad hoc or completely absent, and expresses the need for someone to draft a policy about it.
That “someone” is either the head of computing, ed tech co-ordinator or similar, or the IT technician.
Alternatively, a template policy is downloaded from the internet, and the school puts its name in the appropriate slot.
The policy is then either distributed to all members of staff, or announced in a school bulletin.
It is then filed in the Principal’s Office, the IT technician’s office, or numerous waste paper baskets.
Job done, another box is ticked.
Yes, I am cynical, but tell me I’m wrong. In one of my jobs as Head of Computing, the Ofsted person assigned to me asked me what I thought of the school’s Equal Opportunities Policy. I replied that unfortunately I hadn’t had a chance to read it, since it had only appeared in my pigeon-hole the previous day. He said nothing, but a wry smile flickered across his face. He knew what was going on; he could see right through the bullshit. Why would I wish to be seen to be party to this rubbish?
When I was working as an ICT advisor, the government of the day announced that schools which didn’t have an e-safety policy would be denied technology-designated funding from the government. Many of the schools in the district I worked in had such a policy; several didn’t.
My boss: We have a pro-forma e-safety policy. If you get those schools without a policy to put their name in it and sign it, we can approve the funding.
Me: Well, surely we should withhold the funding until they actually do something themselves?
My boss: Are you going to be the person explaining to a group of ferocious headteachers why they are not getting the funding?
The next year I was working for the Qualifications and Curriculum Authority, and we had a multi-agency meeting about e-safety in schools. A young man from the department of education thumped his fist on the table and declared: We need to make sure that only those schools with an e-safety policy in place get the funding. His older colleague said: And who is going to do that? There are 30,000 schools in the country, and only you and me in the office.
It was at that point I realised that all the announcements and initiatives emanating from the department for education were smoke and mirrors: there was nothing behind the curtain! Maybe it has all changed now, but I would need some convincing.
Back to the issue of schools’ AI policies. Imposing one from above never really works, in my experience. Setting up a committee produces a talking shop where nothing gets done, or is like an elephant giving birth: it happens at a high level, with a lot of noise, and takes two years to show results.
I’m inclined to the view that what senior leadership teams should do is have what I call a very thin policy, or baseline, and then allow each teacher and area in the school to build on that as they wish.
For instance, you might stipulate that if AI is used in the production of a scheme of work, say, that fact should be stated somewhere. That would set a good example, I think. The policy document might also state that AI shouldn’t be used to produce entire documents which are then passed off as the teacher’s own work.
There is a more fundamental issue I think: are teachers actually using AI, and if not, why not?
But so what? Well, I think Andrew Ng, the co-founder of Google Brain, was probably correct when he said, “AI won’t replace people, but maybe people that use AI will replace people that don’t.” That would apply to teachers too, in my opinion.
To be implemented and to mean anything, any school policy must:
meet genuine needs;
be easy to implement.
Meeting genuine needs
By “genuine” I mean real, not doing something in order to tick a box or satisfy some artificial requirement that benefits nobody. When it became feasible to have computers in classrooms, there were some headteachers who would walk around the school to check whether the computers were on, as if they thought kids would learn stuff through a process of osmosis. The same thing happened when classrooms started to acquire interactive whiteboards. A speaker from Ofsted went even further: he told the assembled group of advisors on a training day that if computers weren’t available then the teacher should just mention them. I asked the then head of ICT at Ofsted if that was indeed the official Ofsted stance, and he shook his head and said “They have had so much training. No, it is absolutely not Ofsted’s official policy.”
So what would count as a genuine need? Something that will make the teacher’s job easier, more efficient or more effective. That means that teachers need to know about what they could do with AI, and I don’t mean giving them a long list of ideas. If something looks overwhelming, it will be ignored.
I’ve included below a few things I’ve tried out, which may give you some ideas for whetting teachers’ appetites.
Be easy to implement
I don’t believe that giving people a long list of prompts, each of which is half a page long, is useful. It’s too complicated. Teachers are intelligent people. If a prompt doesn’t give them quite what they want, they can refine it through an iterative process. Some AI programmes, like Google’s NotebookLM and ChatGPT, make that process dead easy by suggesting further avenues of exploration — in effect, prompts — themselves.
There are other things you can do to help teachers implement AI if they can see some potential benefits of doing so, but as this newsletter is starting to look like War and Peace I will come back to this another time.
Three examples of AI-assisted analysis
Example 1: Spreadsheet analysis
I uploaded an assessment spreadsheet to Claude.AI and prompted it as follows:
Analyse the attached spreadsheet, taking into account the number of homeworks each student has handed in, and my comments on their work. Which students are falling behind and need further support? Which ones are excelling? What is the overall level of success of this course?
Here’s what it answered:
Here’s the picture across the class.
The excelling students are Bogdan (100%), Adaeze (91%), and Isabella (82%). All three submit consistently throughout the course, and your comments on their work are enthusiastic — praise for inventiveness, wordplay, structural sophistication, and memorable individual lines. Bogdan in particular receives the greatest volume of detailed feedback and seems to have thrived on the variety of prompts.
Solid but uneven is the middle tier: Callum and Idris (both around 64%), Katarzyna (64%), and Aleksei (55%). These students write well when they do submit — your comments are warmly positive across the board — but they have noticeable gaps, often in the middle third of the course. Aleksei is worth watching: he was a returning student, very enthusiastic at enrolment, strong early on, then drops off sharply after session 5 with no apparent explanation.
Students needing active support are Eleanor, Diego, Siobhan, Omar, and Layla — all between 28–46%. The pattern here is telling: almost all of them have good comments when they do submit, suggesting the issue is engagement rather than ability. Eleanor stops entirely after session 5 (and ADHD is noted at enrolment — the prose challenge you set may have been a moment worth revisiting). Layla has an unusual pattern: nothing for six sessions, then three in a row, then nothing. Omar was active for the first three sessions and then went completely silent — worth checking in, given his autism disclosure and the potential for social anxiety around feedback.
High risk are James, Mateo, Rafael (all 0%), Fatima, and Valentina (both 10%). James is the most urgent case: he submitted nothing, received no comments, and explicitly flagged pre-existing anxiety about online learning at enrolment, including a prior withdrawal. He may not have made it past the first session at all. Mateo is similarly a blank — no submissions, no comments, despite genuine prior experience. Rafael is a returning student who submitted nothing this time, which is a marked change. Valentina mentioned a transatlantic time difference and only submitted once; it may simply be that the live sessions weren’t viable for her schedule.
Overall course health: the mean completion rate across the class is around 53%, but that figure is dragged down heavily by the five students who submitted very little or nothing. Among the twelve students who submitted at least one piece of homework, your comments are consistently warm, specific, and encouraging — there’s no quality problem with the writing you’re actually seeing. The issue is primarily drop-off and non-engagement, particularly in the second half of the course. The cohort that is engaged appears to be thriving; the concern is the roughly 35% of students who may have drifted away silently.
Please note: all the names and data were fictitious.
Now, I could see all that for myself, but the big difference was that Claude did it in about 30 seconds. When I was Head of Computing I created a spreadsheet full of exquisite formulae and wonderful macros that would provide me with pretty decent feedback and alerts for the entire Year group — 300 students. But it took me time to set the spreadsheet up, and more time to write up the big-picture summaries. Claude would have done it in a couple of minutes.
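For readers curious about what that kind of spreadsheet logic boils down to, here is a minimal sketch in Python. All the names, numbers, and banding thresholds are invented for illustration; the real spreadsheet (and Claude’s analysis) weighed up comments as well as submission counts.

```python
# Minimal sketch: band students by homework completion rate.
# Names, data, and thresholds are all invented for illustration.

TOTAL_HOMEWORKS = 11  # assumed number of homeworks set in the course

submissions = {  # fictitious student -> number of homeworks handed in
    "Bogdan": 11,
    "Adaeze": 10,
    "Aleksei": 6,
    "Eleanor": 4,
    "James": 0,
}

def classify(handed_in, total=TOTAL_HOMEWORKS):
    """Bucket a student by completion rate (thresholds are arbitrary)."""
    rate = handed_in / total
    if rate >= 0.8:
        return "excelling"
    if rate >= 0.5:
        return "solid but uneven"
    if rate >= 0.2:
        return "needs active support"
    return "high risk"

report = {name: classify(n) for name, n in submissions.items()}
for name, band in report.items():
    print(f"{name}: {band}")
```

The point is not that teachers should write code; it is that this is the sort of rule-based alerting I once built by hand in formulae and macros, which an AI assistant can now improvise from the raw data in seconds.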
Example 2: Blog post research
I was recently approached by a college to write a blog post about what they offered. They provided me with data and several documents, and answered lots of questions through email. I put the whole lot into NotebookLM, and that enabled me to get a really good overview very quickly — again, just a matter of minutes. I then had enough at my fingertips to focus on the writing. The alternative would have been to flip backwards and forwards through disparate documents, trying not to lose the plot. Basically, the AI helped me see the wood for the trees.
Example 3: Music analysis
As an experiment, I uploaded a music score to NotebookLM, and asked it to analyse it for me. This is just a part of what it came up with:
The provided document is a musical score written specifically for the alto saxophone, featuring a piece titled “Can’t Dance“ by the renowned jazz musician Joshua Redman. It details a complex harmonic structure through various chord symbols, including dominant seventh and minor seventh variations that guide the performer’s improvisation. The notation uses articulation marks and specific rhythmic cues to indicate a sophisticated swing or funk feel. Structurally, the composition is organised into distinct alphabetical sections that outline the progression of the melody and solo portions. Furthermore, the presence of frequent time signature changes highlights the rhythmic intricacy and technical demand of this contemporary jazz work.
Based on the provided musical excerpts for the alto saxophone, the overall structure and form of Joshua Redman’s “Can’t Dance” can be described through its clearly defined sections, shifting meters, and harmonic cycles.
I am hoping that will help me when I try to tackle the tune myself, because I was away from the lesson when they covered it.
A photo that has nothing to do with technology, except that it was taken with a camera
Finally…
I’ve uploaded to the Digital Education Supplement area the issue of Computers in Classrooms (the predecessor of Digital Education) which I published in April 2006. It has some interesting bits and pieces, including a discussion that I think is very much still apposite: the possible unintended consequences of implementing new technology.
I hope you have found this newsletter interesting and useful. Thanks for reading.