Hey {{ first_name | human }},

This week we have more research on ChatGPT use for learning, plus a look at a new unintended consequence, AI detection anxiety, being felt across higher education.

TL;DR: The 60-second briefing

⚡️Sovereign AI: The UK announces £500 million to fund British AI.

🧪The Trust Gap: 75% of students face "AI detection anxiety".

🧪The Crutch: Students who studied with ChatGPT scored significantly lower (57.5% vs 68.5%) on a surprise retention test in a randomised controlled trial.

🚨Copilot Cowork: AI now lives inside your shared Word docs and Whiteboards as a real-time collaborator.

📚 AI+education news

🧪The Trust Gap > What it is: The 2026 Higher Education Student Wellbeing Report reveals 75% of students are stressed about "false positive" cheating flags. This is significantly higher for international students (81%).

  • Why this matters: Schools are regularly sold ‘AI detectors’ to catch AI use. However, studies have consistently found these detectors to be unreliable, to the point that, if an accused student challenged a detection flag, I would expect the challenge to succeed.

  • Do this next: For assignments, consider using an AI Assessment Scale, which makes explicit the extent to which AI may be used at each level.

🧪The Crutch > What it is: A new randomised controlled trial (n=120) found that students using ChatGPT as a study aid scored significantly lower (57.5%) on a surprise retention test 45 days later compared to those using traditional methods (68.5%).

  • Why this matters: It confirms the "Cognitive Offloading" effect. When the AI does the heavy lifting of summarising or explaining, the student’s brain skips the effortful processing (encoding) needed for durable memory. In short: Easier to learn = Quicker to forget.

Would you let some form of AI give you feedback on your teaching?


🌍 Wider AI updates

⚡️Sovereign AI > What it is: A new website for Sovereign AI describes it as “the UK’s £500 million fund for artificial intelligence” and says it is “funded by the UK Government”. The site frames the initiative as part of a wider national story about British innovation, referencing Ada Lovelace, Alan Turing, the World Wide Web, and AlphaFold, alongside claims that the UK has “the largest AI sector in Europe”.

🚨Copilot Cowork > What it is: A new Microsoft 365 AI experience designed to handle multi-step work across tools like Outlook, Teams, Excel and files, with users approving key actions along the way. Microsoft presents Cowork as more than a chatbot: it can help manage calendars, prepare meeting materials, research topics using workplace and web data, and create assets such as decks and analyses. It is currently in Research Preview, with wider access via Microsoft’s Frontier programme later in March.

  • Why it matters: This signals a shift from AI that helps with tasks to AI that helps coordinate workflows. For schools and trusts, the real test will be whether it reduces admin without increasing checking and oversight, adding risk, or breaking the bank.

🎯Prompt/Tip: Cold recall

If you are aware that pupils are using AI for tasks set outside the classroom, encourage them to spend the first 10 minutes performing "Cold Recall."

Students should use pen and paper (or a blank document with WiFi off) to write down everything they already know, remember, or can brainstorm about the topic. Only after this 10-minute struggle are they allowed to use AI to compare, critique, or expand on their own "first-draft thinking."

Follow this up in a later lesson with questions based on the task, to check whether pupils have actually learnt anything.

’Til next week.

Mr A 🦾

Help a colleague save time by sharing this newsletter; distributing these ideas helps a friend get home on time and keeps our energy focused on what matters most: great teaching.

Safety & Privacy Notice

The tools and workflows mentioned are intended for professional productivity and educational enhancement. Users must ensure that any AI implementation remains compliant with their local data protection regulations and institutional safeguarding policies.

  • Data Privacy: Do not enter personally identifiable information (PII), sensitive student records, or confidential institutional data into public AI models.

  • Verification Required: AI-generated content can be inaccurate, biased, or out of date. Always maintain a "human-in-the-loop" approach by reviewing and fact-checking all outputs before use.

  • Professional Judgement: These suggestions do not substitute for formal legal, clinical, or safeguarding advice. Final responsibility for accuracy and appropriateness remains with the professional user.
