I heard a story recently from my sister about the challenges facing schools and students with ChatGPT.
My sister has a friend who prefers to write her essays in Word before submitting them. Let’s call her “Sarah”. Sarah’s routine: copy the assignment into Word, write the essay, then paste the answer back into the online portal.
Recently, Sarah pasted an assignment into Word and found it peculiar: the assignment asked her to use the word “banana” at least ten times in her essay. She complied (although she thought it was odd), and then her teacher failed her, claiming ChatGPT had written the essay!
What happened? It turns out some teachers are employing clever tricks to catch students using AI for their essays. In this case, the online assignment text changes when copied outside the browser. By copying it out of the portal and into Word, Sarah inadvertently worked from a modified version meant to be seen only by students pasting the assignment into ChatGPT.
The teacher “knew” that any essay using the word “banana” ten times would be AI-generated, because that requirement would only show up if the assignment had been copied and pasted.
Do you believe Sarah?
Does the teacher believe her?
Yesterday, I spoke with a UW communications professor who required her students to use ChatGPT to draft, re-draft, and improve their assignments. She wanted the students to turn in the exchange they had with ChatGPT as well as the improved final product to show they truly engaged with the tools.
So some teachers are also making ChatGPT an integral part of the learning experience for their students.
But in both cases, the teacher wants proof of work from the student: either proof that the student used the new tools, or proof that they didn’t. Perhaps the new standard will be screen-recording your work with a tool like Loom, so you can produce the video if someone questions the authenticity of the work.
Sometimes it’s going to matter if an AI tool produced something (like when something is introduced as evidence in court), and sometimes it won’t (did I write this post, or did ChatGPT?). But more and more, we’re going to want to see proof of work so we can decide if it matters.
Interesting insights here. I took an online exam recently, and they had a tortured process to ensure I was showing my work and not using external tools, like AI, to find the answers. Their solution was to assign three online proctors to each test taker. I shared my screen and used two cameras, my laptop camera and my phone camera, to show my face and the room. The room could contain nothing, not even furniture. Because that is difficult to arrange, the proctors recommended taking the test in the bathroom: the smaller room was easier for them to see from edge to edge. Before AI, cheating still took time, which gave the three proctors a chance to detect it, because you had to look up the answer or do a Google search. With AI, they fear I can cut and paste the question on the screen into a prompt and get an instant, full, smart answer.
About incorporating technology into education: I remember having a heated debate with my third-grade teacher, Mr. Henning, about using a calculator. They had recently become much cheaper, so I argued we should be using them. We all had them at home, and some rich kids had them on their wrists. My final point was that my dad carried one in his pocket protector every day, so why shouldn't we use them? I lost that argument. But in fourth grade, my teacher embraced these pocket wizards, and I never did long division in my head again. Thank you, Ms. Doxey! Our kids need to be trained to use AI, machine learning, coding, automation, systems thinking, and kindness. Of course we still need to learn foundational skills, like writing, because they allow you to use the technology. Those who are given many opportunities to learn with technology will succeed; those who aren't will be left behind. I hope we can accept and adapt so more people are included.