Over the last several months, our investigation of AI in the classroom has been nothing short of an eye-opening journey. In the beginning, we talked a lot about our fears and bleak predictions of what might happen to education and other aspects of our lives as this technology moves forward. I cannot say I am any less fearful today than I was then. Perhaps I am even more troubled, yet I am also curious about the positive impacts it could have on our day-to-day lives and work. The goal of our Cohort 21 work has been to initiate conversations with our colleagues and students, and then use what we understand now to develop a new academic integrity policy that addresses the use of AI.
It is now common knowledge that AI, and ChatGPT specifically, offers benefits for both teachers and students. Some of the benefits discussed in the most recent CIS discussion on ChatGPT include: complex sourcing of information; tailored solutions to problems that draw from academic articles and encyclopedias; helping students really focus on the question by saving the time spent finding answers; checking grammar; generating writing prompts for students to challenge themselves; and "essay-engineering," with constructive feedback on accuracy, readability, and flow. It can offer suggestions on presentations and pose questions for students to consider. Being conversational in nature, it can become a debate partner, summarize ideas for students, and break down assignments to help students project-manage. It has been described as a 'laser-focused' search tool.
For teachers, it can be used for lesson planning, generating assessments, and providing ongoing, instant feedback to students. It can write a rubric in KTCA format quickly, and even delegate tasks for group work projects. Teachers can submit an essay with its accompanying rubric, and it will quickly assess the student's work before a final look. In essence, it can give us the one thing we all wish we had more of…time to enjoy teaching. One big issue for teachers now is that many of our assessments were created without a tool like this in mind. Should we continue to use these assessments and work on preventing students from accessing platforms like ChatGPT, or should we move ahead and incorporate it into the classroom and our assessments? This goes back to my last post and Adam Caplan's thought-provoking question: if we are not preparing our students for the future they will inhabit, are we doing the best we can as educators?
In discussions with my peers at Cohort, it sounds like all schools are in the process of developing concrete steps and policies for addressing AI. Many are looking at adding an addendum to modify their current integrity policies. We have started this discussion at Albert College and plan to continue it at our Professional Development sessions in the spring, focusing on how AI can be used as an academic tool and what strategies we can implement to ensure students use it responsibly. I think walking this journey alongside our students and learning together is a truly unique opportunity. In the fall, we plan to conduct a survey that will give us perspective on how students are using AI platforms like ChatGPT. By then, there will no doubt be further developments and smarter technology, so we see a slow and steady approach as the best one right now. I encourage you to check out our presentation from our last face-to-face session.
I am grateful for my time at Cohort 21 and for the moments of reflection we shared as a group of skilled educators from CIS schools. I am inspired by the passion and commitment shown by others working on such complex issues and projects within their schools. Many thanks to the facilitators, coaches, and leaders this year. I look forward to seeing you all again!