Sorky's thoughts on AI
The school asked me to take a survey about AI usage! I could win a $25 gift card (probably to Amazon). In retrospect, I should've complained about that as well! Anyways, here are my answers. I didn't include the question prompts, but you can probably make some educated guesses as to what they asked.
***
I think it is unhelpful and will actively cheat our students of real learning and understanding by making them over-reliant on copying from a resource they don't fully understand, and can't tell when it is making errors or expressing poor judgement. I also think school use of generative AI is incompatible with our school goals of sustainability and fighting climate change, given how devastating it is towards the environment.
AI confidently introduces information that may or may not be correct. If I am an expert in the material, this means I waste my time proofreading instead of just doing it correctly myself the first time. If I am not an expert in the material, I am not qualified to identify when the AI is hallucinating and may repeat those confident-but-incorrect answers, causing harm to my students and my own understanding.
I mean, students since time immemorial have tried to cheat themselves out of real understanding. As a teacher, my job is to actually assess them as individuals to find out what they know. I worry that AI will weaken their skills, but it in no way scares me, because I am confident in my abilities as a teacher to come up with useful curriculum that will genuinely guide my students towards the information and metacognitive skills I want them to be able to express.
It cannot be used that way. We are not providing a massive data set that can be filtered using AI (such as the beneficial use of identifying anomalous cells to screen for cancer); we are providing individual attention for individual people. AI collapses into an extreme form of "one size fits all" that has no nuance for the different needs of different students.
AIs are routinely confident-but-incorrect. No one should be using a learning tool that provides outputs they can't understand, because then they can't assess whether the answer makes sense or is correct.
Honestly, I'm somewhat horrified that this survey is asking questions that treat AI as a helpful thing for a district that is otherwise trying to focus on authentic community-building and relationships. If SPS wants students, parents, and educators to abandon the actual communication and relationships we are forming with each other in favour of speaking to robots, I am appalled.
***
The subtext (of this, my first post in nine days) is that the district is exhausting me in so many ways right now, and I'm pretty burnt out. I'd like to make an actual post about what's happening in my life, but that involves an energy that I simply have not had. Maybe I can do that with my prep period today, instead of grading or whatever else it was I was supposed to do. I really miss reading dreamwidth and hearing what yinz are up to. :(
~Sor
MOOP!