GPT is a tool that the students will have access to their entire professional lives. It should be treated as such and worked into the curriculum.
Forbidding it would be like saying you can’t use Photoshop in a photography class.
It can definitely be a good tool for studying or for organizing your thoughts, but it’s also easily abused. School is there to teach you how to take in and analyze information, and chat AIs can basically do that for you (whether or not their analysis is correct is another story). I’ve heard a lot of people compare it to the advent of the calculator, but I think that’s wrong. A calculator spits out an objective truth and will always say the same thing. ChatGPT can take your input and add analysis and context in a way that circumvents the point of the assignment, which is to figure out what you personally learned.
This is such a great analysis.
Where it gets really challenging is that LLMs can take the assignment input and generate an answer that is actually more educational for the student than what they learned in class. A good education system would instruct students in how to structure their prompts in a way that helps them learn the material - because LLMs can construct virtually limitless examples and analogies and write in any style, you can tailor them to each student with the right prompts and get a level of engagement equal to a private tutor for every student.
So the act of using the tool to generate an assignment response could, if done correctly and with guidance, be more educational than anything the student picked up in class - but if it’s not monitored, if students don’t use the tool the right way, it’s just going to be seen as a shortcut to answers. The education system needs to move quickly to adapt to the new tech, but I don’t have a lot of hope - some individual teachers will do great as they always have, others will be shitty, and the education departments will lag a decade or two behind as usual.
That’s if the LLM is right. If you don’t know the material, you have no idea whether what it’s spitting out is correct. That’s especially dangerous once you reach the undergrad level and start learning more specialized subjects. Also, how can reading a paper be more informative than doing the research and reading the relevant sources? The paper is just a summary of the research.
Eh. Even assuming it’s always 100% correct, there’s so much more value in talking to a knowledgeable human being about the subject. In-person conversation has a nuance that speaking with an AI doesn’t.
Look, again, I do think that LLMs can be great resources and should be taken advantage of. Where we disagree is that I think the point of the assignment is to gain the skills to do research, analysis, and generally think critically about the material. You seem to think that the goal is to hand something in.
I’ve been in photography classes where Photoshop wasn’t allowed, although that was pretty easy to enforce because we were required to use school-provided film cameras. Half the semester was 35mm film, and the other half was 3x5 graphic press cameras, where we were allowed to do some editing - provided we could do the edits while developing our own film and prints in the lab. It was a great way to learn the fundamentals and take better pictures in the first place. There were plenty of other classes where Photoshop was allowed, but sometimes restricting which tools can be used can push us to be better.
Depends on how it’s used, of course. Using it to help brainstorm phrasing is very useful. Asking it to write a paper, then editing it and turning it in, is no different from regular plagiarism imo. Bans will apply to the latter case, and the former should be undetectable anyway.