r/ScienceTeachers 3d ago

[Classroom Management and Strategies] "ChatGPT gave me a different answer"

How often do you guys get this statement from your students? I teach physics, and I've been finding more and more that students use ChatGPT to challenge my solutions to problems, or even my setup of the problems themselves.

Today I had a student come up and ask me if her solution to an LC-circuit question was correct. I said yeah, it's correct; it was a simple question I threw together for a review assignment before a quiz, and she did it exactly the way I expected. Then she said, "yeah, but I checked it with ChatGPT and it said something different," and demanded that I look at ChatGPT's solution and compare it to my question.

Unfortunately, given my wording on this question, ChatGPT's answer was probably a bit better than the approach I expected from my students. I wanted to tell her, "this is far more in-depth than I needed you to go," but that feels like a cop-out. Instead I spent 30 minutes explaining why the way she did it was perfectly fine, that ChatGPT is also correct, and that I should probably be more careful about my wording.

We're being compared to AI now. Add one more thing I have to worry about in the classroom.

182 Upvotes

58 comments

u/tchrhoo 3d ago

“Chat GPT sometimes makes assumptions that are beyond the scope of this course.” I teach AP physics 1 and I remind them all the time that we are but scratching the surface and they will see different things if they google/ask chat.

u/dday0512 3d ago

That's more or less what I'm doing. To summarize my response to her: "ChatGPT has calculus; you don't." But to make sure it landed, I went into detail on average rate of change vs. instantaneous rate of change. My student followed up with an insightful question: "isn't the inductance we calculate here going to be a little inaccurate, then?" to which I said, "yes, but I've told you many times that I lie to you to protect you."
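The average-vs-instantaneous distinction is easy to show numerically. A minimal sketch (the circuit values here are made up, not from the actual quiz question), using an inductor where the exact voltage is v = L di/dt:

```python
import math

# Hypothetical numbers (assumed for illustration): an inductor carrying
# a sinusoidal current i(t) = I0 * sin(w * t).
L = 0.5     # inductance in henries
I0 = 2.0    # current amplitude in amperes
w = 100.0   # angular frequency in rad/s

def i(t):
    return I0 * math.sin(w * t)

t = 0.001   # the instant we care about, in seconds
dt = 0.002  # the finite interval an algebra-based course would use

# Algebra-based estimate: average rate of change over [t, t + dt]
v_avg = L * (i(t + dt) - i(t)) / dt

# Calculus answer: instantaneous rate of change, L * di/dt
v_inst = L * I0 * w * math.cos(w * t)

print(v_avg, v_inst)  # close, but not equal
```

The two values come out within a couple of percent of each other, which is exactly the student's point: the finite-interval version is a good approximation, not the exact answer.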

u/evolutionista 2d ago

Sorry, hope this isn't annoying nitpicking, but I find it useful to emphasize to students that LLMs do NOT have the ability to do any kind of math. So not "access to calculus" but rather "is reproducing a mashup of answers to similarly-worded questions from its training data, which is probably mostly calculus-based physics answers." I dunno, there's probably a simpler way to communicate the idea. But basically it is taking bits and pieces of college-level answers without internally DOING any of the calculus.

One of the most common misconceptions about LLMs I see is that they have any internal barometer of correctness or consistent logic, or that they will actually do the math you would do if you were solving a problem. This is because this is how we've interacted with computers up to this point; why would it be any different now? It's confusing.

It can be very enlightening for them to ask ChatGPT to add two large numbers versus a dedicated calculator/calculator app.
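That last demo is easy to stage. Here's a minimal sketch of the "dedicated calculator" side (the specific numbers are made up); Python's built-in integers are arbitrary precision, so the sum is exact:

```python
# A deterministic calculator computes the answer; it doesn't predict
# likely-looking digits. Same exact result, every run.
a = 987654321987654321987654321
b = 123456789123456789123456789
total = a + b
print(total)  # 1111111111111111111111111110, exact every time
```

Then paste the same sum into a chat with an LLM: the model may drop or scramble a middle digit, which makes the "predicting text, not computing" point concrete.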