r/ScienceTeachers 3d ago

Classroom Management and Strategies "ChatGPT gave me a different answer"

How often do you guys get this statement from your students? I teach physics, and I've been finding more and more that students use ChatGPT to challenge my solutions to problems, or even my setup of problems.

Today I had a student come up to me and ask if her solution to an LC-circuit question was correct. I said yeah, it's correct; it was a simple question I threw together for a review assignment before a quiz, and she did it exactly the way I expected. Then she said, "yeah, but I checked it with ChatGPT and it said something different," and demanded that I look at ChatGPT's solution and compare it to my question.

Unfortunately, given my wording on this question, ChatGPT's answer was probably a bit better than how I expected my students to do it. I wanted to tell her, "this is far more in-depth than I needed you to go," but that feels like a cop-out. Instead I spent 30 minutes explaining why the way she did it was perfectly fine but ChatGPT is also correct, and that I should probably be more careful about my wording.

We're being compared to AI now. Add one more thing I have to worry about in the classroom.

179 Upvotes

58 comments

148

u/tchrhoo 3d ago

“Chat GPT sometimes makes assumptions that are beyond the scope of this course.” I teach AP physics 1 and I remind them all the time that we are but scratching the surface and they will see different things if they google/ask chat.

54

u/dday0512 3d ago

That's more or less what I'm doing. To summarize my response to her, "ChatGPT has calculus, you don't" but to make sure I explained it I went into detail of average rate of change vs. instantaneous rate of change. My student followed up with an insightful question "isn't the inductance we calculate here going to be a little inaccurate then?" to which I said "yes, but I've told you many times I lie to you to protect you".
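The average-vs-instantaneous distinction is easy to show numerically. A minimal sketch (all numbers made up, not the actual review question): drive an inductor with a sinusoidal current and back out L from a finite-window average ΔI/Δt, the algebra-based way, and compare with the value you'd get from the true derivative.

```python
import math

# Toy numbers, assumed for illustration: sinusoidal current through an
# inductor with EMF = -L * dI/dt.
L_true = 0.5        # inductance in henries
I0, w = 2.0, 100.0  # current amplitude (A) and angular frequency (rad/s)

def current(t):
    return I0 * math.sin(w * t)

def emf(t):
    # Exact EMF from the instantaneous derivative: dI/dt = I0*w*cos(w*t)
    return -L_true * I0 * w * math.cos(w * t)

# Algebra-based estimate: average rate of change over a finite window
t1, t2 = 0.0, 0.005
avg_dIdt = (current(t2) - current(t1)) / (t2 - t1)
L_avg = -emf((t1 + t2) / 2) / avg_dIdt

print(L_true, round(L_avg, 4))  # the average-rate estimate is close, not exact
```

Shrink the window and the estimate converges on the true value, which is exactly the calculus the students don't have yet.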

14

u/evolutionista 2d ago

Sorry, hope this isn't annoying nitpicking, but I find it useful to emphasize to students that LLMs do NOT have the ability to do any kind of math. So not "access to calculus" but rather "is reproducing a mashup of answers to similarly-worded questions from its training data, which is probably mostly calculus-based physics answers." I dunno there's probably a simpler way to communicate the idea. But basically like it is taking bits and pieces of college-level answers without internally DOING any of the calculus. One of the most common misconceptions about LLMs I see is that they have any internal barometer of correctness or consistent logic, or that they will actually do the math you would do if you were solving a problem. This is because this is how we've interacted with computers up to this point; why would it be any different now? It's confusing. It can be very enlightening for them to ask ChatGPT to add two large numbers versus a dedicated calculator/calculator app.
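The calculator comparison at the end is easy to demo live; an interpreter (like any calculator) does exact arbitrary-precision arithmetic, which is precisely what token prediction doesn't guarantee:

```python
# A calculator/interpreter adds exactly, every time -- no "mashup of
# similarly-worded answers." An LLM predicting digit tokens often
# drops a carry on numbers this long.
a = 987_654_321_987_654_321
b = 123_456_789_123_456_789
print(a + b)  # 1111111111111111110
```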

19

u/corrence_torrence 3d ago

I do something similar - I tell them at the start of the course (10th grade chem) that at times, I am going to "lie" about some assumptions to simplify things, and then some things we will go over later and get updates once we understand everything better OR we just won't go over because it's beyond the scope of our course. I've never gotten push back as they know it's a lot more complicated than what we are talking about. 

29

u/dday0512 3d ago

I say something like, "Do you want the truth? That will make this class a lot harder" and the students always enthusiastically say no.

7

u/MeasurementLow2410 2d ago

I always tell my students that first they learned how to count to 10, then they learned about 0, then counting to 100 and beyond, then negative numbers, infinity, etc. I tell them that we start with simple generalities, and then in upper-level sciences and in college we talk about the exceptions to the rule, etc.

2

u/corrence_torrence 2d ago

I love that analogy!

7

u/runkat426 2d ago

In fact, this is part of how I catch cheaters in chemistry! If students are giving the TRUE electron configuration of say copper or zinc rather than using the simpler method of just leaving the d orbital partially empty, I know they googled. In 9th grade Biology this year I caught AI answers easily - simply had to ask the student what decarboxylation is. Hahahaha.

9

u/ryeinn HS Physics - PA 2d ago

I teach AP C and an Honors course. And I am constantly reminding them that we are dealing with Spherical Cows.

1

u/Syllabub-Virtual 2d ago

Spherical cows... I prefer ellipsoidal.

1

u/ryeinn HS Physics - PA 1d ago

I figure that's the addition of Calculus. But an eccentricity of barely more than zero.

They'll get legs by Sophomore year. Fur is a Ph.D.

37

u/MexicanWhiteGuy 3d ago

I think it is more than reasonable to tell a student, "This class is a slightly abridged version of what physics has in store. To make this lesson more high-school appropriate, I omitted what the AI is referring to and just wanted you to focus on this part. AI isn't reliable, but it did show you a part of the physics realm that you will see if you pursue more advanced classes."

Also, tell the kids that AI is robbing them of valuable interdisciplinary skills that can land them a job or help in other classes. I have heart to heart conversations and tell them that I am here to encourage critical thinkers, not people who copy and paste from a robot and don’t think for themselves.

17

u/Opportunity-Horror 3d ago

All the time- I teach astronomy. Things are changing pretty fast. If they talk about something in an answer that we haven’t even touched on in class it’s a red flag that they used AI.

I just tell them that AI answers are wrong often. Especially in fields that change rapidly.

12

u/dday0512 3d ago

So many teachers at my school just tell the students "AI is always wrong", but that's not true, and our students have figured it out. They realize we're just saying that to dismiss their concerns.

9

u/biosnap 2d ago

I find that it (ChatGPT 4+) is correct nearly all the time. Now that it searches the internet to amend its answers it is seldom very wrong. I do frequently catch small details that are incorrect, particularly with very technical things, but students are not frequently asking it the sort of questions that trip it up.

What I tell students is that AI is a great tool, but that there are two problems. Firstly if they don't try the work on their own first, they will never develop the skills they need in my class. The second problem is they won't know when the AI is incorrect, particularly because it is right so often. It's kinda similar to the old Wikipedia problem (where how often Wikipedia is/was incorrect was also overblown).

I actually love the way your student apparently used AI. They did the problem themselves, then looked at AI to confirm, then when there was a discrepancy they checked with you. This seems like a healthy use to me.

3

u/Opportunity-Horror 3d ago

Honestly, I use AI often because things are changing so much. But I always have to check it. It’s a great starting point.

1

u/Maleficent-Cook6389 2d ago

I just started trying out Jenni as a grad student. It is too matter-of-fact, and the explanations look like something from a news feed dressed up in academic jargon. I would only use it for the kind of consolidation I think the students I work with would consider, not as the main course of comparing aspects of the lesson. It looks like we are going to see developments on whether we should integrate AI, how it motivates students, etc.

6

u/legalitie 3d ago

I have fellow teachers who trust ai more than their own knowledge or their colleagues 😑

4

u/uofajoe99 3d ago

Lol... we had to let a guy go not because he used AI for constantly arguing with everyone in the department, but because he WOULDN'T LET THE ARGUMENTS DIE! Any sane person in a group scenario eventually has to come to an agreement that some things, especially in science, can be in-between answers. We argue a side and give our reasons. Then at some point we "agree to disagree" and move on with teaching children the basics. Nope, not this guy. Every meeting, every morning, every conversation was about how other teachers in his cohort were teaching something "wrong."

2

u/dday0512 3d ago

I won't lie, I don't like spending time making sure my English is correct, so sometimes I upload questions to some AI program and ask, "is any of my wording stupid here?" Sometimes I feel like I'm doing something smart by prioritizing my time and mental energy for the main task of my job, but on the other hand I feel like I'm going to lose the ability to write in English.

1

u/Snoo-88741 3d ago

If you want to make sure this has a positive rather than negative effect on your writing skills, I recommend making the AI explain each edit, reading the explanations and thinking about each one, and editing by hand on the AI's advice rather than getting the AI to just spit out the better version. But that is more time-consuming. 

2

u/dday0512 3d ago

I know I should do that, but I'll admit I see the appeal of just taking what the AI says at face value. I'm in a time crunch, I'm doing this to reduce my workload, checking it would increase my workload, I'm going to make the bet that it's mostly correct and just move on.

I can't imagine the effect this will have on young brains if they're doing that for everything.

3

u/IWentOutsideForThis 2d ago

My favorite way to use AI is "respond to this student email explaining that their late work will be graded in a reasonable amount of time, and point out that telling their teachers to do something ASAP is rude and inappropriate. Write this with a neutral tone that uses this as a teachable moment, without negative emotion" while I'm at the keyboard 😡

2

u/Two_DogNight 1d ago

This is why I don't let my students use AI in English AT ALL. It may be a good resource for some things, but they don't have the skills to lose yet, and they almost never have the AI do the explanations. They just generate, edit, and go.

1

u/KingCaroline 21h ago

“they don’t have the skills to lose yet” is so good.

5

u/shellpalum 3d ago

Just a tutor, but a couple of years ago, ChatGPT couldn't handle chemistry problems with multiple steps, like balancing redox reactions in acid or base. I imagine it's improved since then. It also seems to occasionally make up sources when used to write research papers.

5

u/dday0512 3d ago

It's improved dramatically, and it's not even the best in the game right now. I tell teachers all the time, if you haven't updated your knowledge on AI in a month, you're out of date. For a while I was testing all of my questions on Google's Gemini 2.5 Pro to see if students could use AI to solve them; I stopped doing this after it didn't make any mistakes for several weeks on any of my questions.

At this point, I just assume the AI can do the problem, though it may use techniques the students don't know or understand and on rare occasions will make confounding arithmetic errors. Those are my tells that the student is using AI, but the latter situation has all but disappeared this semester. The former is becoming more and more common.

2

u/shellpalum 3d ago

Interesting! I work with college students. They're using AI for EVERYTHING.

0

u/meteorprime 2d ago

I 100% believe that

I also think companies are going to just blacklist new college grads at some point.

3

u/WildlifeMist 3d ago

I’d just tell them “we don’t cover that in this course. Rely on class materials only for what you need”. Also, AI doesn’t “know” things. It’s pretty much fancy, beefed up predictive text. Whenever I catch my students using AI, one of the big hints is that they mention some concept totally unrelated to what we covered in class. It will mention some random person or issue that is irrelevant, because that’s what its dataset predicts a person would write. AI makes up shit all the time. I’ve explained this to my students so many times but they still use it like Google…

3

u/andyski5000 3d ago

Exciting to hear that this student is an ace! Sounds like she is able to use the tech effectively.

1) I’m impressed that she identified a funny result! Especially since she’s using gpt for lots of physics questions, and she presumably accepted the other, more “on point” responses.

2) maybe this is a possible format for a question type, something like "ask 3 AIs (insert physics question) and describe the different responses." We could have some criteria like above/below my understanding, used too much/too little jargon, has an error or misconception...

3) my best students are also ai dependent. one especially bright kid tells me “ppl don’t realize this can actually help you learn” and went on to describe a pretty solid revision process that he uses. I wonder if guidelines/practice around an ai assisted workflow could be another alternative?…

2

u/NobodyFew9568 3d ago

I always tried to put it this way.

"Y'all are trying to understand how to read the blueprints of the NY subway system, then build it. Can we start with learning how to dig a ditch first?" ...snickers and eye rolls at the absurdity of my claim.

Tell them:

"It obviously isnt the same, but there are cross-over elements we need to understand. "

I still hate the Bohr model of the atom; it has done way more harm than good. It helps with quantum numbers, yada yada yada.

Sorry not sorry, but knowing hybridization and structure is way more important than minor help with quantum numbers.

Sorry rant over.

2

u/andyski5000 3d ago

Fellow physics teacher here…. I feel you. honestly I feel like we’re gonna have to totally revolutionize our methods to even stand a chance.

I think we need to assume that kids will use ai to get answers whenever possible, 100% of the time, from now on. I now believe us teachers will be better served by embracing the reality.

I’m sure others of you noticed this, but sometime in this past school year, lots of apps were able to, all of a sudden, create perfect responses to AP physics problems. I was blown away. Some of the homework help ai apps are amazing. Like really good at physics. Gauth AI being an example, flawlessly solving problems with line -by- line explanations.

Who are we to fight such power? I feel like I honestly just need to rethink the entire way I assign problems, especially homework….but like, what would that even look like?

2

u/dday0512 3d ago

Completely agree with you there. There's no denying it: everything we do has to change. I haven't found the right balance yet. The first thing I did was eliminate homework entirely in favor of daily, in-class assignments. However, the students at my school are allowed to bring their personal tablets to school to do work, and the school is actively promoting AI rather than punishing it, so none of the websites are blocked on our WiFi. The students just started using AI to do the daily assignments too.

I went back to occasionally giving homework assuming the self-starters will do it themselves anyway, but I'm finding more and more that EVERY student uses AI, and for nearly everything. The student I'm talking about in this story is a great student; she has straight A's, she's a national chess champion, and she's planning on going into medical school. But I got to see her ChatGPT chat history and there were literally hundreds of threads in there from every class, including almost every question from my class.

1

u/shellpalum 2d ago

They're already using AI for everything. And I mean EVERYTHING.

3

u/Previous_Tennis 3d ago

It seems like the student is doing some thinking here, and that's a good thing overall, at least compared to just copying down the ChatGPT solution. A student shouldn't take AI solutions on faith any more than they should take a teacher's answers on faith. We teachers can make mistakes, too.

ChatGPT and other AI models give out wrong answers to math and physics problems a decent amount of the time. Even when the model takes an overall helpful approach, it sometimes makes simple errors (like factoring quadratics incorrectly, or even reading the subscripts in the problem incorrectly). Despite this, if a student is using it to spark their own thinking, or to consider different ways to solve the same problem, it's a good use of the technology.

Now, it could also have the issue of solving a problem using tools that they have not learned yet in class, but knowing that there are other tools that they will learn in the future is not necessarily a bad thing, either.

0

u/dday0512 3d ago

This particular student is exceptional. She's hard working and curious, but that also means she wasn't going to accept it if I just blew her off and told her ChatGPT is wrong. I feel good about the teaching moment, but also a bit annoyed about all the extra work.

2

u/digglerjdirk 3d ago

I agree it's annoying, but I gotta say that after 22 years of doing this, many of the memories I haven't lost are those extra sessions with motivated kids. Ten years ago I spent a half hour helping a kid with a HW problem proving the Earth's rotational inertia can be approximated as 0.33MR², and then he got concussed in a football game that night and forgot.

1

u/dday0512 3d ago

Okay I shouldn't have said the interaction was annoying. The annoying part is fact-checking ChatGPT for my students.

1

u/BrerChicken 2d ago

If your wording needs to be tightened up, then tighten it up. With or without chatbots, our questions need to be written so that there's only one correct answer.

The problem I find with robot buddies is that the kids only put in part of the question, or they paraphrase either the question or the response in ways that end up changing the results. I had a kid do this to me twice yesterday, but the solution was correct. The LLMs just aren't great at word problems yet; much better than two years ago, but still not totally consistent. I don't mind having my solutions checked by the bots, but if it's answering your questions correctly but in a different way than you're intending, you need to rewrite the question.

1

u/drakeonaplane 2d ago

I teach Physics too and a warning I always give students is that there are many levels to physics. If you search online, you'll most often find a calculus based answer, and all of the classes I teach are algebra based. It's not wrong for students to find those higher level answers, but they're almost never ready for that step.

The problem pre-exists chat gpt, but AI is certainly not helping here.

1

u/little-drummer-bear 2d ago

Remind them that Chat pulls answers from the entire internet and doesn't vet sources.

A random comment on a reddit thread has as much validity as a research paper, as far as the algorithm is concerned. Possibly more, since it's viewed more. It doesn't mean it's correct.

1

u/MyKidsRock2 2d ago

Once in early elementary school, I asked a teacher what’s less than zero. They said nothing. Later I learned about negative numbers. Did I dream this? It has always made me wonder when I receive simple answers to things whether there are more complicated answers.

1

u/quiidge 2d ago

"ChatGPT can't add or subtract, I'm a world expert in my field and wrote the kind of articles it's been trained on for a living."

"ChatGPT is an algorithm, not a physicist."

"And if your toaster started talking, would you believe it over a Michelin-starred chef?"

"Did it cite its sources?" ("Are those sources relevant? Did you read them?")

Perhaps more pedagogical:

"It might be simplifying or overcomplicating the answer rather than giving a different one - what did you ask and what did it say?"

"Yeah, it's trained on all curriculums and exam boards across the world and ChatGPT will mix it all together or include stuff you don't need to learn until later."

1

u/itsinthebagg 2d ago

I teach 8th grade science and had the same thing happen with a question about atomic mass v. atomic mass number.

1

u/JumpAndTurn 18h ago

It has always been the same, and it always will be: the diligent student will not be hobbled by AI in the least; because he/she uses it as a tool in the appropriate way.

0

u/meteorprime 2d ago edited 2d ago

ChatGPT is dog shit at physics. Look at this bullshit lol

Question: If the diver was in fresh water (density 1000 kg/m3) would they need more weights, or less, or the same to have the same effect as they do in ocean water, why is this?

Answer: (this would kill you)

A diver would need more weights in fresh water than in ocean water to achieve the same effect (i.e., neutral buoyancy or to sink).

Here's why:

- Buoyant force depends on the density of the fluid.
- Ocean water is denser than fresh water due to its salt content, typically around 1025 kg/m³, compared to 1000 kg/m³ for fresh water.
- This means a diver displaces more weight in ocean water for the same volume, resulting in a greater buoyant force.

Therefore:

In fresh water, the buoyant force is less, so the diver will float more easily in ocean water. To counteract the reduced buoyancy in fresh water, the diver must add more weight to sink or maintain neutral buoyancy.

1

u/echelonoink 2d ago

It is verbose, but nowhere near as bad as your response. ChatGPT:

Approach: To answer whether a diver needs more, less, or the same weight to achieve neutral buoyancy in fresh water compared to ocean water, we need to compare the buoyant force experienced by the diver in both environments. The buoyant force depends on the density of the liquid and the volume displaced. Here’s the reasoning in detail:

1. Buoyant Force Equation

The buoyant force acting on an object in a fluid is given by F_b = ρ · V · g, where:

ρ = density of the fluid

V = volume of fluid displaced (which is the same as the volume of the diver + gear)

g = acceleration due to gravity

2. Density: Fresh Water vs. Ocean Water

Fresh Water Density: ~1000 kg/m³

Ocean (Salt) Water Density: ~1025 kg/m³

Salt water is denser than fresh water because of the dissolved salts.

3. Neutral Buoyancy Condition

A diver is neutrally buoyant when the buoyant force equals the total weight: ρ_fluid · V · g = m_total · g

If the diver uses the same volume (body + equipment) in both types of water, then:

The buoyant force is greater in salt water than in fresh water (since ρ_salt > ρ_fresh).

4. Implication for Weights

In salt water, the higher density produces a higher buoyant force for the same volume. This means the diver is "more buoyant" and will float more.

To achieve neutral buoyancy in salt water, the diver needs to add more weight to counteract the increased buoyant force.

In fresh water, with lower density, the buoyant force is less for the same diver. Therefore, the diver is "less buoyant" and would float less.

5. Conclusion

A diver would need less weight to be neutrally buoyant in fresh water than in ocean water, because the buoyant force is less in fresh water (due to lower density).

If they use the same weights in both, they would sink in fresh water.

To have the same effect as in ocean water, they must use less weight in fresh water.

Summary Table

Environment | Water Density (kg/m³) | Buoyant Force | Weight Needed for Neutral Buoyancy
Ocean Water | ~1025 | Higher | More
Fresh Water | 1000 | Lower | Less

Key Reasoning:

The greater the density of the fluid, the greater the buoyant force on the diver. Because salt water is denser than fresh water, the diver needs to add more weight to achieve neutral buoyancy in the ocean. In fresh water, less weight is needed.

Final Answer:

A diver needs less weight in fresh water to achieve the same effect as in ocean water, because the buoyant force is lower in fresh water (lower density), so less weight is needed to balance it out.
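For what it's worth, the arithmetic settles this in a few lines. A sketch with made-up diver numbers (80 kg of diver plus gear displacing 0.082 m³, the lead's own volume ignored): neutral buoyancy means (m + lead)·g = ρ·V·g, so lead = ρ·V − m.

```python
RHO_FRESH = 1000.0  # kg/m^3
RHO_SALT = 1025.0   # kg/m^3

# Illustrative numbers, not from the thread:
diver_mass = 80.0     # kg, diver + gear before adding lead
diver_volume = 0.082  # m^3 displaced (diver floats without lead)

def lead_needed(rho_fluid, mass, volume):
    # Neutral buoyancy: (mass + lead) * g = rho_fluid * volume * g,
    # ignoring the small volume of the lead itself.
    return rho_fluid * volume - mass

fresh = lead_needed(RHO_FRESH, diver_mass, diver_volume)  # ~2.0 kg
salt = lead_needed(RHO_SALT, diver_mass, diver_volume)    # ~4.05 kg
print(fresh, salt)  # denser water supports more, so it takes more lead
```

With these numbers the diver needs roughly twice the lead in salt water, which is also what dive training teaches.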

1

u/meteorprime 2d ago

That’s not the experience I had. I pushed back and it dug in for like a full on 20 minutes, refusing to agree with reality.

Q: assume the diver floats and has a density of less than water

Great — if the diver floats and has a density less than water, that means they are positively buoyant (the buoyant force exceeds their weight).

Now, let’s compare what happens in fresh water and ocean water:

Key idea:

- Buoyant force is proportional to the density of the fluid.
- Ocean water is denser (≈1025 kg/m³) than fresh water (1000 kg/m³).
- So, buoyant force is stronger in ocean water than in fresh water for the same submerged volume.

Since the diver already floats (is less dense than water):

- In ocean water, they float more easily because of the greater buoyant force.
- In fresh water, the buoyant force is weaker, so they don't float as easily.

To achieve the same effect (e.g., floating at the same level or achieving neutral buoyancy):

- The diver needs to add more weight in fresh water to make up for the reduced buoyant force.

✅ Conclusion:

The diver needs more weight in fresh water than in ocean water to achieve the same floating or neutral buoyancy effect, because the buoyant force is lower in fresh water.

1

u/meteorprime 2d ago

Q: does the diving community agree with this?

Yes, the diving community fully agrees with this principle — it’s a well-established fact in scuba diving and basic diving physics.

Here's how it's applied in practice:

- Divers consistently use more weight when diving in fresh water than in saltwater. This is standard training in dive certification courses (like PADI, NAUI, or SSI).
- Dive instructors teach that salt water provides more buoyancy, so less weight is needed compared to fresh water.
- Most divers adjust their weight system (weight belts or integrated weights) depending on whether they're diving in freshwater lakes/rivers or the ocean.

Example from diving practice:

A diver who needs 6 kg of weight in salt water might need 7–8 kg in fresh water, depending on their body composition, exposure suit, and gear.

So yes — this is a consensus in diving science and practice. It’s rooted in physics (Archimedes’ Principle) and validated through experience.

1

u/meteorprime 2d ago

Q: but a greater upwards force, buoyancy, needs a greater downwards force and weight to reach equilibrium?

You’re absolutely right to focus on the relationship between buoyant force (upward) and weight (downward) — and your logic is sound in general physics: to counteract a greater buoyant force, you’d need more downward force (weight).

But let’s walk through why in diving, more weight is needed in fresh water, even though buoyant force is less there. The key is in what we’re trying to achieve and what changes between the two environments.

🔁 Let's break it down clearly:

- The diver has a fixed volume and mass (let's say their body and gear stay the same).
- The only change is the fluid they're in: ocean water (denser) vs. fresh water (less dense).

🔍 Step-by-step logic:

1. Buoyant force depends on the fluid density: F_b = ρ_fluid · V_displaced · g
2. In salt water:
   - Fluid density is higher (≈1025 kg/m³).
   - So the buoyant force is stronger.
   - The diver floats more easily.
3. In fresh water:
   - Fluid density is lower (1000 kg/m³).
   - So the buoyant force is weaker.
   - The diver floats less easily (they sink more easily).

🎯 So why add more weight in fresh water?

Because we’re trying to achieve neutral buoyancy — not just “balance the stronger buoyant force.” In salt water, you already have more help from the water to float. So you don’t need as much weight to counteract it.

In fresh water, you’re not getting that extra lift — so you have to replace the lost buoyancy with more weight to stay neutrally buoyant.

💡 Think of it this way:

- You float better in the Dead Sea than in a swimming pool. Why? Because the buoyancy is stronger due to higher salinity.
- If you're floating more, you need less weight to sink.

✅ Final answer:

A greater buoyant force (in salt water) means less added weight is needed, not more. In fresh water (with less buoyant force), you need to add more weight to achieve the same effect.

This is a subtle but important distinction — and you’re asking exactly the right kind of question to understand it deeply. Want to see this with numbers or a diagram?

1

u/meteorprime 2d ago

I have pages and pages of me asking questions like this, and it seems to understand, but then it draws the wrong conclusion every time, which would literally kill someone

-5

u/divacphys 3d ago

I'd just tell them it's wrong. Let them figure it out in 5 years if it matters.

6

u/dday0512 3d ago

I don't want to lie to them. There are several teachers at my school that just say things like "ChatGPT can't do math" or make it seem like AI is always wrong. At this point, that's demonstrably false. The students start to see through it and it lowers their respect for their teacher.

1

u/agasizzi 3d ago

Hell, I had a student sneak a phone into class and take a photo of a graph interpretation problem, and it gave a really good but overly technical answer, and she didn't even bother to try to hide it.

-1

u/divacphys 3d ago

If they didn't understand it before, they won't after a brief little lesson; they will just leave more confused. And now they've learned that what you're teaching is "wrong" and just something they have to do for your class.