What makes a good question in your opinion?

  • Thread starter: Hlud

Discussion Overview

The discussion centers around the qualities that make a good question in physics education, particularly in the context of a specific example involving a pendulum and the height of a tower. Participants explore the effectiveness of such questions in assessing student understanding and engagement, considering various educational levels and pedagogical approaches.
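
For context, the example in question presumably rests on the standard small-angle pendulum relation, so a measured period fixes the pendulum's length (and hence the height it hangs through):

T = 2\pi\sqrt{\frac{L}{g}} \quad\Longrightarrow\quad L = \frac{g\,T^{2}}{4\pi^{2}}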

Discussion Character

  • Debate/contested
  • Conceptual clarification
  • Meta-discussion

Main Points Raised

  • Some participants argue that the example question is poorly constructed because it lacks real-world relevance and tests mere calculation skills rather than deeper understanding.
  • Others suggest that the question can be appropriate for introductory students as it connects different physics concepts, such as pendulum length and gravitational effects.
  • A participant emphasizes that a good question should integrate new and previous knowledge, fostering synthesis rather than rote memorization.
  • Another viewpoint is that a good question should allow instructors to assess mastery of content, balancing between straightforward calculations and thought-provoking inquiries.
  • Some participants share anecdotes about alternative approaches to determining building height, highlighting creativity in problem-solving beyond standard methods.
  • Concerns are raised about the assumptions underlying the question, such as the accuracy of the pendulum's period and the relevance of the building manager's knowledge.
  • One participant notes that a good question should be clear and directly related to the material being taught, while allowing for reliance on previously learned concepts.

Areas of Agreement / Disagreement

Participants express differing opinions on the effectiveness of the example question, with no consensus reached on what constitutes a "good" question. Some see value in the question for its educational potential, while others criticize its design and relevance.

Contextual Notes

Limitations include varying interpretations of what makes a question effective, dependence on student levels, and unresolved issues regarding the assumptions made in the example question.

  • #31
Andy Resnick said:
Anyone can nit-pick any question ad infinitum: most of my students don't know how to drive a car and/or have never been on a roller coaster; I suppose that puts them at a disadvantage for those types of questions.

I apologize. Upon moving to a new country, I have had a lot of students truly struggle with some of those types of questions, and it has made me more cautious in question writing.
 
  • #32
Dr. Courtney said:
I would never make a determination regarding whether the teacher or the students are more at fault in poor exam performances without knowing how much of the reading and homework assignments the students are doing. Teachers cannot and should not shoulder the blame for poor exam performance when students who are not doing the assigned reading and homework exercises score poorly on the exams.

Well, I wouldn't want to put the entire blame on the teacher either, but I don't think the students suddenly became more to blame over the course of one year. There was a clear drop in performance after the change, and the achievement rate is still lower now (even accounting for the change in scoring) than in the AP B years. A huge part of this drop is the quick shift in expectations.

Nonetheless, a lot of the teachers I have discussed these changes with do not care for qualitative responses and struggle to implement them. The textbook may use words to explain the physics. The teacher may use words to explain the physics. Why is the student restricted from using words to explain the physics? You may argue this is not the case, but student work suggests otherwise. You may see a picture, but it would be rare to see a written explanation of the physical insight behind the solution; either way, it becomes all math from there. You rarely see a written explanation after a numerical solution.

What I am asking is: what qualities of a question would better generate these kinds of responses, ones where the emphasis is placed on physical insight? The question in the OP has, in my opinion, low physical insight, which is why I don't consider it to be a good question.

And by no means am I suggesting the abolition of math in physics (I say this because a lot of teachers and professors believe that conceptual physics is physics without math). I am arguing for the inclusion of more qualitative explanations in addition to that math.
 
  • #33
Hlud said:
What I am asking is: what qualities of a question would better generate these kinds of responses, ones where the emphasis is placed on physical insight? The question in the OP has, in my opinion, low physical insight, which is why I don't consider it to be a good question.

It seems like you want to use question design to solve issues that can be addressed with a grading rubric.

Both I and several departments I know of have awarded most of the points on physics problems based on factors other than the mathematical solution.

20% was for drawing and labeling a picture or diagram
20% was for identifying the important physical principle (Conservation of Energy, for example, or Newton's 2nd law)
20% was for writing down an orderly sequence of steps to solve the problem
20% was for the numerical solution
20% was for a written assessment for whether and why the numerical solution was correct

This grading rubric awards 80% of the points for stuff on the paper other than math.
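
A minimal sketch, not from the thread, of how such a five-part tally could be recorded; the component names and scores below are hypothetical, and each part carries the 20% weight described above:

    # Hypothetical tally for the five-part rubric; each component is worth 20%.
    RUBRIC_WEIGHTS = {
        "diagram": 0.20,             # labeled picture or diagram
        "principle": 0.20,           # governing physical principle identified
        "solution_steps": 0.20,      # orderly sequence of steps
        "numerical_answer": 0.20,    # the numerical solution itself
        "written_assessment": 0.20,  # why the answer is (or is not) reasonable
    }

    def problem_score(earned):
        """Return the fraction of credit earned, given per-component scores in [0, 1]."""
        return sum(w * earned.get(part, 0.0) for part, w in RUBRIC_WEIGHTS.items())

    # Example: correct math but no written assessment still tops out at roughly 0.8.
    print(problem_score({"diagram": 1.0, "principle": 1.0,
                         "solution_steps": 1.0, "numerical_answer": 1.0}))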
 
  • #34
Dr. Courtney said:
This grading rubric awards 80% of the points for stuff on the paper other than math.

Most of those "20%" items ARE the Mathematics; but this is a matter of interpretation.
 
  • #35
Dr. Courtney said:
20% was for a written assessment for whether and why the numerical solution was correct

This grading rubric awards 80% of the points for stuff on the paper other than math.

I have used a similar rubric for grading in the past. However, I did not include the last step, as you have it. How would you model the written assessment of why the numerical solution is correct?

I have abandoned grading this way because I am trying to get students to give me more substantial physical insight. I think simply invoking Newton's 2nd law does not satisfy that.
 
  • #36
I think having students take limiting cases could be one way for a student to rationalize why their solution is correct or reasonable. As for numerical answers: order-of-magnitude checks, comparison to real-world scenarios, etc. are other ways of rationalizing that the numerical answer makes sense. I took a class in undergrad where the instructor had us write an explanation of whether our answer was reasonable or not. From personal experience, it is very difficult to guide students into developing this intuition. However, it's not an easy thing to develop in the first place either.
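
As a concrete illustration of the limiting-case check (using the ideal, frictionless Atwood machine that also comes up later in this thread), the standard result

a = \frac{(m_1 - m_2)\,g}{m_1 + m_2}

behaves sensibly at the extremes: as m_2 \to 0, a \to g (free fall of m_1), and for m_1 = m_2, a = 0 (balance), which gives a quick plausibility check on the algebra.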
 
  • #37
Hlud said:
I apologize. Upon moving to a new country, I have had a lot of students truly struggle with some of those types of questions, and it has made me more cautious in question writing.

Being cautious is good; it means you are being thoughtful! Slightly off-topic, but when my students ask for study tips, I often suggest they invent 'test-like' questions because of the mental effort involved. Those who do often remark on how effective that strategy is.
 
  • #38
Hlud said:
I have used a similar rubric for grading in the past. However, I did not include the last step, as you have it. How would you model the written assessment of why the numerical solution is correct?

That depends on the topic at hand. I usually make significant efforts to teach topic-appropriate assessment methods throughout the course, so that students have ample instruction and lots of practice. But in general, I emphasize that good assessments have three components: a double check on the magnitude of the number, the direction (or sign), and the units. Units tend to be similar across topics. A student should do the math on the units when they substitute numbers and units into their final symbolic expression. An assessment of the units can be as simple as "the units of the answer are as expected for an acceleration, m/s/s."
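
A generic example of such a unit check, not one taken from the thread:

a = \frac{F_{\text{net}}}{m}: \qquad \frac{\mathrm{N}}{\mathrm{kg}} = \frac{\mathrm{kg\,m/s^2}}{\mathrm{kg}} = \mathrm{m/s^2},

which are indeed the units expected for an acceleration.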

If the answer is a vector, it might be a good assessment of direction to say, "the direction of the acceleration is the same as that of the net force." Or if it is a scalar, "It makes sense that the final velocity is negative, because the ball is falling at the end, and the positive direction was defined to be upward."

Assessing the numerical magnitude tends to be more specific to the topic. But when working with Atwood machines and objects sliding or rolling down inclined planes with gravity as the only external force, I point out that the magnitude of right answers is always between 0 and 9.8 m/s/s. Numbers above 9.8 m/s/s in these kinds of problems need a lot of extra scrutiny and are probably wrong.
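
That bound follows from the standard results (a quick check, not taken from the thread): for frictionless sliding down an incline, a = g\sin\theta \le g because \sin\theta \le 1, and rolling objects or an ideal Atwood arrangement give even smaller magnitudes, so an answer above 9.8 m/s/s in these problems signals an error.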
 
