Mastering Uncertainty Calculations: A Guide for Lab Courses and Beyond

In summary, the thread is about the original poster's desire to understand how uncertainties are computed in lab courses. They recall that the procedure has something to do with the Pythagorean theorem and ask for more information. The reply gives the general formula for propagating uncertainties, works through a simple example, and notes that the same formula extends to more variables.
  • #1
fayan77
Hello, I took a year off school and am starting again soon. I'm shaking off the rust and remembered that in my previous lab courses we were supposed to compute uncertainties, but I never really understood how that works. All I remember is that it has something to do with the Pythagorean theorem. I want to really understand this mathematical concept, since I will be taking many more labs at the university level and want to be prepared.
 
  • #2
What you are referring to, I believe, is the following: if ## V=V(X,Y) ##, where ## X ## has uncertainty ## \Delta X ## and ## Y ## has uncertainty ## \Delta Y ##, then the uncertainty ## \Delta V ## satisfies ## (\Delta V)^2=\left(\frac{\partial V}{\partial X}\right)^2 (\Delta X)^2+\left(\frac{\partial V}{\partial Y}\right)^2 (\Delta Y)^2 ##. Perhaps someone can find a link to something that spells this idea out in more detail.

A simple example of the above is the case where ## V=X+Y ##. Then ## (\Delta V)^2=(\Delta X)^2+(\Delta Y)^2 ##.

Meanwhile, the above can be generalized to more variables, e.g. ## V=V(X,Y,Z,\ldots) ##.
 
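To make the quadrature rule above concrete, here is a minimal Python sketch of the same calculation. The function names (`combine_in_quadrature`, `propagate`) and the numerical values are hypothetical and chosen only for illustration; the formula being applied is the one from the reply above.

```python
import math

def combine_in_quadrature(*terms):
    """Square each contribution, sum the squares, and take the square root."""
    return math.sqrt(sum(t ** 2 for t in terms))

# Simple case from the reply: V = X + Y, so dV/dX = dV/dY = 1 and
# (Delta V)^2 = (Delta X)^2 + (Delta Y)^2.
dX, dY = 0.3, 0.4                        # hypothetical uncertainties
print(combine_in_quadrature(dX, dY))     # 0.5

def propagate(partials, uncertainties):
    """General case: weight each uncertainty by the matching partial derivative."""
    return combine_in_quadrature(*(p * u for p, u in zip(partials, uncertainties)))

# Hypothetical example: V = X * Y evaluated at X = 2.0, Y = 5.0,
# so dV/dX = Y = 5.0 and dV/dY = X = 2.0.
print(propagate([5.0, 2.0], [dX, dY]))   # sqrt(1.5^2 + 0.8^2) = 1.7
```

The same pattern extends to any number of variables: pass one partial derivative and one uncertainty per variable.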

1. What is an uncertainty in computing?

An uncertainty in computing refers to the potential variation or inaccuracy in the results of a computation due to various factors such as measurement errors, rounding errors, or limitations in the computational model. It is a measure of the confidence or reliability in the computed value.

2. How is uncertainty quantified in computing?

Uncertainty in computing is typically quantified using statistical methods such as standard deviation, confidence intervals, or probability distributions. These methods help to estimate the range of possible values for a computed result and provide a measure of the confidence in that result.
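As a quick illustration of the statistical side, here is a minimal Python sketch that turns repeated readings into a mean with a standard error. The readings are made-up numbers assumed only for the example.

```python
import statistics

# Hypothetical repeated measurements of the same quantity (e.g. a length in cm).
readings = [10.1, 9.9, 10.3, 10.0, 10.2]

mean = statistics.mean(readings)
std_dev = statistics.stdev(readings)         # sample standard deviation
std_error = std_dev / len(readings) ** 0.5   # uncertainty of the mean

print(f"{mean:.2f} ± {std_error:.2f}")       # 10.10 ± 0.07
```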

3. What are the sources of uncertainties in computing?

The sources of uncertainties in computing can vary depending on the specific computation being performed. Some common sources include measurement errors, limitations in the computational model, data input errors, or insufficient precision in numerical calculations. External factors such as hardware failures or environmental conditions can also contribute to uncertainties.

4. How can uncertainties be reduced in computing?

To reduce uncertainties in computing, it is important to understand and identify the potential sources of error and take steps to minimize their impact. This can include improving measurement techniques, using more precise computational models, increasing the precision of numerical calculations, or implementing error correction methods.

5. Why is it important to consider uncertainties in computing?

Considering uncertainties in computing is crucial because it helps to provide a more accurate and reliable understanding of the results. It also allows for a better assessment of the potential risks and limitations associated with a computation, which is important for making informed decisions based on the computed results.
