Why do we use percent deviation rather than simply expressing the size of the error itself?
Answers
Answer:
With the percentage we take the ratio between the error and the magnitude being measured.
Explanation:
When absolute error is used, it is a fixed value regardless of the magnitude of the measurement: for large measurements it can represent great accuracy, but for small measurements the accuracy may be low. For example, if we use a tape measure with an error of Δx = 0.1 cm, this error is acceptable when the measurement is 1 meter, but if the measurement is only 1 cm the error is far too big.
When we use the percentage, we take the ratio between the error and the magnitude: if this ratio is small the measurement is very precise, and if it is high the measurement has little precision. For example,
Δx = 1 cm = 0.01 m
% = (0.01 / 1) × 100 = 1%
% = (0.01 / 0.01) × 100 = 100%
Therefore the precision of a measurement is known from the percentage error.
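The arithmetic above can be sketched as a tiny helper (the function name percent_error and its argument names are my own choice, not from the answer):

```python
def percent_error(abs_error, measurement):
    """Percent deviation: the absolute error expressed as a
    fraction of the measured value, times 100."""
    return abs_error / measurement * 100.0

# Same absolute error of 1 cm (0.01 m), very different precision:
print(percent_error(0.01, 1.0))    # error on a 1 m length  -> 1%
print(percent_error(0.01, 0.01))   # error on a 1 cm length -> 100%
```

This makes the point of the answer concrete: the absolute error is identical in both calls, but the percent deviation immediately shows which measurement is precise.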