Why do we use percent deviation rather than simply expressing the size of the error itself?



    Answer:

    With the percentage we take the ratio between the error and the magnitude of the measurement, so the error is expressed relative to the size of what is measured.

    Explanation:

    When the absolute error is used, it is a value that stays the same regardless of the magnitude of the measurement. For large measurements it can represent great accuracy, but for small measurements the accuracy may be low. For example, if we use a tape measure with an error of Δx = 0.1 cm, this error is acceptable when the measurement is 1 meter, but if the measurement is only 1 cm the error is too big.

    When we use the percentage we have the ratio between the error and the magnitude, so if this ratio is small the measurement is very precise, but if the ratio is large the measurement has little precision. For example:

        Δx = 1 cm = 0.01 m

        % error = (0.01 / 1) × 100 = 1%        (measurement of 1 m)

        % error = (0.01 / 0.01) × 100 = 100%   (measurement of 1 cm)

    Therefore, the precision of a measurement is expressed by the percentage error.
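
    A rough sketch of the same calculation in Python (the helper name percent_error is illustrative, not from any library), assuming the error and the measurement are expressed in the same unit:

        def percent_error(error, measurement):
            # Percent deviation: the error as a fraction of the measured magnitude, times 100.
            return abs(error) / abs(measurement) * 100

        dx = 0.01  # absolute error of 1 cm, expressed in metres

        # The same absolute error gives very different relative precision:
        print(percent_error(dx, 1.0))   # 1.0   -> 1% error on a 1 m measurement
        print(percent_error(dx, 0.01))  # 100.0 -> 100% error on a 1 cm measurement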
